Sep 9 04:58:42.744216 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 9 04:58:42.744236 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 9 03:38:34 -00 2025
Sep 9 04:58:42.744246 kernel: KASLR enabled
Sep 9 04:58:42.744252 kernel: efi: EFI v2.7 by EDK II
Sep 9 04:58:42.744257 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb228018 ACPI 2.0=0xdb9b8018 RNG=0xdb9b8a18 MEMRESERVE=0xdb221f18
Sep 9 04:58:42.744262 kernel: random: crng init done
Sep 9 04:58:42.744269 kernel: Kernel is locked down from EFI Secure Boot; see man kernel_lockdown.7
Sep 9 04:58:42.744275 kernel: secureboot: Secure boot enabled
Sep 9 04:58:42.744280 kernel: ACPI: Early table checksum verification disabled
Sep 9 04:58:42.744288 kernel: ACPI: RSDP 0x00000000DB9B8018 000024 (v02 BOCHS )
Sep 9 04:58:42.744293 kernel: ACPI: XSDT 0x00000000DB9B8F18 000064 (v01 BOCHS BXPC 00000001 01000013)
Sep 9 04:58:42.744299 kernel: ACPI: FACP 0x00000000DB9B8B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:58:42.744305 kernel: ACPI: DSDT 0x00000000DB904018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:58:42.744310 kernel: ACPI: APIC 0x00000000DB9B8C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:58:42.744318 kernel: ACPI: PPTT 0x00000000DB9B8098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:58:42.744325 kernel: ACPI: GTDT 0x00000000DB9B8818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:58:42.744331 kernel: ACPI: MCFG 0x00000000DB9B8A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:58:42.744337 kernel: ACPI: SPCR 0x00000000DB9B8918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:58:42.744343 kernel: ACPI: DBG2 0x00000000DB9B8998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:58:42.744349 kernel: ACPI: IORT 0x00000000DB9B8198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:58:42.744355 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600
Sep 9 04:58:42.744361 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 9 04:58:42.744366 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 04:58:42.744372 kernel: NODE_DATA(0) allocated [mem 0xdc737a00-0xdc73efff]
Sep 9 04:58:42.744382 kernel: Zone ranges:
Sep 9 04:58:42.744389 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 04:58:42.744395 kernel: DMA32 empty
Sep 9 04:58:42.744401 kernel: Normal empty
Sep 9 04:58:42.744407 kernel: Device empty
Sep 9 04:58:42.744413 kernel: Movable zone start for each node
Sep 9 04:58:42.744419 kernel: Early memory node ranges
Sep 9 04:58:42.744425 kernel: node 0: [mem 0x0000000040000000-0x00000000dbb4ffff]
Sep 9 04:58:42.744431 kernel: node 0: [mem 0x00000000dbb50000-0x00000000dbe7ffff]
Sep 9 04:58:42.744437 kernel: node 0: [mem 0x00000000dbe80000-0x00000000dbe9ffff]
Sep 9 04:58:42.744443 kernel: node 0: [mem 0x00000000dbea0000-0x00000000dbedffff]
Sep 9 04:58:42.744448 kernel: node 0: [mem 0x00000000dbee0000-0x00000000dbf1ffff]
Sep 9 04:58:42.744454 kernel: node 0: [mem 0x00000000dbf20000-0x00000000dbf6ffff]
Sep 9 04:58:42.744461 kernel: node 0: [mem 0x00000000dbf70000-0x00000000dcbfffff]
Sep 9 04:58:42.744468 kernel: node 0: [mem 0x00000000dcc00000-0x00000000dcfdffff]
Sep 9 04:58:42.744474 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff]
Sep 9 04:58:42.744493 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 04:58:42.744500 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges
Sep 9 04:58:42.744506 kernel: cma: Reserved 16 MiB at 0x00000000d7a00000 on node -1
Sep 9 04:58:42.744513 kernel: psci: probing for conduit method from ACPI.
Sep 9 04:58:42.744521 kernel: psci: PSCIv1.1 detected in firmware.
Sep 9 04:58:42.744528 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 9 04:58:42.744534 kernel: psci: Trusted OS migration not required
Sep 9 04:58:42.744540 kernel: psci: SMC Calling Convention v1.1
Sep 9 04:58:42.744547 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 9 04:58:42.744553 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 9 04:58:42.744560 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 9 04:58:42.744566 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Sep 9 04:58:42.744573 kernel: Detected PIPT I-cache on CPU0
Sep 9 04:58:42.744580 kernel: CPU features: detected: GIC system register CPU interface
Sep 9 04:58:42.744587 kernel: CPU features: detected: Spectre-v4
Sep 9 04:58:42.744593 kernel: CPU features: detected: Spectre-BHB
Sep 9 04:58:42.744599 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 9 04:58:42.744606 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 9 04:58:42.744612 kernel: CPU features: detected: ARM erratum 1418040
Sep 9 04:58:42.744619 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 9 04:58:42.744625 kernel: alternatives: applying boot alternatives
Sep 9 04:58:42.744632 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90
Sep 9 04:58:42.744639 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 04:58:42.744645 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 9 04:58:42.744653 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 04:58:42.744660 kernel: Fallback order for Node 0: 0
Sep 9 04:58:42.744666 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072
Sep 9 04:58:42.744672 kernel: Policy zone: DMA
Sep 9 04:58:42.744678 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 04:58:42.744685 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB
Sep 9 04:58:42.744691 kernel: software IO TLB: area num 4.
Sep 9 04:58:42.744697 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB
Sep 9 04:58:42.744704 kernel: software IO TLB: mapped [mem 0x00000000db504000-0x00000000db904000] (4MB)
Sep 9 04:58:42.744710 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 9 04:58:42.744716 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 04:58:42.744730 kernel: rcu: RCU event tracing is enabled.
Sep 9 04:58:42.744738 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 9 04:58:42.744745 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 04:58:42.744751 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 04:58:42.744758 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 04:58:42.744764 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 9 04:58:42.744771 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 04:58:42.744777 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 04:58:42.744784 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 9 04:58:42.744790 kernel: GICv3: 256 SPIs implemented
Sep 9 04:58:42.744796 kernel: GICv3: 0 Extended SPIs implemented
Sep 9 04:58:42.744802 kernel: Root IRQ handler: gic_handle_irq
Sep 9 04:58:42.744810 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 9 04:58:42.744817 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Sep 9 04:58:42.744823 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 9 04:58:42.744829 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 9 04:58:42.744836 kernel: ITS@0x0000000008080000: allocated 8192 Devices @40110000 (indirect, esz 8, psz 64K, shr 1)
Sep 9 04:58:42.744843 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @40120000 (flat, esz 8, psz 64K, shr 1)
Sep 9 04:58:42.744849 kernel: GICv3: using LPI property table @0x0000000040130000
Sep 9 04:58:42.744855 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040140000
Sep 9 04:58:42.744862 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 04:58:42.744868 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:58:42.744875 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 9 04:58:42.744881 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 9 04:58:42.744889 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 9 04:58:42.744895 kernel: arm-pv: using stolen time PV
Sep 9 04:58:42.744902 kernel: Console: colour dummy device 80x25
Sep 9 04:58:42.744908 kernel: ACPI: Core revision 20240827
Sep 9 04:58:42.744915 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 9 04:58:42.744922 kernel: pid_max: default: 32768 minimum: 301
Sep 9 04:58:42.744928 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 04:58:42.744935 kernel: landlock: Up and running.
Sep 9 04:58:42.744941 kernel: SELinux: Initializing.
Sep 9 04:58:42.744949 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 04:58:42.744956 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 04:58:42.744962 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 04:58:42.744969 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 04:58:42.744976 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 04:58:42.744982 kernel: Remapping and enabling EFI services.
Sep 9 04:58:42.744989 kernel: smp: Bringing up secondary CPUs ...
Sep 9 04:58:42.744995 kernel: Detected PIPT I-cache on CPU1
Sep 9 04:58:42.745002 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 9 04:58:42.745010 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040150000
Sep 9 04:58:42.745021 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:58:42.745028 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 9 04:58:42.745036 kernel: Detected PIPT I-cache on CPU2
Sep 9 04:58:42.745043 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Sep 9 04:58:42.745050 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040160000
Sep 9 04:58:42.745057 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:58:42.745063 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Sep 9 04:58:42.745070 kernel: Detected PIPT I-cache on CPU3
Sep 9 04:58:42.745078 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Sep 9 04:58:42.745086 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040170000
Sep 9 04:58:42.745093 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:58:42.745099 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Sep 9 04:58:42.745106 kernel: smp: Brought up 1 node, 4 CPUs
Sep 9 04:58:42.745113 kernel: SMP: Total of 4 processors activated.
Sep 9 04:58:42.745119 kernel: CPU: All CPU(s) started at EL1
Sep 9 04:58:42.745126 kernel: CPU features: detected: 32-bit EL0 Support
Sep 9 04:58:42.745133 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 9 04:58:42.745141 kernel: CPU features: detected: Common not Private translations
Sep 9 04:58:42.745148 kernel: CPU features: detected: CRC32 instructions
Sep 9 04:58:42.745155 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 9 04:58:42.745162 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 9 04:58:42.745169 kernel: CPU features: detected: LSE atomic instructions
Sep 9 04:58:42.745175 kernel: CPU features: detected: Privileged Access Never
Sep 9 04:58:42.745182 kernel: CPU features: detected: RAS Extension Support
Sep 9 04:58:42.745189 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 9 04:58:42.745196 kernel: alternatives: applying system-wide alternatives
Sep 9 04:58:42.745209 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Sep 9 04:58:42.745218 kernel: Memory: 2422372K/2572288K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38976K init, 1038K bss, 127580K reserved, 16384K cma-reserved)
Sep 9 04:58:42.745225 kernel: devtmpfs: initialized
Sep 9 04:58:42.745232 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 04:58:42.745238 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 9 04:58:42.745245 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 9 04:58:42.745252 kernel: 0 pages in range for non-PLT usage
Sep 9 04:58:42.745259 kernel: 508560 pages in range for PLT usage
Sep 9 04:58:42.745266 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 04:58:42.745275 kernel: SMBIOS 3.0.0 present.
Sep 9 04:58:42.745282 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022
Sep 9 04:58:42.745289 kernel: DMI: Memory slots populated: 1/1
Sep 9 04:58:42.745296 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 04:58:42.745303 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 9 04:58:42.745310 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 9 04:58:42.745316 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 9 04:58:42.745323 kernel: audit: initializing netlink subsys (disabled)
Sep 9 04:58:42.745330 kernel: audit: type=2000 audit(0.023:1): state=initialized audit_enabled=0 res=1
Sep 9 04:58:42.745339 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 04:58:42.745346 kernel: cpuidle: using governor menu
Sep 9 04:58:42.745353 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 9 04:58:42.745360 kernel: ASID allocator initialised with 32768 entries
Sep 9 04:58:42.745367 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 04:58:42.745374 kernel: Serial: AMBA PL011 UART driver
Sep 9 04:58:42.745381 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 04:58:42.745388 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 04:58:42.745395 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 9 04:58:42.745404 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 9 04:58:42.745411 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 04:58:42.745418 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 04:58:42.745425 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 9 04:58:42.745432 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 9 04:58:42.745439 kernel: ACPI: Added _OSI(Module Device)
Sep 9 04:58:42.745446 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 04:58:42.745453 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 04:58:42.745460 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 04:58:42.745468 kernel: ACPI: Interpreter enabled
Sep 9 04:58:42.745475 kernel: ACPI: Using GIC for interrupt routing
Sep 9 04:58:42.745562 kernel: ACPI: MCFG table detected, 1 entries
Sep 9 04:58:42.745574 kernel: ACPI: CPU0 has been hot-added
Sep 9 04:58:42.745582 kernel: ACPI: CPU1 has been hot-added
Sep 9 04:58:42.745589 kernel: ACPI: CPU2 has been hot-added
Sep 9 04:58:42.745596 kernel: ACPI: CPU3 has been hot-added
Sep 9 04:58:42.745603 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 9 04:58:42.745610 kernel: printk: legacy console [ttyAMA0] enabled
Sep 9 04:58:42.745620 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 9 04:58:42.745759 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 04:58:42.745825 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 9 04:58:42.745884 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 9 04:58:42.745943 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 9 04:58:42.746000 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 9 04:58:42.746009 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 9 04:58:42.746019 kernel: PCI host bridge to bus 0000:00
Sep 9 04:58:42.746082 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 9 04:58:42.746138 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 9 04:58:42.746191 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 9 04:58:42.746254 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 9 04:58:42.746333 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Sep 9 04:58:42.746412 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 9 04:58:42.746478 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f]
Sep 9 04:58:42.746559 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]
Sep 9 04:58:42.746619 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 9 04:58:42.746678 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Sep 9 04:58:42.746747 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned
Sep 9 04:58:42.746811 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned
Sep 9 04:58:42.746865 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 9 04:58:42.746921 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 9 04:58:42.746987 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 9 04:58:42.746996 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 9 04:58:42.747003 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 9 04:58:42.747010 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 9 04:58:42.747017 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 9 04:58:42.747024 kernel: iommu: Default domain type: Translated
Sep 9 04:58:42.747031 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 9 04:58:42.747044 kernel: efivars: Registered efivars operations
Sep 9 04:58:42.747051 kernel: vgaarb: loaded
Sep 9 04:58:42.747058 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 9 04:58:42.747065 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 04:58:42.747072 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 04:58:42.747079 kernel: pnp: PnP ACPI init
Sep 9 04:58:42.747149 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Sep 9 04:58:42.747160 kernel: pnp: PnP ACPI: found 1 devices
Sep 9 04:58:42.747169 kernel: NET: Registered PF_INET protocol family
Sep 9 04:58:42.747176 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 9 04:58:42.747183 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 9 04:58:42.747190 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 04:58:42.747197 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 9 04:58:42.747210 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 9 04:58:42.747219 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 9 04:58:42.747226 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 04:58:42.747233 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 04:58:42.747242 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 04:58:42.747249 kernel: PCI: CLS 0 bytes, default 64
Sep 9 04:58:42.747256 kernel: kvm [1]: HYP mode not available
Sep 9 04:58:42.747263 kernel: Initialise system trusted keyrings
Sep 9 04:58:42.747269 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 9 04:58:42.747276 kernel: Key type asymmetric registered
Sep 9 04:58:42.747283 kernel: Asymmetric key parser 'x509' registered
Sep 9 04:58:42.747290 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 9 04:58:42.747297 kernel: io scheduler mq-deadline registered
Sep 9 04:58:42.747305 kernel: io scheduler kyber registered
Sep 9 04:58:42.747312 kernel: io scheduler bfq registered
Sep 9 04:58:42.747320 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Sep 9 04:58:42.747327 kernel: ACPI: button: Power Button [PWRB]
Sep 9 04:58:42.747334 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Sep 9 04:58:42.747406 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007)
Sep 9 04:58:42.747417 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 9 04:58:42.747424 kernel: thunder_xcv, ver 1.0
Sep 9 04:58:42.747431 kernel: thunder_bgx, ver 1.0
Sep 9 04:58:42.747440 kernel: nicpf, ver 1.0
Sep 9 04:58:42.747446 kernel: nicvf, ver 1.0
Sep 9 04:58:42.747547 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 9 04:58:42.747611 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-09T04:58:42 UTC (1757393922)
Sep 9 04:58:42.747622 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 9 04:58:42.747629 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 9 04:58:42.747636 kernel: watchdog: NMI not fully supported
Sep 9 04:58:42.747643 kernel: watchdog: Hard watchdog permanently disabled
Sep 9 04:58:42.747654 kernel: NET: Registered PF_INET6 protocol family
Sep 9 04:58:42.747661 kernel: Segment Routing with IPv6
Sep 9 04:58:42.747668 kernel: In-situ OAM (IOAM) with IPv6
Sep 9 04:58:42.747675 kernel: NET: Registered PF_PACKET protocol family
Sep 9 04:58:42.747682 kernel: Key type dns_resolver registered
Sep 9 04:58:42.747689 kernel: registered taskstats version 1
Sep 9 04:58:42.747696 kernel: Loading compiled-in X.509 certificates
Sep 9 04:58:42.747703 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 44d1e8b5c5ffbaa3cedd99c03d41580671fabec5'
Sep 9 04:58:42.747710 kernel: Demotion targets for Node 0: null
Sep 9 04:58:42.747718 kernel: Key type .fscrypt registered
Sep 9 04:58:42.747731 kernel: Key type fscrypt-provisioning registered
Sep 9 04:58:42.747738 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 9 04:58:42.747745 kernel: ima: Allocated hash algorithm: sha1
Sep 9 04:58:42.747752 kernel: ima: No architecture policies found
Sep 9 04:58:42.747758 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 9 04:58:42.747765 kernel: clk: Disabling unused clocks
Sep 9 04:58:42.747772 kernel: PM: genpd: Disabling unused power domains
Sep 9 04:58:42.747779 kernel: Warning: unable to open an initial console.
Sep 9 04:58:42.747788 kernel: Freeing unused kernel memory: 38976K
Sep 9 04:58:42.747795 kernel: Run /init as init process
Sep 9 04:58:42.747802 kernel: with arguments:
Sep 9 04:58:42.747809 kernel: /init
Sep 9 04:58:42.747815 kernel: with environment:
Sep 9 04:58:42.747822 kernel: HOME=/
Sep 9 04:58:42.747829 kernel: TERM=linux
Sep 9 04:58:42.747836 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 9 04:58:42.747843 systemd[1]: Successfully made /usr/ read-only.
Sep 9 04:58:42.747854 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 04:58:42.747862 systemd[1]: Detected virtualization kvm.
Sep 9 04:58:42.747870 systemd[1]: Detected architecture arm64.
Sep 9 04:58:42.747877 systemd[1]: Running in initrd.
Sep 9 04:58:42.747884 systemd[1]: No hostname configured, using default hostname.
Sep 9 04:58:42.747892 systemd[1]: Hostname set to .
Sep 9 04:58:42.747899 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 04:58:42.747907 systemd[1]: Queued start job for default target initrd.target.
Sep 9 04:58:42.747915 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:58:42.747922 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:58:42.747930 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 04:58:42.747938 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 04:58:42.747945 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 04:58:42.747953 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 04:58:42.747963 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 04:58:42.747971 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 04:58:42.747978 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:58:42.747986 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:58:42.747993 systemd[1]: Reached target paths.target - Path Units.
Sep 9 04:58:42.748001 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 04:58:42.748008 systemd[1]: Reached target swap.target - Swaps.
Sep 9 04:58:42.748015 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 04:58:42.748024 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 04:58:42.748031 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 04:58:42.748039 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 04:58:42.748047 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 04:58:42.748054 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:58:42.748062 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:58:42.748069 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:58:42.748077 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 04:58:42.748084 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 04:58:42.748093 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 04:58:42.748101 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 9 04:58:42.748112 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 9 04:58:42.748119 systemd[1]: Starting systemd-fsck-usr.service...
Sep 9 04:58:42.748127 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 04:58:42.748134 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 04:58:42.748142 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:58:42.748149 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 9 04:58:42.748159 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:58:42.748166 systemd[1]: Finished systemd-fsck-usr.service.
Sep 9 04:58:42.748189 systemd-journald[244]: Collecting audit messages is disabled.
Sep 9 04:58:42.748217 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 04:58:42.748226 systemd-journald[244]: Journal started
Sep 9 04:58:42.748243 systemd-journald[244]: Runtime Journal (/run/log/journal/ffad0180eafd45589ef1d36a9c4d356d) is 6M, max 48.5M, 42.4M free.
Sep 9 04:58:42.757563 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 9 04:58:42.757586 kernel: Bridge firewalling registered
Sep 9 04:58:42.757595 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:58:42.740740 systemd-modules-load[245]: Inserted module 'overlay'
Sep 9 04:58:42.754633 systemd-modules-load[245]: Inserted module 'br_netfilter'
Sep 9 04:58:42.762516 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 04:58:42.762683 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:58:42.764764 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:58:42.766953 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 9 04:58:42.769445 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 04:58:42.782604 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 04:58:42.786589 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 04:58:42.788924 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:58:42.792855 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:58:42.794860 systemd-tmpfiles[273]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 9 04:58:42.794984 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 04:58:42.796747 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 9 04:58:42.799515 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:58:42.809758 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 04:58:42.821651 dracut-cmdline[288]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90
Sep 9 04:58:42.838148 systemd-resolved[292]: Positive Trust Anchors:
Sep 9 04:58:42.838169 systemd-resolved[292]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 04:58:42.838199 systemd-resolved[292]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 04:58:42.843173 systemd-resolved[292]: Defaulting to hostname 'linux'.
Sep 9 04:58:42.844226 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 04:58:42.847562 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:58:42.896512 kernel: SCSI subsystem initialized
Sep 9 04:58:42.900506 kernel: Loading iSCSI transport class v2.0-870.
Sep 9 04:58:42.910510 kernel: iscsi: registered transport (tcp)
Sep 9 04:58:42.921521 kernel: iscsi: registered transport (qla4xxx)
Sep 9 04:58:42.921572 kernel: QLogic iSCSI HBA Driver
Sep 9 04:58:42.938478 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 04:58:42.966198 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:58:42.968784 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 04:58:43.013024 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 9 04:58:43.015108 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 9 04:58:43.080541 kernel: raid6: neonx8 gen() 15783 MB/s
Sep 9 04:58:43.095508 kernel: raid6: neonx4 gen() 15697 MB/s
Sep 9 04:58:43.112504 kernel: raid6: neonx2 gen() 13144 MB/s
Sep 9 04:58:43.129503 kernel: raid6: neonx1 gen() 10432 MB/s
Sep 9 04:58:43.146502 kernel: raid6: int64x8 gen() 6852 MB/s
Sep 9 04:58:43.163499 kernel: raid6: int64x4 gen() 7318 MB/s
Sep 9 04:58:43.180499 kernel: raid6: int64x2 gen() 6058 MB/s
Sep 9 04:58:43.197502 kernel: raid6: int64x1 gen() 5017 MB/s
Sep 9 04:58:43.197523 kernel: raid6: using algorithm neonx8 gen() 15783 MB/s
Sep 9 04:58:43.214508 kernel: raid6: .... xor() 11934 MB/s, rmw enabled
Sep 9 04:58:43.214527 kernel: raid6: using neon recovery algorithm
Sep 9 04:58:43.219650 kernel: xor: measuring software checksum speed
Sep 9 04:58:43.219677 kernel: 8regs : 20807 MB/sec
Sep 9 04:58:43.220746 kernel: 32regs : 21681 MB/sec
Sep 9 04:58:43.220762 kernel: arm64_neon : 27007 MB/sec
Sep 9 04:58:43.220773 kernel: xor: using function: arm64_neon (27007 MB/sec)
Sep 9 04:58:43.274525 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 9 04:58:43.280400 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 04:58:43.282962 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:58:43.310440 systemd-udevd[502]: Using default interface naming scheme 'v255'.
Sep 9 04:58:43.314526 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:58:43.316724 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 9 04:58:43.345028 dracut-pre-trigger[510]: rd.md=0: removing MD RAID activation
Sep 9 04:58:43.368593 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 04:58:43.370839 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 04:58:43.424432 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:58:43.427307 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 9 04:58:43.483716 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues
Sep 9 04:58:43.483906 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Sep 9 04:58:43.486976 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 9 04:58:43.487028 kernel: GPT:9289727 != 19775487
Sep 9 04:58:43.487038 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 9 04:58:43.487047 kernel: GPT:9289727 != 19775487 Sep 9 04:58:43.487769 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 9 04:58:43.487805 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 04:58:43.490563 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 04:58:43.490684 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 04:58:43.494371 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 04:58:43.497137 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 04:58:43.520455 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 9 04:58:43.521656 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 9 04:58:43.528538 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 04:58:43.536786 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 9 04:58:43.544180 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 04:58:43.550181 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 9 04:58:43.551298 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 9 04:58:43.553271 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 04:58:43.555699 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 04:58:43.557397 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 04:58:43.559954 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 9 04:58:43.561639 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 04:58:43.586986 disk-uuid[592]: Primary Header is updated. 
Sep 9 04:58:43.586986 disk-uuid[592]: Secondary Entries is updated. Sep 9 04:58:43.586986 disk-uuid[592]: Secondary Header is updated. Sep 9 04:58:43.591610 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 9 04:58:43.594494 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 04:58:43.597512 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 04:58:44.599148 disk-uuid[596]: The operation has completed successfully. Sep 9 04:58:44.600160 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 04:58:44.623114 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 04:58:44.623227 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 04:58:44.648100 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 04:58:44.674642 sh[612]: Success Sep 9 04:58:44.686521 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 9 04:58:44.686570 kernel: device-mapper: uevent: version 1.0.3 Sep 9 04:58:44.686582 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 04:58:44.693508 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Sep 9 04:58:44.717757 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 04:58:44.719375 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 04:58:44.732878 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Sep 9 04:58:44.737603 kernel: BTRFS: device fsid 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (625) Sep 9 04:58:44.737630 kernel: BTRFS info (device dm-0): first mount of filesystem 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364 Sep 9 04:58:44.739136 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 9 04:58:44.742599 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 04:58:44.742619 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 04:58:44.743621 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 9 04:58:44.744683 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 9 04:58:44.745758 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 9 04:58:44.746519 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 04:58:44.749146 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 9 04:58:44.771224 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (657) Sep 9 04:58:44.771262 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:58:44.772111 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 9 04:58:44.774499 kernel: BTRFS info (device vda6): turning on async discard Sep 9 04:58:44.774531 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 04:58:44.778504 kernel: BTRFS info (device vda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:58:44.779576 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 9 04:58:44.781313 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Sep 9 04:58:44.847130 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 04:58:44.850037 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 04:58:44.894979 ignition[699]: Ignition 2.22.0 Sep 9 04:58:44.894995 ignition[699]: Stage: fetch-offline Sep 9 04:58:44.895022 ignition[699]: no configs at "/usr/lib/ignition/base.d" Sep 9 04:58:44.896013 systemd-networkd[798]: lo: Link UP Sep 9 04:58:44.895030 ignition[699]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 04:58:44.896016 systemd-networkd[798]: lo: Gained carrier Sep 9 04:58:44.895112 ignition[699]: parsed url from cmdline: "" Sep 9 04:58:44.896702 systemd-networkd[798]: Enumeration completed Sep 9 04:58:44.895115 ignition[699]: no config URL provided Sep 9 04:58:44.897074 systemd-networkd[798]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 04:58:44.895119 ignition[699]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 04:58:44.897077 systemd-networkd[798]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 04:58:44.895126 ignition[699]: no config at "/usr/lib/ignition/user.ign" Sep 9 04:58:44.897407 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 04:58:44.895145 ignition[699]: op(1): [started] loading QEMU firmware config module Sep 9 04:58:44.897504 systemd-networkd[798]: eth0: Link UP Sep 9 04:58:44.895155 ignition[699]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 9 04:58:44.897800 systemd-networkd[798]: eth0: Gained carrier Sep 9 04:58:44.900290 ignition[699]: op(1): [finished] loading QEMU firmware config module Sep 9 04:58:44.897809 systemd-networkd[798]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 04:58:44.898960 systemd[1]: Reached target network.target - Network. 
Sep 9 04:58:44.934534 systemd-networkd[798]: eth0: DHCPv4 address 10.0.0.72/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 9 04:58:44.956099 ignition[699]: parsing config with SHA512: 9b51888e85ff33c9199508a71408beb47ed961d309160f04118fc788eb139c956cfa50c97afde19732d6c44518d444579f82a889e36ef7b48aafb2c8b4241967 Sep 9 04:58:44.962503 unknown[699]: fetched base config from "system" Sep 9 04:58:44.962514 unknown[699]: fetched user config from "qemu" Sep 9 04:58:44.962864 ignition[699]: fetch-offline: fetch-offline passed Sep 9 04:58:44.962915 ignition[699]: Ignition finished successfully Sep 9 04:58:44.965456 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 04:58:44.967063 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 9 04:58:44.967827 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 9 04:58:44.997237 ignition[814]: Ignition 2.22.0 Sep 9 04:58:44.997254 ignition[814]: Stage: kargs Sep 9 04:58:44.997393 ignition[814]: no configs at "/usr/lib/ignition/base.d" Sep 9 04:58:44.997401 ignition[814]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 04:58:44.998144 ignition[814]: kargs: kargs passed Sep 9 04:58:45.000420 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 9 04:58:44.998183 ignition[814]: Ignition finished successfully Sep 9 04:58:45.002239 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 9 04:58:45.028713 ignition[822]: Ignition 2.22.0 Sep 9 04:58:45.028730 ignition[822]: Stage: disks Sep 9 04:58:45.028859 ignition[822]: no configs at "/usr/lib/ignition/base.d" Sep 9 04:58:45.028867 ignition[822]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 04:58:45.031641 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
Sep 9 04:58:45.029584 ignition[822]: disks: disks passed Sep 9 04:58:45.032986 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 9 04:58:45.029624 ignition[822]: Ignition finished successfully Sep 9 04:58:45.034319 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 9 04:58:45.035593 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 04:58:45.036981 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 04:58:45.038311 systemd[1]: Reached target basic.target - Basic System. Sep 9 04:58:45.040704 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 9 04:58:45.072125 systemd-fsck[833]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 9 04:58:45.075456 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 9 04:58:45.077176 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 9 04:58:45.133507 kernel: EXT4-fs (vda9): mounted filesystem 88574756-967d-44b3-be66-46689c8baf27 r/w with ordered data mode. Quota mode: none. Sep 9 04:58:45.133543 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 9 04:58:45.134616 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 9 04:58:45.137271 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 04:58:45.139302 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 9 04:58:45.140233 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 9 04:58:45.140273 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 9 04:58:45.140297 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 04:58:45.156989 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
Sep 9 04:58:45.158865 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 9 04:58:45.163336 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (841) Sep 9 04:58:45.163365 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:58:45.163376 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 9 04:58:45.166575 kernel: BTRFS info (device vda6): turning on async discard Sep 9 04:58:45.166603 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 04:58:45.168168 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 04:58:45.191420 initrd-setup-root[867]: cut: /sysroot/etc/passwd: No such file or directory Sep 9 04:58:45.195351 initrd-setup-root[874]: cut: /sysroot/etc/group: No such file or directory Sep 9 04:58:45.199128 initrd-setup-root[881]: cut: /sysroot/etc/shadow: No such file or directory Sep 9 04:58:45.202325 initrd-setup-root[888]: cut: /sysroot/etc/gshadow: No such file or directory Sep 9 04:58:45.265153 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 9 04:58:45.266966 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 9 04:58:45.268345 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 9 04:58:45.281535 kernel: BTRFS info (device vda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:58:45.296577 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Sep 9 04:58:45.311158 ignition[957]: INFO : Ignition 2.22.0 Sep 9 04:58:45.311158 ignition[957]: INFO : Stage: mount Sep 9 04:58:45.312467 ignition[957]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 04:58:45.312467 ignition[957]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 04:58:45.312467 ignition[957]: INFO : mount: mount passed Sep 9 04:58:45.312467 ignition[957]: INFO : Ignition finished successfully Sep 9 04:58:45.315564 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 9 04:58:45.317301 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 9 04:58:45.738982 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 9 04:58:45.740520 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 04:58:45.763499 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (969) Sep 9 04:58:45.765135 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 04:58:45.765165 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 9 04:58:45.768524 kernel: BTRFS info (device vda6): turning on async discard Sep 9 04:58:45.768564 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 04:58:45.768594 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 9 04:58:45.808617 ignition[986]: INFO : Ignition 2.22.0 Sep 9 04:58:45.808617 ignition[986]: INFO : Stage: files Sep 9 04:58:45.810127 ignition[986]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 04:58:45.810127 ignition[986]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 04:58:45.810127 ignition[986]: DEBUG : files: compiled without relabeling support, skipping Sep 9 04:58:45.812899 ignition[986]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 9 04:58:45.812899 ignition[986]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 9 04:58:45.812899 ignition[986]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 9 04:58:45.812899 ignition[986]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 9 04:58:45.812899 ignition[986]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 9 04:58:45.812517 unknown[986]: wrote ssh authorized keys file for user: core Sep 9 04:58:45.819168 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Sep 9 04:58:45.819168 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Sep 9 04:58:45.892704 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 9 04:58:46.135943 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Sep 9 04:58:46.135943 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 9 04:58:46.139492 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 9 04:58:46.139492 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 9 04:58:46.139492 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 9 04:58:46.139492 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 04:58:46.139492 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 04:58:46.139492 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 04:58:46.139492 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 04:58:46.139492 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 04:58:46.139492 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 04:58:46.139492 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 9 04:58:46.139492 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 9 04:58:46.139492 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 9 04:58:46.156357 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1
Sep 9 04:58:46.592165 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 9 04:58:46.798694 systemd-networkd[798]: eth0: Gained IPv6LL Sep 9 04:58:47.015130 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 9 04:58:47.015130 ignition[986]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 9 04:58:47.018571 ignition[986]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 04:58:47.020206 ignition[986]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 04:58:47.020206 ignition[986]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 9 04:58:47.020206 ignition[986]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 9 04:58:47.020206 ignition[986]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 9 04:58:47.020206 ignition[986]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 9 04:58:47.020206 ignition[986]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 9 04:58:47.020206 ignition[986]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Sep 9 04:58:47.035540 ignition[986]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 9 04:58:47.039242 ignition[986]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 9 04:58:47.041693 ignition[986]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 9 04:58:47.041693 ignition[986]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Sep 9 04:58:47.041693 ignition[986]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Sep 9 04:58:47.041693 ignition[986]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 9 04:58:47.041693 ignition[986]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 9 04:58:47.041693 ignition[986]: INFO : files: files passed Sep 9 04:58:47.041693 ignition[986]: INFO : Ignition finished successfully Sep 9 04:58:47.042115 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 9 04:58:47.045736 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 9 04:58:47.049963 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 9 04:58:47.065389 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 9 04:58:47.066529 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 9 04:58:47.068477 initrd-setup-root-after-ignition[1014]: grep: /sysroot/oem/oem-release: No such file or directory Sep 9 04:58:47.070816 initrd-setup-root-after-ignition[1017]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 04:58:47.070816 initrd-setup-root-after-ignition[1017]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 9 04:58:47.074188 initrd-setup-root-after-ignition[1021]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 04:58:47.074568 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 04:58:47.076740 systemd[1]: Reached target ignition-complete.target - Ignition Complete. 
Sep 9 04:58:47.079169 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 9 04:58:47.118462 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 9 04:58:47.119393 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 9 04:58:47.120758 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 9 04:58:47.122438 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 9 04:58:47.124128 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 9 04:58:47.124956 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 9 04:58:47.150704 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 04:58:47.152953 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 9 04:58:47.173565 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 9 04:58:47.175452 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 04:58:47.177319 systemd[1]: Stopped target timers.target - Timer Units. Sep 9 04:58:47.178819 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 9 04:58:47.179664 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 04:58:47.181568 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 9 04:58:47.182610 systemd[1]: Stopped target basic.target - Basic System. Sep 9 04:58:47.184091 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 9 04:58:47.185521 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 04:58:47.187385 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 9 04:58:47.188992 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. 
Sep 9 04:58:47.190599 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 9 04:58:47.192154 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 04:58:47.193770 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 9 04:58:47.195526 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 9 04:58:47.196925 systemd[1]: Stopped target swap.target - Swaps. Sep 9 04:58:47.198225 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 9 04:58:47.198357 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 9 04:58:47.200256 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 9 04:58:47.201805 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 04:58:47.203530 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 9 04:58:47.204598 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 04:58:47.206084 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 9 04:58:47.206213 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 9 04:58:47.208467 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 9 04:58:47.208608 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 04:58:47.210439 systemd[1]: Stopped target paths.target - Path Units. Sep 9 04:58:47.211841 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 9 04:58:47.216559 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 04:58:47.217693 systemd[1]: Stopped target slices.target - Slice Units. Sep 9 04:58:47.219319 systemd[1]: Stopped target sockets.target - Socket Units. Sep 9 04:58:47.220640 systemd[1]: iscsid.socket: Deactivated successfully. 
Sep 9 04:58:47.220725 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 04:58:47.221915 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 9 04:58:47.221988 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 04:58:47.223435 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 9 04:58:47.223561 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 04:58:47.225167 systemd[1]: ignition-files.service: Deactivated successfully. Sep 9 04:58:47.225281 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 9 04:58:47.227387 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 9 04:58:47.229608 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 9 04:58:47.230518 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 9 04:58:47.230626 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 04:58:47.232403 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 9 04:58:47.232509 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 04:58:47.237165 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 9 04:58:47.237602 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 9 04:58:47.245972 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 9 04:58:47.252327 ignition[1041]: INFO : Ignition 2.22.0 Sep 9 04:58:47.252327 ignition[1041]: INFO : Stage: umount Sep 9 04:58:47.254658 ignition[1041]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 04:58:47.254658 ignition[1041]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 04:58:47.254658 ignition[1041]: INFO : umount: umount passed Sep 9 04:58:47.254658 ignition[1041]: INFO : Ignition finished successfully Sep 9 04:58:47.256663 systemd[1]: ignition-mount.service: Deactivated successfully. 
Sep 9 04:58:47.256759 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 9 04:58:47.257938 systemd[1]: Stopped target network.target - Network. Sep 9 04:58:47.259156 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 9 04:58:47.259219 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 9 04:58:47.260519 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 9 04:58:47.260554 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 9 04:58:47.261922 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 9 04:58:47.261963 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 9 04:58:47.263239 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 9 04:58:47.263276 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 9 04:58:47.264791 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 9 04:58:47.266193 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 9 04:58:47.272759 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 9 04:58:47.272894 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 9 04:58:47.276454 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 9 04:58:47.278228 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 9 04:58:47.278306 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 04:58:47.282175 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 9 04:58:47.282421 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 9 04:58:47.283310 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 9 04:58:47.287052 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. 
Sep 9 04:58:47.287200 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 9 04:58:47.289056 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 9 04:58:47.289086 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 9 04:58:47.291528 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 9 04:58:47.292926 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 9 04:58:47.292977 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 04:58:47.294908 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 9 04:58:47.294953 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 9 04:58:47.297790 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 9 04:58:47.297831 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 9 04:58:47.299495 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 04:58:47.303163 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 9 04:58:47.303456 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 9 04:58:47.303563 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 9 04:58:47.306879 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 9 04:58:47.306959 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 9 04:58:47.314122 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 9 04:58:47.319645 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 04:58:47.320895 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 9 04:58:47.320929 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 9 04:58:47.322345 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. 
Sep 9 04:58:47.322372 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 04:58:47.323838 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 9 04:58:47.323884 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 9 04:58:47.326115 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 9 04:58:47.326160 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 9 04:58:47.328198 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 9 04:58:47.328243 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 04:58:47.331279 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 9 04:58:47.332798 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 9 04:58:47.332853 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 04:58:47.335401 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 9 04:58:47.335444 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 04:58:47.338206 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 9 04:58:47.338268 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 04:58:47.340816 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 9 04:58:47.340859 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 04:58:47.342446 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 04:58:47.342497 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 04:58:47.345623 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 9 04:58:47.345728 systemd[1]: Stopped network-cleanup.service - Network Cleanup. 
Sep 9 04:58:47.350469 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 9 04:58:47.350605 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 9 04:58:47.352419 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 9 04:58:47.354618 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 9 04:58:47.374341 systemd[1]: Switching root. Sep 9 04:58:47.406536 systemd-journald[244]: Journal stopped Sep 9 04:58:48.267461 systemd-journald[244]: Received SIGTERM from PID 1 (systemd). Sep 9 04:58:48.267530 kernel: SELinux: policy capability network_peer_controls=1 Sep 9 04:58:48.268080 kernel: SELinux: policy capability open_perms=1 Sep 9 04:58:48.268103 kernel: SELinux: policy capability extended_socket_class=1 Sep 9 04:58:48.268115 kernel: SELinux: policy capability always_check_network=0 Sep 9 04:58:48.268127 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 9 04:58:48.268137 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 9 04:58:48.268146 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 9 04:58:48.268158 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 9 04:58:48.268167 kernel: SELinux: policy capability userspace_initial_context=0 Sep 9 04:58:48.268200 kernel: audit: type=1403 audit(1757393927.703:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 9 04:58:48.268218 systemd[1]: Successfully loaded SELinux policy in 43.837ms. Sep 9 04:58:48.268234 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.404ms. 
Sep 9 04:58:48.268246 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 04:58:48.268259 systemd[1]: Detected virtualization kvm. Sep 9 04:58:48.268270 systemd[1]: Detected architecture arm64. Sep 9 04:58:48.268280 systemd[1]: Detected first boot. Sep 9 04:58:48.268290 systemd[1]: Initializing machine ID from VM UUID. Sep 9 04:58:48.268300 kernel: NET: Registered PF_VSOCK protocol family Sep 9 04:58:48.268310 zram_generator::config[1086]: No configuration found. Sep 9 04:58:48.268321 systemd[1]: Populated /etc with preset unit settings. Sep 9 04:58:48.268332 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 9 04:58:48.268344 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 9 04:58:48.268355 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 9 04:58:48.268365 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 9 04:58:48.268376 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 9 04:58:48.268386 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 9 04:58:48.268397 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 9 04:58:48.268407 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 9 04:58:48.268420 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 9 04:58:48.268430 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 9 04:58:48.268442 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. 
Sep 9 04:58:48.268452 systemd[1]: Created slice user.slice - User and Session Slice. Sep 9 04:58:48.268463 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 04:58:48.268473 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 04:58:48.268520 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 9 04:58:48.268534 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 9 04:58:48.268545 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 9 04:58:48.268556 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 04:58:48.268566 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Sep 9 04:58:48.268580 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 04:58:48.270581 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 04:58:48.270606 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 9 04:58:48.270617 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 9 04:58:48.270628 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 9 04:58:48.270638 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 9 04:58:48.270649 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 04:58:48.270665 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 04:58:48.270677 systemd[1]: Reached target slices.target - Slice Units. Sep 9 04:58:48.270693 systemd[1]: Reached target swap.target - Swaps. Sep 9 04:58:48.270704 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. 
Sep 9 04:58:48.270715 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 9 04:58:48.270725 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 9 04:58:48.270736 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 04:58:48.270747 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 04:58:48.270757 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 04:58:48.270768 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 9 04:58:48.270780 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 9 04:58:48.270790 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 9 04:58:48.270801 systemd[1]: Mounting media.mount - External Media Directory... Sep 9 04:58:48.270814 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 9 04:58:48.270824 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 9 04:58:48.270834 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 9 04:58:48.270846 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 9 04:58:48.270856 systemd[1]: Reached target machines.target - Containers. Sep 9 04:58:48.270869 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 9 04:58:48.270879 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 04:58:48.270890 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 04:58:48.270900 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... 
Sep 9 04:58:48.270911 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 04:58:48.270922 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 04:58:48.270933 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 04:58:48.270943 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 9 04:58:48.270960 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 04:58:48.270973 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 9 04:58:48.270983 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 9 04:58:48.270994 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 9 04:58:48.271005 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 9 04:58:48.271015 systemd[1]: Stopped systemd-fsck-usr.service. Sep 9 04:58:48.271026 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 04:58:48.271037 kernel: fuse: init (API version 7.41) Sep 9 04:58:48.271048 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 04:58:48.271060 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 04:58:48.271070 kernel: ACPI: bus type drm_connector registered Sep 9 04:58:48.271079 kernel: loop: module loaded Sep 9 04:58:48.271090 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 04:58:48.271100 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 9 04:58:48.271139 systemd-journald[1154]: Collecting audit messages is disabled. 
Sep 9 04:58:48.271165 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 9 04:58:48.271190 systemd-journald[1154]: Journal started Sep 9 04:58:48.271216 systemd-journald[1154]: Runtime Journal (/run/log/journal/ffad0180eafd45589ef1d36a9c4d356d) is 6M, max 48.5M, 42.4M free. Sep 9 04:58:48.069138 systemd[1]: Queued start job for default target multi-user.target. Sep 9 04:58:48.090418 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 9 04:58:48.090774 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 9 04:58:48.276112 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 04:58:48.276164 systemd[1]: verity-setup.service: Deactivated successfully. Sep 9 04:58:48.276805 systemd[1]: Stopped verity-setup.service. Sep 9 04:58:48.281004 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 04:58:48.281606 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 9 04:58:48.282634 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 9 04:58:48.283547 systemd[1]: Mounted media.mount - External Media Directory. Sep 9 04:58:48.284457 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 9 04:58:48.285518 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 9 04:58:48.286419 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 9 04:58:48.289504 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 9 04:58:48.290687 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 04:58:48.291833 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 9 04:58:48.291993 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 9 04:58:48.293166 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Sep 9 04:58:48.293335 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 04:58:48.294476 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 04:58:48.294637 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 04:58:48.295714 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 04:58:48.295863 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 04:58:48.297060 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 9 04:58:48.297225 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 9 04:58:48.298402 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 04:58:48.298568 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 04:58:48.299722 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 04:58:48.300832 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 04:58:48.302024 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 9 04:58:48.303307 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 9 04:58:48.315498 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 04:58:48.317556 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 9 04:58:48.319361 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 9 04:58:48.320500 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 9 04:58:48.320546 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 04:58:48.322134 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. 
Sep 9 04:58:48.330271 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 9 04:58:48.331332 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 04:58:48.332688 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 9 04:58:48.334590 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 9 04:58:48.335715 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 04:58:48.336965 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 9 04:58:48.338432 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 04:58:48.343831 systemd-journald[1154]: Time spent on flushing to /var/log/journal/ffad0180eafd45589ef1d36a9c4d356d is 23.243ms for 882 entries. Sep 9 04:58:48.343831 systemd-journald[1154]: System Journal (/var/log/journal/ffad0180eafd45589ef1d36a9c4d356d) is 8M, max 195.6M, 187.6M free. Sep 9 04:58:48.378092 systemd-journald[1154]: Received client request to flush runtime journal. Sep 9 04:58:48.378131 kernel: loop0: detected capacity change from 0 to 203944 Sep 9 04:58:48.378144 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 9 04:58:48.341014 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 04:58:48.342881 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 9 04:58:48.346280 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 04:58:48.349232 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 04:58:48.350740 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. 
Sep 9 04:58:48.354444 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 9 04:58:48.358280 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 9 04:58:48.361868 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 9 04:58:48.365687 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 9 04:58:48.370717 systemd-tmpfiles[1203]: ACLs are not supported, ignoring. Sep 9 04:58:48.370727 systemd-tmpfiles[1203]: ACLs are not supported, ignoring. Sep 9 04:58:48.377670 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 04:58:48.379721 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 04:58:48.381268 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 9 04:58:48.387641 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 9 04:58:48.390498 kernel: loop1: detected capacity change from 0 to 100632 Sep 9 04:58:48.397542 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 9 04:58:48.414513 kernel: loop2: detected capacity change from 0 to 119368 Sep 9 04:58:48.419144 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 9 04:58:48.421983 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 04:58:48.443569 kernel: loop3: detected capacity change from 0 to 203944 Sep 9 04:58:48.447300 systemd-tmpfiles[1224]: ACLs are not supported, ignoring. Sep 9 04:58:48.447317 systemd-tmpfiles[1224]: ACLs are not supported, ignoring. Sep 9 04:58:48.450498 kernel: loop4: detected capacity change from 0 to 100632 Sep 9 04:58:48.450358 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Sep 9 04:58:48.455505 kernel: loop5: detected capacity change from 0 to 119368 Sep 9 04:58:48.460272 (sd-merge)[1226]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Sep 9 04:58:48.460772 (sd-merge)[1226]: Merged extensions into '/usr'. Sep 9 04:58:48.464062 systemd[1]: Reload requested from client PID 1202 ('systemd-sysext') (unit systemd-sysext.service)... Sep 9 04:58:48.464084 systemd[1]: Reloading... Sep 9 04:58:48.523517 zram_generator::config[1252]: No configuration found. Sep 9 04:58:48.586903 ldconfig[1197]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 9 04:58:48.662141 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 9 04:58:48.662557 systemd[1]: Reloading finished in 198 ms. Sep 9 04:58:48.678040 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 9 04:58:48.679305 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 9 04:58:48.693565 systemd[1]: Starting ensure-sysext.service... Sep 9 04:58:48.695158 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 04:58:48.703768 systemd[1]: Reload requested from client PID 1287 ('systemctl') (unit ensure-sysext.service)... Sep 9 04:58:48.703783 systemd[1]: Reloading... Sep 9 04:58:48.707975 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 9 04:58:48.708010 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 9 04:58:48.708255 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 9 04:58:48.708444 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. 
Sep 9 04:58:48.709090 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 9 04:58:48.709337 systemd-tmpfiles[1288]: ACLs are not supported, ignoring. Sep 9 04:58:48.709386 systemd-tmpfiles[1288]: ACLs are not supported, ignoring. Sep 9 04:58:48.712085 systemd-tmpfiles[1288]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 04:58:48.712098 systemd-tmpfiles[1288]: Skipping /boot Sep 9 04:58:48.717819 systemd-tmpfiles[1288]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 04:58:48.717837 systemd-tmpfiles[1288]: Skipping /boot Sep 9 04:58:48.755534 zram_generator::config[1315]: No configuration found. Sep 9 04:58:48.887511 systemd[1]: Reloading finished in 183 ms. Sep 9 04:58:48.908440 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 9 04:58:48.915514 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 04:58:48.926652 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 04:58:48.928944 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 9 04:58:48.931011 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 9 04:58:48.935670 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 04:58:48.938746 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 04:58:48.942619 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 9 04:58:48.954742 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 9 04:58:48.956313 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 9 04:58:48.961597 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. 
Sep 9 04:58:48.963069 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 04:58:48.965211 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 04:58:48.977672 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 04:58:48.978621 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 04:58:48.978793 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 04:58:48.980913 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 9 04:58:48.983340 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 9 04:58:48.984999 augenrules[1381]: No rules Sep 9 04:58:48.985052 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 04:58:48.985262 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 04:58:48.987268 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 04:58:48.987440 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 04:58:48.988878 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 04:58:48.989022 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 04:58:48.991806 systemd-udevd[1361]: Using default interface naming scheme 'v255'. Sep 9 04:58:48.993674 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 9 04:58:48.995330 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 04:58:48.997529 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 04:58:48.999074 systemd[1]: Finished systemd-update-done.service - Update is Completed. 
Sep 9 04:58:49.010372 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 04:58:49.011336 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 04:58:49.013026 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 04:58:49.015019 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 04:58:49.027834 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 04:58:49.030078 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 04:58:49.032750 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 04:58:49.032894 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 04:58:49.033009 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 9 04:58:49.034084 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 04:58:49.035762 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 04:58:49.039507 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 04:58:49.040890 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 9 04:58:49.042332 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 04:58:49.042507 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 04:58:49.043814 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Sep 9 04:58:49.043980 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 04:58:49.048882 systemd[1]: Finished ensure-sysext.service. Sep 9 04:58:49.053920 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 04:58:49.054139 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 04:58:49.055347 augenrules[1394]: /sbin/augenrules: No change Sep 9 04:58:49.062110 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 04:58:49.062976 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 04:58:49.063049 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 04:58:49.065603 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 9 04:58:49.071589 augenrules[1454]: No rules Sep 9 04:58:49.075234 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 04:58:49.080036 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 04:58:49.084256 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Sep 9 04:58:49.146638 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 04:58:49.148848 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 9 04:58:49.170561 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 9 04:58:49.212042 systemd-networkd[1452]: lo: Link UP Sep 9 04:58:49.212050 systemd-networkd[1452]: lo: Gained carrier Sep 9 04:58:49.212849 systemd-networkd[1452]: Enumeration completed Sep 9 04:58:49.212966 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Sep 9 04:58:49.213259 systemd-resolved[1354]: Positive Trust Anchors: Sep 9 04:58:49.213268 systemd-resolved[1354]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 04:58:49.213300 systemd-resolved[1354]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 04:58:49.213360 systemd-networkd[1452]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 04:58:49.213363 systemd-networkd[1452]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 04:58:49.213858 systemd-networkd[1452]: eth0: Link UP Sep 9 04:58:49.213959 systemd-networkd[1452]: eth0: Gained carrier Sep 9 04:58:49.213972 systemd-networkd[1452]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 04:58:49.216599 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 9 04:58:49.218739 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 9 04:58:49.224361 systemd-resolved[1354]: Defaulting to hostname 'linux'. Sep 9 04:58:49.228859 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 04:58:49.230577 systemd[1]: Reached target network.target - Network. Sep 9 04:58:49.231374 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Sep 9 04:58:49.236560 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 9 04:58:49.237580 systemd-networkd[1452]: eth0: DHCPv4 address 10.0.0.72/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 9 04:58:49.237924 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 04:58:49.238259 systemd-timesyncd[1453]: Network configuration changed, trying to establish connection. Sep 9 04:58:49.238924 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 9 04:58:49.240124 systemd-timesyncd[1453]: Contacted time server 10.0.0.1:123 (10.0.0.1). Sep 9 04:58:49.240257 systemd-timesyncd[1453]: Initial clock synchronization to Tue 2025-09-09 04:58:49.270274 UTC. Sep 9 04:58:49.241729 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 9 04:58:49.242846 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 9 04:58:49.243896 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 9 04:58:49.243927 systemd[1]: Reached target paths.target - Path Units. Sep 9 04:58:49.244691 systemd[1]: Reached target time-set.target - System Time Set. Sep 9 04:58:49.245934 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 9 04:58:49.246983 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 9 04:58:49.248062 systemd[1]: Reached target timers.target - Timer Units. Sep 9 04:58:49.251189 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 9 04:58:49.253679 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 9 04:58:49.256459 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). 
Sep 9 04:58:49.257793 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 9 04:58:49.258816 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 9 04:58:49.267757 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 9 04:58:49.269253 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 9 04:58:49.272526 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 9 04:58:49.273801 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 9 04:58:49.282258 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 04:58:49.283254 systemd[1]: Reached target basic.target - Basic System. Sep 9 04:58:49.284081 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 9 04:58:49.284112 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 9 04:58:49.285126 systemd[1]: Starting containerd.service - containerd container runtime... Sep 9 04:58:49.287051 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 9 04:58:49.288814 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 9 04:58:49.298298 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 9 04:58:49.301104 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 9 04:58:49.302001 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 9 04:58:49.302983 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 9 04:58:49.306098 jq[1495]: false Sep 9 04:58:49.305925 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... 
Sep 9 04:58:49.307817 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 9 04:58:49.309798 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 9 04:58:49.314635 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 9 04:58:49.317682 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 04:58:49.317812 extend-filesystems[1496]: Found /dev/vda6 Sep 9 04:58:49.320522 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 9 04:58:49.320948 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 9 04:58:49.321572 systemd[1]: Starting update-engine.service - Update Engine... Sep 9 04:58:49.323121 extend-filesystems[1496]: Found /dev/vda9 Sep 9 04:58:49.325213 extend-filesystems[1496]: Checking size of /dev/vda9 Sep 9 04:58:49.328440 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 9 04:58:49.337520 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 9 04:58:49.337657 jq[1518]: true Sep 9 04:58:49.339098 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 9 04:58:49.339328 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 9 04:58:49.339613 systemd[1]: motdgen.service: Deactivated successfully. Sep 9 04:58:49.339769 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 9 04:58:49.341589 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 9 04:58:49.343526 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Sep 9 04:58:49.358095 update_engine[1510]: I20250909 04:58:49.357887 1510 main.cc:92] Flatcar Update Engine starting Sep 9 04:58:49.360529 extend-filesystems[1496]: Resized partition /dev/vda9 Sep 9 04:58:49.361476 extend-filesystems[1537]: resize2fs 1.47.3 (8-Jul-2025) Sep 9 04:58:49.371243 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 9 04:58:49.370909 (ntainerd)[1534]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 9 04:58:49.374521 jq[1525]: true Sep 9 04:58:49.381653 tar[1523]: linux-arm64/helm Sep 9 04:58:49.394593 dbus-daemon[1493]: [system] SELinux support is enabled Sep 9 04:58:49.395677 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 9 04:58:49.400921 update_engine[1510]: I20250909 04:58:49.400869 1510 update_check_scheduler.cc:74] Next update check in 7m29s Sep 9 04:58:49.402421 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 04:58:49.412709 systemd[1]: Started update-engine.service - Update Engine. Sep 9 04:58:49.413898 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 9 04:58:49.413923 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 9 04:58:49.415228 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 04:58:49.415251 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 04:58:49.418861 systemd-logind[1504]: Watching system buttons on /dev/input/event0 (Power Button) Sep 9 04:58:49.419643 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Sep 9 04:58:49.420804 systemd-logind[1504]: New seat seat0. Sep 9 04:58:49.424334 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 04:58:49.435502 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 9 04:58:49.451346 extend-filesystems[1537]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 9 04:58:49.451346 extend-filesystems[1537]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 9 04:58:49.451346 extend-filesystems[1537]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 9 04:58:49.459123 extend-filesystems[1496]: Resized filesystem in /dev/vda9 Sep 9 04:58:49.457262 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 04:58:49.457475 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 9 04:58:49.471673 locksmithd[1546]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 04:58:49.472855 bash[1564]: Updated "/home/core/.ssh/authorized_keys" Sep 9 04:58:49.476896 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 04:58:49.478715 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
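The resize recorded above grows /dev/vda9 from 553472 to 1864699 blocks of 4k. As a quick sanity check on those numbers (a sketch of the arithmetic only, not a command from this boot sequence):

```shell
# Convert the ext4 block counts from the log (4 KiB blocks) to MiB.
old_blocks=553472
new_blocks=1864699
echo "before: $(( old_blocks * 4 / 1024 )) MiB"   # 2162 MiB, ~2.1 GiB
echo "after:  $(( new_blocks * 4 / 1024 )) MiB"   # 7283 MiB, ~7.1 GiB
```

This matches the online resize resize2fs reports: the filesystem was grown in place from roughly 2 GiB to roughly 7 GiB without unmounting /.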
Sep 9 04:58:49.550116 containerd[1534]: time="2025-09-09T04:58:49Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 9 04:58:49.551254 containerd[1534]: time="2025-09-09T04:58:49.551206320Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 9 04:58:49.562309 containerd[1534]: time="2025-09-09T04:58:49.562260840Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.64µs" Sep 9 04:58:49.562309 containerd[1534]: time="2025-09-09T04:58:49.562302360Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 9 04:58:49.562425 containerd[1534]: time="2025-09-09T04:58:49.562321160Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 9 04:58:49.562520 containerd[1534]: time="2025-09-09T04:58:49.562499200Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 9 04:58:49.562546 containerd[1534]: time="2025-09-09T04:58:49.562522880Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 9 04:58:49.562565 containerd[1534]: time="2025-09-09T04:58:49.562551200Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 04:58:49.562631 containerd[1534]: time="2025-09-09T04:58:49.562605840Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 04:58:49.562631 containerd[1534]: time="2025-09-09T04:58:49.562628880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 04:58:49.562893 
containerd[1534]: time="2025-09-09T04:58:49.562870840Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 04:58:49.562893 containerd[1534]: time="2025-09-09T04:58:49.562891120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 04:58:49.562935 containerd[1534]: time="2025-09-09T04:58:49.562903160Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 04:58:49.562935 containerd[1534]: time="2025-09-09T04:58:49.562912400Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 9 04:58:49.563498 containerd[1534]: time="2025-09-09T04:58:49.562984200Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 9 04:58:49.563544 containerd[1534]: time="2025-09-09T04:58:49.563517640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 04:58:49.564501 containerd[1534]: time="2025-09-09T04:58:49.563579920Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 04:58:49.564501 containerd[1534]: time="2025-09-09T04:58:49.563605360Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 9 04:58:49.564501 containerd[1534]: time="2025-09-09T04:58:49.563686920Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 9 04:58:49.564501 containerd[1534]: 
time="2025-09-09T04:58:49.564069800Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 9 04:58:49.564501 containerd[1534]: time="2025-09-09T04:58:49.564155080Z" level=info msg="metadata content store policy set" policy=shared Sep 9 04:58:49.567892 containerd[1534]: time="2025-09-09T04:58:49.567856440Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 9 04:58:49.567960 containerd[1534]: time="2025-09-09T04:58:49.567924760Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 9 04:58:49.567960 containerd[1534]: time="2025-09-09T04:58:49.567945600Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 9 04:58:49.568015 containerd[1534]: time="2025-09-09T04:58:49.567958880Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 9 04:58:49.568015 containerd[1534]: time="2025-09-09T04:58:49.567970920Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 9 04:58:49.568015 containerd[1534]: time="2025-09-09T04:58:49.567982640Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 9 04:58:49.568015 containerd[1534]: time="2025-09-09T04:58:49.567993520Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 9 04:58:49.568015 containerd[1534]: time="2025-09-09T04:58:49.568005080Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 9 04:58:49.568103 containerd[1534]: time="2025-09-09T04:58:49.568018000Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 9 04:58:49.568103 containerd[1534]: time="2025-09-09T04:58:49.568029360Z" level=info 
msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 9 04:58:49.568103 containerd[1534]: time="2025-09-09T04:58:49.568038720Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 9 04:58:49.568103 containerd[1534]: time="2025-09-09T04:58:49.568050920Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 9 04:58:49.568208 containerd[1534]: time="2025-09-09T04:58:49.568184240Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 9 04:58:49.568239 containerd[1534]: time="2025-09-09T04:58:49.568215040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 9 04:58:49.568239 containerd[1534]: time="2025-09-09T04:58:49.568230000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 9 04:58:49.568239 containerd[1534]: time="2025-09-09T04:58:49.568240600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 04:58:49.568239 containerd[1534]: time="2025-09-09T04:58:49.568250760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 04:58:49.568239 containerd[1534]: time="2025-09-09T04:58:49.568267280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 04:58:49.568239 containerd[1534]: time="2025-09-09T04:58:49.568277600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 04:58:49.568239 containerd[1534]: time="2025-09-09T04:58:49.568287920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 04:58:49.568239 containerd[1534]: time="2025-09-09T04:58:49.568299040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 
04:58:49.568449 containerd[1534]: time="2025-09-09T04:58:49.568309760Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 04:58:49.568449 containerd[1534]: time="2025-09-09T04:58:49.568319360Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 04:58:49.568537 containerd[1534]: time="2025-09-09T04:58:49.568520400Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 04:58:49.568573 containerd[1534]: time="2025-09-09T04:58:49.568540080Z" level=info msg="Start snapshots syncer" Sep 9 04:58:49.568573 containerd[1534]: time="2025-09-09T04:58:49.568565960Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 04:58:49.568818 containerd[1534]: time="2025-09-09T04:58:49.568781880Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,
\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 04:58:49.568929 containerd[1534]: time="2025-09-09T04:58:49.568832920Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 04:58:49.568929 containerd[1534]: time="2025-09-09T04:58:49.568899000Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 9 04:58:49.569042 containerd[1534]: time="2025-09-09T04:58:49.568995880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 9 04:58:49.569042 containerd[1534]: time="2025-09-09T04:58:49.569024080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 9 04:58:49.569042 containerd[1534]: time="2025-09-09T04:58:49.569035920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 9 04:58:49.569121 containerd[1534]: time="2025-09-09T04:58:49.569046000Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 9 04:58:49.569121 containerd[1534]: time="2025-09-09T04:58:49.569057400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 9 04:58:49.569121 containerd[1534]:
time="2025-09-09T04:58:49.569069320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 9 04:58:49.569121 containerd[1534]: time="2025-09-09T04:58:49.569080080Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 9 04:58:49.569121 containerd[1534]: time="2025-09-09T04:58:49.569104800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 9 04:58:49.569280 containerd[1534]: time="2025-09-09T04:58:49.569123840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 9 04:58:49.569280 containerd[1534]: time="2025-09-09T04:58:49.569134240Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 9 04:58:49.569280 containerd[1534]: time="2025-09-09T04:58:49.569160320Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 04:58:49.569280 containerd[1534]: time="2025-09-09T04:58:49.569185760Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 04:58:49.569280 containerd[1534]: time="2025-09-09T04:58:49.569197520Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 04:58:49.569280 containerd[1534]: time="2025-09-09T04:58:49.569209800Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 04:58:49.569280 containerd[1534]: time="2025-09-09T04:58:49.569218040Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 04:58:49.569280 containerd[1534]: time="2025-09-09T04:58:49.569230760Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 04:58:49.569280 containerd[1534]: time="2025-09-09T04:58:49.569240880Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 04:58:49.569434 containerd[1534]: time="2025-09-09T04:58:49.569317680Z" level=info msg="runtime interface created" Sep 9 04:58:49.569434 containerd[1534]: time="2025-09-09T04:58:49.569323080Z" level=info msg="created NRI interface" Sep 9 04:58:49.569434 containerd[1534]: time="2025-09-09T04:58:49.569330640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 04:58:49.569434 containerd[1534]: time="2025-09-09T04:58:49.569342880Z" level=info msg="Connect containerd service" Sep 9 04:58:49.569434 containerd[1534]: time="2025-09-09T04:58:49.569368920Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 04:58:49.570782 containerd[1534]: time="2025-09-09T04:58:49.570753520Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 04:58:49.637658 containerd[1534]: time="2025-09-09T04:58:49.637472600Z" level=info msg="Start subscribing containerd event" Sep 9 04:58:49.637658 containerd[1534]: time="2025-09-09T04:58:49.637550960Z" level=info msg="Start recovering state" Sep 9 04:58:49.638335 containerd[1534]: time="2025-09-09T04:58:49.638159520Z" level=info msg="Start event monitor" Sep 9 04:58:49.638335 containerd[1534]: time="2025-09-09T04:58:49.638195200Z" level=info msg="Start cni network conf syncer for default" Sep 9 04:58:49.638335 containerd[1534]: time="2025-09-09T04:58:49.638206400Z" level=info msg="Start streaming server" Sep 9 04:58:49.638335 containerd[1534]: time="2025-09-09T04:58:49.638215000Z" level=info msg="Registered namespace \"k8s.io\" with NRI" 
Sep 9 04:58:49.638335 containerd[1534]: time="2025-09-09T04:58:49.638221880Z" level=info msg="runtime interface starting up..." Sep 9 04:58:49.638335 containerd[1534]: time="2025-09-09T04:58:49.638227200Z" level=info msg="starting plugins..." Sep 9 04:58:49.638335 containerd[1534]: time="2025-09-09T04:58:49.638240680Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 04:58:49.638864 containerd[1534]: time="2025-09-09T04:58:49.638835960Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 04:58:49.638864 containerd[1534]: time="2025-09-09T04:58:49.638896640Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 9 04:58:49.639042 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 04:58:49.640598 containerd[1534]: time="2025-09-09T04:58:49.640272320Z" level=info msg="containerd successfully booted in 0.090483s" Sep 9 04:58:49.691065 tar[1523]: linux-arm64/LICENSE Sep 9 04:58:49.691258 tar[1523]: linux-arm64/README.md Sep 9 04:58:49.709519 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 04:58:50.446619 systemd-networkd[1452]: eth0: Gained IPv6LL Sep 9 04:58:50.451984 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 04:58:50.453477 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 04:58:50.455711 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 9 04:58:50.457847 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:58:50.459771 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 04:58:50.485680 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 9 04:58:50.489607 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 9 04:58:50.489830 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. 
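The "no network config found in /etc/cni/net.d" error logged during containerd startup is expected at this point: a CNI network config is normally installed later (for example by a Kubernetes network addon), not during first boot. For illustration only, a minimal bridge config of the kind that would satisfy the CRI plugin might look like this, saved as a `.conflist` file under /etc/cni/net.d; the network name, bridge name, and subnet here are hypothetical, not recovered from this system:

```json
{
  "cniVersion": "1.0.0",
  "name": "examplenet",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": {
        "type": "host-local",
        "subnet": "10.88.0.0/16"
      }
    }
  ]
}
```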
Sep 9 04:58:50.491346 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 04:58:50.969232 sshd_keygen[1511]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 04:58:50.992705 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 9 04:58:50.996285 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 04:58:51.000407 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:58:51.003798 (kubelet)[1626]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 04:58:51.004064 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 04:58:51.004261 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 04:58:51.006810 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 04:58:51.016342 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 04:58:51.019318 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 04:58:51.021876 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Sep 9 04:58:51.023006 systemd[1]: Reached target getty.target - Login Prompts. Sep 9 04:58:51.023950 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 9 04:58:51.025585 systemd[1]: Startup finished in 1.991s (kernel) + 5.109s (initrd) + 3.366s (userspace) = 10.467s. 
Sep 9 04:58:51.358443 kubelet[1626]: E0909 04:58:51.358334 1626 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 04:58:51.360791 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 04:58:51.360922 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 04:58:51.361241 systemd[1]: kubelet.service: Consumed 755ms CPU time, 257M memory peak. Sep 9 04:58:55.713315 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 9 04:58:55.714999 systemd[1]: Started sshd@0-10.0.0.72:22-10.0.0.1:41864.service - OpenSSH per-connection server daemon (10.0.0.1:41864). Sep 9 04:58:55.794978 sshd[1646]: Accepted publickey for core from 10.0.0.1 port 41864 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:58:55.796756 sshd-session[1646]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:55.806134 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 04:58:55.807497 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 04:58:55.813476 systemd-logind[1504]: New session 1 of user core. Sep 9 04:58:55.824877 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 04:58:55.829752 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 04:58:55.840358 (systemd)[1651]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 04:58:55.842806 systemd-logind[1504]: New session c1 of user core. Sep 9 04:58:55.948938 systemd[1651]: Queued start job for default target default.target. 
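The kubelet exit above is also expected on first boot: /var/lib/kubelet/config.yaml is normally written by `kubeadm init` or `kubeadm join`, neither of which has run yet, so systemd will keep restarting the unit until the node is initialized. For reference, a minimal sketch of the kind of KubeletConfiguration kubeadm generates (values illustrative, not recovered from this system):

```yaml
# Minimal KubeletConfiguration sketch; kubeadm normally generates this file.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd   # consistent with SystemdCgroup=true in the containerd runc options logged earlier
containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
```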
Sep 9 04:58:55.971405 systemd[1651]: Created slice app.slice - User Application Slice. Sep 9 04:58:55.971432 systemd[1651]: Reached target paths.target - Paths. Sep 9 04:58:55.971469 systemd[1651]: Reached target timers.target - Timers. Sep 9 04:58:55.972582 systemd[1651]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 04:58:55.981924 systemd[1651]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 04:58:55.981984 systemd[1651]: Reached target sockets.target - Sockets. Sep 9 04:58:55.982017 systemd[1651]: Reached target basic.target - Basic System. Sep 9 04:58:55.982043 systemd[1651]: Reached target default.target - Main User Target. Sep 9 04:58:55.982069 systemd[1651]: Startup finished in 133ms. Sep 9 04:58:55.982190 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 04:58:55.984218 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 04:58:56.054467 systemd[1]: Started sshd@1-10.0.0.72:22-10.0.0.1:41870.service - OpenSSH per-connection server daemon (10.0.0.1:41870). Sep 9 04:58:56.123089 sshd[1662]: Accepted publickey for core from 10.0.0.1 port 41870 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:58:56.124534 sshd-session[1662]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:56.129267 systemd-logind[1504]: New session 2 of user core. Sep 9 04:58:56.136691 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 9 04:58:56.192162 sshd[1665]: Connection closed by 10.0.0.1 port 41870 Sep 9 04:58:56.192470 sshd-session[1662]: pam_unix(sshd:session): session closed for user core Sep 9 04:58:56.202472 systemd[1]: sshd@1-10.0.0.72:22-10.0.0.1:41870.service: Deactivated successfully. Sep 9 04:58:56.205881 systemd[1]: session-2.scope: Deactivated successfully. Sep 9 04:58:56.207533 systemd-logind[1504]: Session 2 logged out. Waiting for processes to exit. 
Sep 9 04:58:56.208503 systemd[1]: Started sshd@2-10.0.0.72:22-10.0.0.1:41882.service - OpenSSH per-connection server daemon (10.0.0.1:41882). Sep 9 04:58:56.210072 systemd-logind[1504]: Removed session 2. Sep 9 04:58:56.259878 sshd[1671]: Accepted publickey for core from 10.0.0.1 port 41882 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:58:56.262017 sshd-session[1671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:56.268864 systemd-logind[1504]: New session 3 of user core. Sep 9 04:58:56.276649 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 9 04:58:56.324048 sshd[1675]: Connection closed by 10.0.0.1 port 41882 Sep 9 04:58:56.323918 sshd-session[1671]: pam_unix(sshd:session): session closed for user core Sep 9 04:58:56.333338 systemd[1]: sshd@2-10.0.0.72:22-10.0.0.1:41882.service: Deactivated successfully. Sep 9 04:58:56.334737 systemd[1]: session-3.scope: Deactivated successfully. Sep 9 04:58:56.335790 systemd-logind[1504]: Session 3 logged out. Waiting for processes to exit. Sep 9 04:58:56.337415 systemd[1]: Started sshd@3-10.0.0.72:22-10.0.0.1:41894.service - OpenSSH per-connection server daemon (10.0.0.1:41894). Sep 9 04:58:56.341251 systemd-logind[1504]: Removed session 3. Sep 9 04:58:56.389508 sshd[1681]: Accepted publickey for core from 10.0.0.1 port 41894 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:58:56.391170 sshd-session[1681]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:56.395477 systemd-logind[1504]: New session 4 of user core. Sep 9 04:58:56.407632 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 9 04:58:56.460302 sshd[1684]: Connection closed by 10.0.0.1 port 41894 Sep 9 04:58:56.460769 sshd-session[1681]: pam_unix(sshd:session): session closed for user core Sep 9 04:58:56.468353 systemd[1]: sshd@3-10.0.0.72:22-10.0.0.1:41894.service: Deactivated successfully. 
Sep 9 04:58:56.469718 systemd[1]: session-4.scope: Deactivated successfully. Sep 9 04:58:56.470347 systemd-logind[1504]: Session 4 logged out. Waiting for processes to exit. Sep 9 04:58:56.472165 systemd[1]: Started sshd@4-10.0.0.72:22-10.0.0.1:41902.service - OpenSSH per-connection server daemon (10.0.0.1:41902). Sep 9 04:58:56.479571 systemd-logind[1504]: Removed session 4. Sep 9 04:58:56.539294 sshd[1690]: Accepted publickey for core from 10.0.0.1 port 41902 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:58:56.539766 sshd-session[1690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:56.543913 systemd-logind[1504]: New session 5 of user core. Sep 9 04:58:56.549623 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 9 04:58:56.604867 sudo[1694]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 04:58:56.605134 sudo[1694]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 04:58:56.618271 sudo[1694]: pam_unix(sudo:session): session closed for user root Sep 9 04:58:56.620760 sshd[1693]: Connection closed by 10.0.0.1 port 41902 Sep 9 04:58:56.620584 sshd-session[1690]: pam_unix(sshd:session): session closed for user core Sep 9 04:58:56.627305 systemd[1]: sshd@4-10.0.0.72:22-10.0.0.1:41902.service: Deactivated successfully. Sep 9 04:58:56.629721 systemd[1]: session-5.scope: Deactivated successfully. Sep 9 04:58:56.630824 systemd-logind[1504]: Session 5 logged out. Waiting for processes to exit. Sep 9 04:58:56.632745 systemd[1]: Started sshd@5-10.0.0.72:22-10.0.0.1:41908.service - OpenSSH per-connection server daemon (10.0.0.1:41908). Sep 9 04:58:56.634083 systemd-logind[1504]: Removed session 5. 
Sep 9 04:58:56.692749 sshd[1700]: Accepted publickey for core from 10.0.0.1 port 41908 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:58:56.696170 sshd-session[1700]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:56.700734 systemd-logind[1504]: New session 6 of user core. Sep 9 04:58:56.706647 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 9 04:58:56.758732 sudo[1705]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 9 04:58:56.759292 sudo[1705]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 04:58:56.840058 sudo[1705]: pam_unix(sudo:session): session closed for user root Sep 9 04:58:56.845395 sudo[1704]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 9 04:58:56.845671 sudo[1704]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 04:58:56.857545 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 04:58:56.897429 augenrules[1727]: No rules Sep 9 04:58:56.898710 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 04:58:56.899564 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 04:58:56.900736 sudo[1704]: pam_unix(sudo:session): session closed for user root Sep 9 04:58:56.903042 sshd[1703]: Connection closed by 10.0.0.1 port 41908 Sep 9 04:58:56.902615 sshd-session[1700]: pam_unix(sshd:session): session closed for user core Sep 9 04:58:56.918580 systemd[1]: sshd@5-10.0.0.72:22-10.0.0.1:41908.service: Deactivated successfully. Sep 9 04:58:56.920052 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 04:58:56.920700 systemd-logind[1504]: Session 6 logged out. Waiting for processes to exit. Sep 9 04:58:56.922421 systemd[1]: Started sshd@6-10.0.0.72:22-10.0.0.1:41920.service - OpenSSH per-connection server daemon (10.0.0.1:41920). 
Sep 9 04:58:56.923790 systemd-logind[1504]: Removed session 6. Sep 9 04:58:56.983720 sshd[1736]: Accepted publickey for core from 10.0.0.1 port 41920 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:58:56.984888 sshd-session[1736]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:58:56.989059 systemd-logind[1504]: New session 7 of user core. Sep 9 04:58:57.004631 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 9 04:58:57.055683 sudo[1740]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 9 04:58:57.055929 sudo[1740]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 04:58:57.352598 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 9 04:58:57.365770 (dockerd)[1761]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 9 04:58:57.561053 dockerd[1761]: time="2025-09-09T04:58:57.560985988Z" level=info msg="Starting up" Sep 9 04:58:57.561976 dockerd[1761]: time="2025-09-09T04:58:57.561950426Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 9 04:58:57.572258 dockerd[1761]: time="2025-09-09T04:58:57.572224801Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 9 04:58:57.604002 dockerd[1761]: time="2025-09-09T04:58:57.603683730Z" level=info msg="Loading containers: start." Sep 9 04:58:57.611511 kernel: Initializing XFRM netlink socket Sep 9 04:58:57.802576 systemd-networkd[1452]: docker0: Link UP Sep 9 04:58:57.805712 dockerd[1761]: time="2025-09-09T04:58:57.805674151Z" level=info msg="Loading containers: done." 
Sep 9 04:58:57.817705 dockerd[1761]: time="2025-09-09T04:58:57.817667590Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 9 04:58:57.817832 dockerd[1761]: time="2025-09-09T04:58:57.817751731Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 9 04:58:57.817857 dockerd[1761]: time="2025-09-09T04:58:57.817829304Z" level=info msg="Initializing buildkit" Sep 9 04:58:57.839074 dockerd[1761]: time="2025-09-09T04:58:57.839032800Z" level=info msg="Completed buildkit initialization" Sep 9 04:58:57.843653 dockerd[1761]: time="2025-09-09T04:58:57.843623752Z" level=info msg="Daemon has completed initialization" Sep 9 04:58:57.844077 dockerd[1761]: time="2025-09-09T04:58:57.843791033Z" level=info msg="API listen on /run/docker.sock" Sep 9 04:58:57.843851 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 9 04:58:58.432468 containerd[1534]: time="2025-09-09T04:58:58.431897079Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\"" Sep 9 04:58:58.985515 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4106120565.mount: Deactivated successfully. 
Sep 9 04:59:00.115610 containerd[1534]: time="2025-09-09T04:59:00.115549068Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:00.116650 containerd[1534]: time="2025-09-09T04:59:00.116615123Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=25652443" Sep 9 04:59:00.118011 containerd[1534]: time="2025-09-09T04:59:00.117957971Z" level=info msg="ImageCreate event name:\"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:00.121927 containerd[1534]: time="2025-09-09T04:59:00.121877529Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:00.123596 containerd[1534]: time="2025-09-09T04:59:00.123548943Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"25649241\" in 1.691614864s" Sep 9 04:59:00.123696 containerd[1534]: time="2025-09-09T04:59:00.123680273Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\"" Sep 9 04:59:00.125036 containerd[1534]: time="2025-09-09T04:59:00.125010669Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\"" Sep 9 04:59:01.229054 containerd[1534]: time="2025-09-09T04:59:01.228977421Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:01.229771 containerd[1534]: time="2025-09-09T04:59:01.229735084Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=22460311" Sep 9 04:59:01.230612 containerd[1534]: time="2025-09-09T04:59:01.230560890Z" level=info msg="ImageCreate event name:\"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:01.233501 containerd[1534]: time="2025-09-09T04:59:01.233409212Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:01.235500 containerd[1534]: time="2025-09-09T04:59:01.235413591Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"23997423\" in 1.110370291s" Sep 9 04:59:01.235500 containerd[1534]: time="2025-09-09T04:59:01.235454790Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\"" Sep 9 04:59:01.236122 containerd[1534]: time="2025-09-09T04:59:01.236048380Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\"" Sep 9 04:59:01.523871 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 9 04:59:01.525355 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:59:01.647911 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 9 04:59:01.651608 (kubelet)[2041]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 04:59:01.689415 kubelet[2041]: E0909 04:59:01.689337 2041 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 04:59:01.692215 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 04:59:01.692348 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 04:59:01.693629 systemd[1]: kubelet.service: Consumed 148ms CPU time, 107.9M memory peak. Sep 9 04:59:02.600271 containerd[1534]: time="2025-09-09T04:59:02.600198540Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:02.600878 containerd[1534]: time="2025-09-09T04:59:02.600838017Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=17125905" Sep 9 04:59:02.602111 containerd[1534]: time="2025-09-09T04:59:02.602072330Z" level=info msg="ImageCreate event name:\"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:02.604072 containerd[1534]: time="2025-09-09T04:59:02.604023187Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:02.605146 containerd[1534]: time="2025-09-09T04:59:02.605034827Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id 
\"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"18663035\" in 1.368949012s" Sep 9 04:59:02.605146 containerd[1534]: time="2025-09-09T04:59:02.605067855Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\"" Sep 9 04:59:02.605878 containerd[1534]: time="2025-09-09T04:59:02.605856862Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\"" Sep 9 04:59:03.464972 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1316713744.mount: Deactivated successfully. Sep 9 04:59:03.674131 containerd[1534]: time="2025-09-09T04:59:03.674075229Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:03.675301 containerd[1534]: time="2025-09-09T04:59:03.675270804Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=26916097" Sep 9 04:59:03.676211 containerd[1534]: time="2025-09-09T04:59:03.676170978Z" level=info msg="ImageCreate event name:\"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:03.678473 containerd[1534]: time="2025-09-09T04:59:03.678437066Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:03.679508 containerd[1534]: time="2025-09-09T04:59:03.679466745Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\", repo tag 
\"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"26915114\" in 1.073576775s" Sep 9 04:59:03.679546 containerd[1534]: time="2025-09-09T04:59:03.679512022Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\"" Sep 9 04:59:03.679948 containerd[1534]: time="2025-09-09T04:59:03.679916192Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 9 04:59:04.370573 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3434020954.mount: Deactivated successfully. Sep 9 04:59:05.232346 containerd[1534]: time="2025-09-09T04:59:05.232281749Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:05.235215 containerd[1534]: time="2025-09-09T04:59:05.235172341Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951624" Sep 9 04:59:05.236521 containerd[1534]: time="2025-09-09T04:59:05.236475074Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:05.279684 containerd[1534]: time="2025-09-09T04:59:05.279640973Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:05.281520 containerd[1534]: time="2025-09-09T04:59:05.281462199Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest 
\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.601513621s" Sep 9 04:59:05.281520 containerd[1534]: time="2025-09-09T04:59:05.281512475Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Sep 9 04:59:05.282014 containerd[1534]: time="2025-09-09T04:59:05.281981531Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 9 04:59:05.690357 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2810395698.mount: Deactivated successfully. Sep 9 04:59:05.695526 containerd[1534]: time="2025-09-09T04:59:05.695469896Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 04:59:05.696628 containerd[1534]: time="2025-09-09T04:59:05.696602148Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705" Sep 9 04:59:05.697410 containerd[1534]: time="2025-09-09T04:59:05.697387831Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 04:59:05.699397 containerd[1534]: time="2025-09-09T04:59:05.699345634Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 04:59:05.699979 containerd[1534]: time="2025-09-09T04:59:05.699952149Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag 
\"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 417.935874ms" Sep 9 04:59:05.700012 containerd[1534]: time="2025-09-09T04:59:05.699982771Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Sep 9 04:59:05.700534 containerd[1534]: time="2025-09-09T04:59:05.700514112Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 9 04:59:06.158292 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3128393080.mount: Deactivated successfully. Sep 9 04:59:08.088943 containerd[1534]: time="2025-09-09T04:59:08.088888572Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:08.089615 containerd[1534]: time="2025-09-09T04:59:08.089580660Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66537163" Sep 9 04:59:08.090531 containerd[1534]: time="2025-09-09T04:59:08.090507648Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:08.093239 containerd[1534]: time="2025-09-09T04:59:08.093208924Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:08.095156 containerd[1534]: time="2025-09-09T04:59:08.095113048Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size 
\"66535646\" in 2.394568156s" Sep 9 04:59:08.095191 containerd[1534]: time="2025-09-09T04:59:08.095156794Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\"" Sep 9 04:59:11.773870 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 9 04:59:11.775223 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:59:11.936271 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:59:11.947734 (kubelet)[2203]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 04:59:11.982361 kubelet[2203]: E0909 04:59:11.982298 2203 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 04:59:11.984804 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 04:59:11.984947 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 04:59:11.985229 systemd[1]: kubelet.service: Consumed 134ms CPU time, 107.3M memory peak. Sep 9 04:59:14.017415 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:59:14.017556 systemd[1]: kubelet.service: Consumed 134ms CPU time, 107.3M memory peak. Sep 9 04:59:14.019316 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:59:14.042443 systemd[1]: Reload requested from client PID 2217 ('systemctl') (unit session-7.scope)... Sep 9 04:59:14.042460 systemd[1]: Reloading... Sep 9 04:59:14.109525 zram_generator::config[2264]: No configuration found. Sep 9 04:59:14.394640 systemd[1]: Reloading finished in 351 ms. 
Sep 9 04:59:14.447239 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 9 04:59:14.447334 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 9 04:59:14.447615 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:59:14.447671 systemd[1]: kubelet.service: Consumed 91ms CPU time, 95.1M memory peak. Sep 9 04:59:14.451311 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:59:14.557221 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:59:14.561364 (kubelet)[2306]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 04:59:14.594283 kubelet[2306]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 04:59:14.594283 kubelet[2306]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 9 04:59:14.594283 kubelet[2306]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 9 04:59:14.594658 kubelet[2306]: I0909 04:59:14.594340 2306 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 04:59:15.318886 kubelet[2306]: I0909 04:59:15.318838 2306 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 9 04:59:15.318886 kubelet[2306]: I0909 04:59:15.318870 2306 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 04:59:15.319125 kubelet[2306]: I0909 04:59:15.319109 2306 server.go:934] "Client rotation is on, will bootstrap in background" Sep 9 04:59:15.337742 kubelet[2306]: E0909 04:59:15.337697 2306 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.72:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError" Sep 9 04:59:15.338131 kubelet[2306]: I0909 04:59:15.338095 2306 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 04:59:15.345436 kubelet[2306]: I0909 04:59:15.345321 2306 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 04:59:15.350500 kubelet[2306]: I0909 04:59:15.350038 2306 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 04:59:15.350500 kubelet[2306]: I0909 04:59:15.350347 2306 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 9 04:59:15.350500 kubelet[2306]: I0909 04:59:15.350458 2306 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 04:59:15.350777 kubelet[2306]: I0909 04:59:15.350505 2306 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOpti
ons":null,"CgroupVersion":2} Sep 9 04:59:15.350869 kubelet[2306]: I0909 04:59:15.350840 2306 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 04:59:15.350869 kubelet[2306]: I0909 04:59:15.350850 2306 container_manager_linux.go:300] "Creating device plugin manager" Sep 9 04:59:15.351134 kubelet[2306]: I0909 04:59:15.351116 2306 state_mem.go:36] "Initialized new in-memory state store" Sep 9 04:59:15.353373 kubelet[2306]: I0909 04:59:15.353347 2306 kubelet.go:408] "Attempting to sync node with API server" Sep 9 04:59:15.353459 kubelet[2306]: I0909 04:59:15.353448 2306 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 04:59:15.353560 kubelet[2306]: I0909 04:59:15.353551 2306 kubelet.go:314] "Adding apiserver pod source" Sep 9 04:59:15.353733 kubelet[2306]: I0909 04:59:15.353723 2306 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 04:59:15.358182 kubelet[2306]: W0909 04:59:15.358141 2306 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.72:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused Sep 9 04:59:15.358291 kubelet[2306]: E0909 04:59:15.358272 2306 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.72:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError" Sep 9 04:59:15.358382 kubelet[2306]: W0909 04:59:15.358200 2306 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.72:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused Sep 9 04:59:15.358448 kubelet[2306]: E0909 
04:59:15.358435 2306 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.72:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:59:15.358532 kubelet[2306]: I0909 04:59:15.358497 2306 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 04:59:15.359270 kubelet[2306]: I0909 04:59:15.359251 2306 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 04:59:15.359578 kubelet[2306]: W0909 04:59:15.359565 2306 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 04:59:15.362089 kubelet[2306]: I0909 04:59:15.362075 2306 server.go:1274] "Started kubelet"
Sep 9 04:59:15.362354 kubelet[2306]: I0909 04:59:15.362301 2306 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 04:59:15.362510 kubelet[2306]: I0909 04:59:15.362205 2306 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 04:59:15.362757 kubelet[2306]: I0909 04:59:15.362737 2306 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 04:59:15.363164 kubelet[2306]: E0909 04:59:15.363149 2306 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 04:59:15.363381 kubelet[2306]: I0909 04:59:15.363349 2306 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 04:59:15.363581 kubelet[2306]: I0909 04:59:15.363561 2306 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 04:59:15.363769 kubelet[2306]: I0909 04:59:15.363748 2306 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 9 04:59:15.363831 kubelet[2306]: I0909 04:59:15.363817 2306 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 9 04:59:15.363964 kubelet[2306]: I0909 04:59:15.363946 2306 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 04:59:15.364380 kubelet[2306]: W0909 04:59:15.364325 2306 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.72:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 04:59:15.364419 kubelet[2306]: E0909 04:59:15.364387 2306 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.72:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:59:15.364419 kubelet[2306]: I0909 04:59:15.363603 2306 server.go:449] "Adding debug handlers to kubelet server"
Sep 9 04:59:15.365503 kubelet[2306]: E0909 04:59:15.364326 2306 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.72:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.72:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1863847dc7af3168 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-09 04:59:15.360551272 +0000 UTC m=+0.796196205,LastTimestamp:2025-09-09 04:59:15.360551272 +0000 UTC m=+0.796196205,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 9 04:59:15.365503 kubelet[2306]: I0909 04:59:15.365423 2306 factory.go:221] Registration of the systemd container factory successfully
Sep 9 04:59:15.365622 kubelet[2306]: I0909 04:59:15.365549 2306 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 04:59:15.365764 kubelet[2306]: E0909 04:59:15.365741 2306 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 04:59:15.366111 kubelet[2306]: E0909 04:59:15.366079 2306 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.72:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.72:6443: connect: connection refused" interval="200ms"
Sep 9 04:59:15.367045 kubelet[2306]: I0909 04:59:15.367027 2306 factory.go:221] Registration of the containerd container factory successfully
Sep 9 04:59:15.380527 kubelet[2306]: I0909 04:59:15.380451 2306 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 04:59:15.381741 kubelet[2306]: I0909 04:59:15.381508 2306 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 04:59:15.381741 kubelet[2306]: I0909 04:59:15.381529 2306 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 9 04:59:15.381741 kubelet[2306]: I0909 04:59:15.381548 2306 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 9 04:59:15.381741 kubelet[2306]: E0909 04:59:15.381596 2306 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 04:59:15.381933 kubelet[2306]: I0909 04:59:15.381914 2306 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 9 04:59:15.381980 kubelet[2306]: I0909 04:59:15.381970 2306 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 9 04:59:15.382038 kubelet[2306]: I0909 04:59:15.382030 2306 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:59:15.384280 kubelet[2306]: W0909 04:59:15.384227 2306 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.72:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.72:6443: connect: connection refused
Sep 9 04:59:15.384354 kubelet[2306]: E0909 04:59:15.384295 2306 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.72:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.72:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:59:15.466041 kubelet[2306]: E0909 04:59:15.465992 2306 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 04:59:15.482234 kubelet[2306]: E0909 04:59:15.482189 2306 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 9 04:59:15.508276 kubelet[2306]: I0909 04:59:15.508168 2306 policy_none.go:49] "None policy: Start"
Sep 9 04:59:15.509068 kubelet[2306]: I0909 04:59:15.509046 2306 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 9 04:59:15.509068 kubelet[2306]: I0909 04:59:15.509074 2306 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 04:59:15.515773 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 9 04:59:15.534523 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 9 04:59:15.537717 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 9 04:59:15.554404 kubelet[2306]: I0909 04:59:15.554370 2306 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 04:59:15.554795 kubelet[2306]: I0909 04:59:15.554781 2306 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 04:59:15.555015 kubelet[2306]: I0909 04:59:15.554797 2306 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 04:59:15.555076 kubelet[2306]: I0909 04:59:15.555021 2306 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 04:59:15.557332 kubelet[2306]: E0909 04:59:15.557310 2306 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Sep 9 04:59:15.566839 kubelet[2306]: E0909 04:59:15.566784 2306 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.72:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.72:6443: connect: connection refused" interval="400ms"
Sep 9 04:59:15.657135 kubelet[2306]: I0909 04:59:15.656959 2306 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 9 04:59:15.658098 kubelet[2306]: E0909 04:59:15.658065 2306 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.72:6443/api/v1/nodes\": dial tcp 10.0.0.72:6443: connect: connection refused" node="localhost"
Sep 9 04:59:15.693088 systemd[1]: Created slice kubepods-burstable-podfec3f691a145cb26ff55e4af388500b7.slice - libcontainer container kubepods-burstable-podfec3f691a145cb26ff55e4af388500b7.slice.
Sep 9 04:59:15.720375 systemd[1]: Created slice kubepods-burstable-pod5dc878868de11c6196259ae42039f4ff.slice - libcontainer container kubepods-burstable-pod5dc878868de11c6196259ae42039f4ff.slice.
Sep 9 04:59:15.743936 systemd[1]: Created slice kubepods-burstable-podeee96a5d2f569f8950a31211aa4049a7.slice - libcontainer container kubepods-burstable-podeee96a5d2f569f8950a31211aa4049a7.slice.
Sep 9 04:59:15.771750 kubelet[2306]: I0909 04:59:15.771705 2306 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:59:15.771750 kubelet[2306]: I0909 04:59:15.771747 2306 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dc878868de11c6196259ae42039f4ff-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"5dc878868de11c6196259ae42039f4ff\") " pod="kube-system/kube-scheduler-localhost"
Sep 9 04:59:15.771866 kubelet[2306]: I0909 04:59:15.771767 2306 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:59:15.771866 kubelet[2306]: I0909 04:59:15.771783 2306 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:59:15.771866 kubelet[2306]: I0909 04:59:15.771799 2306 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:59:15.771866 kubelet[2306]: I0909 04:59:15.771814 2306 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/eee96a5d2f569f8950a31211aa4049a7-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"eee96a5d2f569f8950a31211aa4049a7\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:59:15.771950 kubelet[2306]: I0909 04:59:15.771865 2306 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/eee96a5d2f569f8950a31211aa4049a7-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"eee96a5d2f569f8950a31211aa4049a7\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:59:15.771950 kubelet[2306]: I0909 04:59:15.771898 2306 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/eee96a5d2f569f8950a31211aa4049a7-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"eee96a5d2f569f8950a31211aa4049a7\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:59:15.771950 kubelet[2306]: I0909 04:59:15.771920 2306 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:59:15.860278 kubelet[2306]: I0909 04:59:15.860243 2306 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 9 04:59:15.860617 kubelet[2306]: E0909 04:59:15.860570 2306 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.72:6443/api/v1/nodes\": dial tcp 10.0.0.72:6443: connect: connection refused" node="localhost"
Sep 9 04:59:15.967841 kubelet[2306]: E0909 04:59:15.967743 2306 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.72:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.72:6443: connect: connection refused" interval="800ms"
Sep 9 04:59:16.018996 containerd[1534]: time="2025-09-09T04:59:16.018958152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fec3f691a145cb26ff55e4af388500b7,Namespace:kube-system,Attempt:0,}"
Sep 9 04:59:16.037423 containerd[1534]: time="2025-09-09T04:59:16.037376645Z" level=info msg="connecting to shim 4bebed466bdaaefd12e66812a7f9572df9ff7b056b178ef309106d91c34270fe" address="unix:///run/containerd/s/6e06dc1bdb536c4496b30c76a7219a91c054484e755cd7ee6f932c8e115c6813" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:59:16.042658 containerd[1534]: time="2025-09-09T04:59:16.042557072Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:5dc878868de11c6196259ae42039f4ff,Namespace:kube-system,Attempt:0,}"
Sep 9 04:59:16.048726 containerd[1534]: time="2025-09-09T04:59:16.048669707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:eee96a5d2f569f8950a31211aa4049a7,Namespace:kube-system,Attempt:0,}"
Sep 9 04:59:16.062672 systemd[1]: Started cri-containerd-4bebed466bdaaefd12e66812a7f9572df9ff7b056b178ef309106d91c34270fe.scope - libcontainer container 4bebed466bdaaefd12e66812a7f9572df9ff7b056b178ef309106d91c34270fe.
Sep 9 04:59:16.067679 containerd[1534]: time="2025-09-09T04:59:16.067543480Z" level=info msg="connecting to shim 5722dc5c64448d9b622421f39ca4d8714e3f54bb51f107babb50a331e8ae7fb0" address="unix:///run/containerd/s/d82d9cac73508a503b46d6323c64027957193b104fe4a7472252f49031aeea3e" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:59:16.085049 containerd[1534]: time="2025-09-09T04:59:16.084975826Z" level=info msg="connecting to shim cb819cc92d3f881ff74efeb69f3ba2d3d09e7dbbf81c072e87bb482dca72cf14" address="unix:///run/containerd/s/d5aac4cf405358bde23478575301abf236731b84473c2f4f7a795dec58a555e3" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:59:16.102678 systemd[1]: Started cri-containerd-5722dc5c64448d9b622421f39ca4d8714e3f54bb51f107babb50a331e8ae7fb0.scope - libcontainer container 5722dc5c64448d9b622421f39ca4d8714e3f54bb51f107babb50a331e8ae7fb0.
Sep 9 04:59:16.105934 systemd[1]: Started cri-containerd-cb819cc92d3f881ff74efeb69f3ba2d3d09e7dbbf81c072e87bb482dca72cf14.scope - libcontainer container cb819cc92d3f881ff74efeb69f3ba2d3d09e7dbbf81c072e87bb482dca72cf14.
Sep 9 04:59:16.108742 containerd[1534]: time="2025-09-09T04:59:16.108673180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fec3f691a145cb26ff55e4af388500b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"4bebed466bdaaefd12e66812a7f9572df9ff7b056b178ef309106d91c34270fe\""
Sep 9 04:59:16.113109 containerd[1534]: time="2025-09-09T04:59:16.112127158Z" level=info msg="CreateContainer within sandbox \"4bebed466bdaaefd12e66812a7f9572df9ff7b056b178ef309106d91c34270fe\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Sep 9 04:59:16.122305 containerd[1534]: time="2025-09-09T04:59:16.122258490Z" level=info msg="Container 45915b6b06f49eeb0360df2e9bcb3f8207a1636f48b2a6df362ce820c14e0623: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:59:16.131304 containerd[1534]: time="2025-09-09T04:59:16.131141621Z" level=info msg="CreateContainer within sandbox \"4bebed466bdaaefd12e66812a7f9572df9ff7b056b178ef309106d91c34270fe\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"45915b6b06f49eeb0360df2e9bcb3f8207a1636f48b2a6df362ce820c14e0623\""
Sep 9 04:59:16.131953 containerd[1534]: time="2025-09-09T04:59:16.131926818Z" level=info msg="StartContainer for \"45915b6b06f49eeb0360df2e9bcb3f8207a1636f48b2a6df362ce820c14e0623\""
Sep 9 04:59:16.134689 containerd[1534]: time="2025-09-09T04:59:16.134560386Z" level=info msg="connecting to shim 45915b6b06f49eeb0360df2e9bcb3f8207a1636f48b2a6df362ce820c14e0623" address="unix:///run/containerd/s/6e06dc1bdb536c4496b30c76a7219a91c054484e755cd7ee6f932c8e115c6813" protocol=ttrpc version=3
Sep 9 04:59:16.146589 containerd[1534]: time="2025-09-09T04:59:16.146530846Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:5dc878868de11c6196259ae42039f4ff,Namespace:kube-system,Attempt:0,} returns sandbox id \"5722dc5c64448d9b622421f39ca4d8714e3f54bb51f107babb50a331e8ae7fb0\""
Sep 9 04:59:16.150205 containerd[1534]: time="2025-09-09T04:59:16.150157045Z" level=info msg="CreateContainer within sandbox \"5722dc5c64448d9b622421f39ca4d8714e3f54bb51f107babb50a331e8ae7fb0\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Sep 9 04:59:16.152190 containerd[1534]: time="2025-09-09T04:59:16.152081003Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:eee96a5d2f569f8950a31211aa4049a7,Namespace:kube-system,Attempt:0,} returns sandbox id \"cb819cc92d3f881ff74efeb69f3ba2d3d09e7dbbf81c072e87bb482dca72cf14\""
Sep 9 04:59:16.156223 containerd[1534]: time="2025-09-09T04:59:16.156191732Z" level=info msg="CreateContainer within sandbox \"cb819cc92d3f881ff74efeb69f3ba2d3d09e7dbbf81c072e87bb482dca72cf14\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Sep 9 04:59:16.156709 systemd[1]: Started cri-containerd-45915b6b06f49eeb0360df2e9bcb3f8207a1636f48b2a6df362ce820c14e0623.scope - libcontainer container 45915b6b06f49eeb0360df2e9bcb3f8207a1636f48b2a6df362ce820c14e0623.
Sep 9 04:59:16.159314 containerd[1534]: time="2025-09-09T04:59:16.158703818Z" level=info msg="Container 1329e16e0fb0edb51d46152b061382b7312e59d46c42222aaf7f52e88a425896: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:59:16.165123 containerd[1534]: time="2025-09-09T04:59:16.165075144Z" level=info msg="CreateContainer within sandbox \"5722dc5c64448d9b622421f39ca4d8714e3f54bb51f107babb50a331e8ae7fb0\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"1329e16e0fb0edb51d46152b061382b7312e59d46c42222aaf7f52e88a425896\""
Sep 9 04:59:16.165597 containerd[1534]: time="2025-09-09T04:59:16.165570239Z" level=info msg="StartContainer for \"1329e16e0fb0edb51d46152b061382b7312e59d46c42222aaf7f52e88a425896\""
Sep 9 04:59:16.165698 containerd[1534]: time="2025-09-09T04:59:16.165583963Z" level=info msg="Container f3e96fe9ca213340c9a1942d138b98e919aa390800b14c72e944abd796a71d1a: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:59:16.166638 containerd[1534]: time="2025-09-09T04:59:16.166601322Z" level=info msg="connecting to shim 1329e16e0fb0edb51d46152b061382b7312e59d46c42222aaf7f52e88a425896" address="unix:///run/containerd/s/d82d9cac73508a503b46d6323c64027957193b104fe4a7472252f49031aeea3e" protocol=ttrpc version=3
Sep 9 04:59:16.174251 containerd[1534]: time="2025-09-09T04:59:16.174128536Z" level=info msg="CreateContainer within sandbox \"cb819cc92d3f881ff74efeb69f3ba2d3d09e7dbbf81c072e87bb482dca72cf14\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f3e96fe9ca213340c9a1942d138b98e919aa390800b14c72e944abd796a71d1a\""
Sep 9 04:59:16.175619 containerd[1534]: time="2025-09-09T04:59:16.175586410Z" level=info msg="StartContainer for \"f3e96fe9ca213340c9a1942d138b98e919aa390800b14c72e944abd796a71d1a\""
Sep 9 04:59:16.178559 containerd[1534]: time="2025-09-09T04:59:16.178460983Z" level=info msg="connecting to shim f3e96fe9ca213340c9a1942d138b98e919aa390800b14c72e944abd796a71d1a" address="unix:///run/containerd/s/d5aac4cf405358bde23478575301abf236731b84473c2f4f7a795dec58a555e3" protocol=ttrpc version=3
Sep 9 04:59:16.187944 systemd[1]: Started cri-containerd-1329e16e0fb0edb51d46152b061382b7312e59d46c42222aaf7f52e88a425896.scope - libcontainer container 1329e16e0fb0edb51d46152b061382b7312e59d46c42222aaf7f52e88a425896.
Sep 9 04:59:16.207910 containerd[1534]: time="2025-09-09T04:59:16.207424514Z" level=info msg="StartContainer for \"45915b6b06f49eeb0360df2e9bcb3f8207a1636f48b2a6df362ce820c14e0623\" returns successfully"
Sep 9 04:59:16.207691 systemd[1]: Started cri-containerd-f3e96fe9ca213340c9a1942d138b98e919aa390800b14c72e944abd796a71d1a.scope - libcontainer container f3e96fe9ca213340c9a1942d138b98e919aa390800b14c72e944abd796a71d1a.
Sep 9 04:59:16.236730 containerd[1534]: time="2025-09-09T04:59:16.236061489Z" level=info msg="StartContainer for \"1329e16e0fb0edb51d46152b061382b7312e59d46c42222aaf7f52e88a425896\" returns successfully"
Sep 9 04:59:16.259988 containerd[1534]: time="2025-09-09T04:59:16.259939067Z" level=info msg="StartContainer for \"f3e96fe9ca213340c9a1942d138b98e919aa390800b14c72e944abd796a71d1a\" returns successfully"
Sep 9 04:59:16.261672 kubelet[2306]: I0909 04:59:16.261639 2306 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 9 04:59:16.262066 kubelet[2306]: E0909 04:59:16.262035 2306 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.72:6443/api/v1/nodes\": dial tcp 10.0.0.72:6443: connect: connection refused" node="localhost"
Sep 9 04:59:17.063835 kubelet[2306]: I0909 04:59:17.063806 2306 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 9 04:59:17.837444 kubelet[2306]: E0909 04:59:17.837403 2306 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Sep 9 04:59:17.909428 kubelet[2306]: I0909 04:59:17.909273 2306 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Sep 9 04:59:17.909428 kubelet[2306]: E0909 04:59:17.909319 2306 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found"
Sep 9 04:59:18.358046 kubelet[2306]: I0909 04:59:18.358008 2306 apiserver.go:52] "Watching apiserver"
Sep 9 04:59:18.364677 kubelet[2306]: I0909 04:59:18.364646 2306 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 9 04:59:19.995696 systemd[1]: Reload requested from client PID 2580 ('systemctl') (unit session-7.scope)...
Sep 9 04:59:19.995710 systemd[1]: Reloading...
Sep 9 04:59:20.052520 zram_generator::config[2623]: No configuration found.
Sep 9 04:59:20.220709 systemd[1]: Reloading finished in 224 ms.
Sep 9 04:59:20.249872 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:59:20.261730 systemd[1]: kubelet.service: Deactivated successfully.
Sep 9 04:59:20.261949 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:59:20.261993 systemd[1]: kubelet.service: Consumed 1.170s CPU time, 127.9M memory peak.
Sep 9 04:59:20.264656 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:59:20.413176 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:59:20.417043 (kubelet)[2665]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 04:59:20.467731 kubelet[2665]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:59:20.467731 kubelet[2665]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 9 04:59:20.467731 kubelet[2665]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:59:20.468073 kubelet[2665]: I0909 04:59:20.467766 2665 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 04:59:20.473752 kubelet[2665]: I0909 04:59:20.473719 2665 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 9 04:59:20.474533 kubelet[2665]: I0909 04:59:20.473862 2665 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 04:59:20.474533 kubelet[2665]: I0909 04:59:20.474085 2665 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 9 04:59:20.475423 kubelet[2665]: I0909 04:59:20.475392 2665 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 9 04:59:20.477274 kubelet[2665]: I0909 04:59:20.477251 2665 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 04:59:20.480857 kubelet[2665]: I0909 04:59:20.480837 2665 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 04:59:20.483700 kubelet[2665]: I0909 04:59:20.483679 2665 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 04:59:20.483812 kubelet[2665]: I0909 04:59:20.483794 2665 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 9 04:59:20.483912 kubelet[2665]: I0909 04:59:20.483892 2665 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 04:59:20.484347 kubelet[2665]: I0909 04:59:20.483914 2665 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 04:59:20.484479 kubelet[2665]: I0909 04:59:20.484352 2665 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 04:59:20.484479 kubelet[2665]: I0909 04:59:20.484364 2665 container_manager_linux.go:300] "Creating device plugin manager"
Sep 9 04:59:20.484479 kubelet[2665]: I0909 04:59:20.484403 2665 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:59:20.484775 kubelet[2665]: I0909 04:59:20.484761 2665 kubelet.go:408] "Attempting to sync node with API server"
Sep 9 04:59:20.484815 kubelet[2665]: I0909 04:59:20.484788 2665 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 04:59:20.484815 kubelet[2665]: I0909 04:59:20.484808 2665 kubelet.go:314] "Adding apiserver pod source"
Sep 9 04:59:20.484862 kubelet[2665]: I0909 04:59:20.484817 2665 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 04:59:20.486616 kubelet[2665]: I0909 04:59:20.486585 2665 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 04:59:20.487071 kubelet[2665]: I0909 04:59:20.487025 2665 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 04:59:20.491394 kubelet[2665]: I0909 04:59:20.491367 2665 server.go:1274] "Started kubelet"
Sep 9 04:59:20.491579 kubelet[2665]: I0909 04:59:20.491539 2665 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 04:59:20.491843 kubelet[2665]: I0909 04:59:20.491796 2665 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 04:59:20.492062 kubelet[2665]: I0909 04:59:20.492045 2665 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 04:59:20.492381 kubelet[2665]: I0909 04:59:20.492359 2665 server.go:449] "Adding debug handlers to kubelet server"
Sep 9 04:59:20.497674 kubelet[2665]: E0909 04:59:20.497652 2665 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 04:59:20.498805 kubelet[2665]: I0909 04:59:20.498780 2665 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 04:59:20.499018 kubelet[2665]: I0909 04:59:20.499000 2665 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 04:59:20.499325 kubelet[2665]: E0909 04:59:20.499291 2665 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 04:59:20.499325 kubelet[2665]: I0909 04:59:20.499327 2665 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 9 04:59:20.499610 kubelet[2665]: I0909 04:59:20.499577 2665 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 9 04:59:20.499882 kubelet[2665]: I0909 04:59:20.499838 2665 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 04:59:20.501227 kubelet[2665]: I0909 04:59:20.501132 2665 factory.go:221] Registration of the systemd container factory successfully
Sep 9 04:59:20.501649 kubelet[2665]: I0909 04:59:20.501400 2665 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 04:59:20.504413 kubelet[2665]: I0909 04:59:20.504360 2665 factory.go:221] Registration of the containerd container factory successfully
Sep 9 04:59:20.515730 kubelet[2665]: I0909 04:59:20.515683 2665 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 04:59:20.516903 kubelet[2665]: I0909 04:59:20.516879 2665 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 04:59:20.516903 kubelet[2665]: I0909 04:59:20.516902 2665 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 9 04:59:20.517022 kubelet[2665]: I0909 04:59:20.516918 2665 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 9 04:59:20.517022 kubelet[2665]: E0909 04:59:20.516959 2665 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 04:59:20.536286 kubelet[2665]: I0909 04:59:20.536257 2665 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 9 04:59:20.536286 kubelet[2665]: I0909 04:59:20.536279 2665 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 9 04:59:20.536463 kubelet[2665]: I0909 04:59:20.536299 2665 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:59:20.536463 kubelet[2665]: I0909 04:59:20.536450 2665 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 9 04:59:20.536463 kubelet[2665]: I0909 04:59:20.536460 2665 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 9 04:59:20.536671 kubelet[2665]: I0909 04:59:20.536478 2665 policy_none.go:49] "None policy: Start"
Sep 9 04:59:20.537335 kubelet[2665]: I0909 04:59:20.537317 2665 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 9 04:59:20.537396 kubelet[2665]: I0909 04:59:20.537343 2665 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 04:59:20.537515 kubelet[2665]: I0909 04:59:20.537502 2665 state_mem.go:75] "Updated machine memory state"
Sep 9 04:59:20.541593 kubelet[2665]: I0909 04:59:20.541573 2665 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 04:59:20.541765 kubelet[2665]: I0909 04:59:20.541725 2665 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 04:59:20.541765 kubelet[2665]: I0909 04:59:20.541744 2665 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 04:59:20.541941 kubelet[2665]: I0909 04:59:20.541920 2665 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 04:59:20.625296 kubelet[2665]: E0909 04:59:20.625260 2665 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Sep 9 04:59:20.643515 kubelet[2665]: I0909 04:59:20.643475 2665 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Sep 9 04:59:20.649744 kubelet[2665]: I0909 04:59:20.649593 2665 kubelet_node_status.go:111] "Node was previously registered" node="localhost"
Sep 9 04:59:20.649927 kubelet[2665]: I0909 04:59:20.649912 2665 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Sep 9 04:59:20.701208 kubelet[2665]: I0909 04:59:20.701064 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dc878868de11c6196259ae42039f4ff-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"5dc878868de11c6196259ae42039f4ff\") " pod="kube-system/kube-scheduler-localhost"
Sep 9 04:59:20.701208 kubelet[2665]: I0909 04:59:20.701104 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/eee96a5d2f569f8950a31211aa4049a7-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"eee96a5d2f569f8950a31211aa4049a7\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:59:20.701208 kubelet[2665]: I0909 04:59:20.701129 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/eee96a5d2f569f8950a31211aa4049a7-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"eee96a5d2f569f8950a31211aa4049a7\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:59:20.701208 kubelet[2665]: I0909 04:59:20.701149 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:59:20.701208 kubelet[2665]: I0909 04:59:20.701165 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:59:20.701429 kubelet[2665]: I0909 04:59:20.701181 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:59:20.701429 kubelet[2665]: I0909 04:59:20.701229 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/eee96a5d2f569f8950a31211aa4049a7-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"eee96a5d2f569f8950a31211aa4049a7\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:59:20.703527 kubelet[2665]: I0909 04:59:20.701279 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:59:20.703527 kubelet[2665]: I0909 04:59:20.701754 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:59:21.485961 kubelet[2665]: I0909 04:59:21.485905 2665 apiserver.go:52] "Watching apiserver"
Sep 9 04:59:21.500021 kubelet[2665]: I0909 04:59:21.499733 2665 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 9 04:59:21.536505 kubelet[2665]: E0909 04:59:21.536355 2665 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Sep 9 04:59:21.567058 kubelet[2665]: I0909 04:59:21.566878 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=2.56684291 podStartE2EDuration="2.56684291s" podCreationTimestamp="2025-09-09 04:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:59:21.565935198 +0000 UTC m=+1.146003331" watchObservedRunningTime="2025-09-09 04:59:21.56684291 +0000 UTC m=+1.146911043"
Sep 9 04:59:21.575914 kubelet[2665]: I0909 04:59:21.575155 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.5751383479999999 podStartE2EDuration="1.575138348s" podCreationTimestamp="2025-09-09 04:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:59:21.575111662 +0000 UTC m=+1.155179795" watchObservedRunningTime="2025-09-09 04:59:21.575138348 +0000 UTC
m=+1.155206481" Sep 9 04:59:21.583406 kubelet[2665]: I0909 04:59:21.583333 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.5833190369999999 podStartE2EDuration="1.583319037s" podCreationTimestamp="2025-09-09 04:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:59:21.583029723 +0000 UTC m=+1.163097856" watchObservedRunningTime="2025-09-09 04:59:21.583319037 +0000 UTC m=+1.163387170" Sep 9 04:59:26.400633 kubelet[2665]: I0909 04:59:26.400600 2665 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 04:59:26.401274 kubelet[2665]: I0909 04:59:26.401154 2665 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 04:59:26.401316 containerd[1534]: time="2025-09-09T04:59:26.400899391Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 9 04:59:27.212901 systemd[1]: Created slice kubepods-besteffort-pod0e413938_ba18_4ab3_bae9_803fbe973dd1.slice - libcontainer container kubepods-besteffort-pod0e413938_ba18_4ab3_bae9_803fbe973dd1.slice. 
Sep 9 04:59:27.250832 kubelet[2665]: I0909 04:59:27.250777 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0e413938-ba18-4ab3-bae9-803fbe973dd1-kube-proxy\") pod \"kube-proxy-x29vp\" (UID: \"0e413938-ba18-4ab3-bae9-803fbe973dd1\") " pod="kube-system/kube-proxy-x29vp"
Sep 9 04:59:27.250832 kubelet[2665]: I0909 04:59:27.250840 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0e413938-ba18-4ab3-bae9-803fbe973dd1-xtables-lock\") pod \"kube-proxy-x29vp\" (UID: \"0e413938-ba18-4ab3-bae9-803fbe973dd1\") " pod="kube-system/kube-proxy-x29vp"
Sep 9 04:59:27.250975 kubelet[2665]: I0909 04:59:27.250858 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e413938-ba18-4ab3-bae9-803fbe973dd1-lib-modules\") pod \"kube-proxy-x29vp\" (UID: \"0e413938-ba18-4ab3-bae9-803fbe973dd1\") " pod="kube-system/kube-proxy-x29vp"
Sep 9 04:59:27.250975 kubelet[2665]: I0909 04:59:27.250878 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8p5\" (UniqueName: \"kubernetes.io/projected/0e413938-ba18-4ab3-bae9-803fbe973dd1-kube-api-access-xl8p5\") pod \"kube-proxy-x29vp\" (UID: \"0e413938-ba18-4ab3-bae9-803fbe973dd1\") " pod="kube-system/kube-proxy-x29vp"
Sep 9 04:59:27.429172 systemd[1]: Created slice kubepods-besteffort-pod89fd41ec_922c_458d_86d4_5dbc902881ed.slice - libcontainer container kubepods-besteffort-pod89fd41ec_922c_458d_86d4_5dbc902881ed.slice.
Sep 9 04:59:27.453110 kubelet[2665]: I0909 04:59:27.453074 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv6tx\" (UniqueName: \"kubernetes.io/projected/89fd41ec-922c-458d-86d4-5dbc902881ed-kube-api-access-pv6tx\") pod \"tigera-operator-58fc44c59b-9cvm2\" (UID: \"89fd41ec-922c-458d-86d4-5dbc902881ed\") " pod="tigera-operator/tigera-operator-58fc44c59b-9cvm2"
Sep 9 04:59:27.453559 kubelet[2665]: I0909 04:59:27.453500 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/89fd41ec-922c-458d-86d4-5dbc902881ed-var-lib-calico\") pod \"tigera-operator-58fc44c59b-9cvm2\" (UID: \"89fd41ec-922c-458d-86d4-5dbc902881ed\") " pod="tigera-operator/tigera-operator-58fc44c59b-9cvm2"
Sep 9 04:59:27.529188 containerd[1534]: time="2025-09-09T04:59:27.529139720Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-x29vp,Uid:0e413938-ba18-4ab3-bae9-803fbe973dd1,Namespace:kube-system,Attempt:0,}"
Sep 9 04:59:27.545944 containerd[1534]: time="2025-09-09T04:59:27.545907227Z" level=info msg="connecting to shim b63aa3d94bbda953da3cace4bde2f3295ea65b028ad149daa3649917de0b9ad9" address="unix:///run/containerd/s/76757728cfba124e8bbbbbd15e17c085a1868b9d3fba8ebf4593283575fb2b5a" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:59:27.573645 systemd[1]: Started cri-containerd-b63aa3d94bbda953da3cace4bde2f3295ea65b028ad149daa3649917de0b9ad9.scope - libcontainer container b63aa3d94bbda953da3cace4bde2f3295ea65b028ad149daa3649917de0b9ad9.
Sep 9 04:59:27.596644 containerd[1534]: time="2025-09-09T04:59:27.596606697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-x29vp,Uid:0e413938-ba18-4ab3-bae9-803fbe973dd1,Namespace:kube-system,Attempt:0,} returns sandbox id \"b63aa3d94bbda953da3cace4bde2f3295ea65b028ad149daa3649917de0b9ad9\""
Sep 9 04:59:27.600304 containerd[1534]: time="2025-09-09T04:59:27.600277653Z" level=info msg="CreateContainer within sandbox \"b63aa3d94bbda953da3cace4bde2f3295ea65b028ad149daa3649917de0b9ad9\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 9 04:59:27.610515 containerd[1534]: time="2025-09-09T04:59:27.609507533Z" level=info msg="Container 1f0363570f5f3c810c3641fe022df91d9ef442f91d49f886224cee737b021139: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:59:27.616260 containerd[1534]: time="2025-09-09T04:59:27.616215256Z" level=info msg="CreateContainer within sandbox \"b63aa3d94bbda953da3cace4bde2f3295ea65b028ad149daa3649917de0b9ad9\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"1f0363570f5f3c810c3641fe022df91d9ef442f91d49f886224cee737b021139\""
Sep 9 04:59:27.618145 containerd[1534]: time="2025-09-09T04:59:27.618109025Z" level=info msg="StartContainer for \"1f0363570f5f3c810c3641fe022df91d9ef442f91d49f886224cee737b021139\""
Sep 9 04:59:27.619613 containerd[1534]: time="2025-09-09T04:59:27.619578079Z" level=info msg="connecting to shim 1f0363570f5f3c810c3641fe022df91d9ef442f91d49f886224cee737b021139" address="unix:///run/containerd/s/76757728cfba124e8bbbbbd15e17c085a1868b9d3fba8ebf4593283575fb2b5a" protocol=ttrpc version=3
Sep 9 04:59:27.637665 systemd[1]: Started cri-containerd-1f0363570f5f3c810c3641fe022df91d9ef442f91d49f886224cee737b021139.scope - libcontainer container 1f0363570f5f3c810c3641fe022df91d9ef442f91d49f886224cee737b021139.
Sep 9 04:59:27.670475 containerd[1534]: time="2025-09-09T04:59:27.670435016Z" level=info msg="StartContainer for \"1f0363570f5f3c810c3641fe022df91d9ef442f91d49f886224cee737b021139\" returns successfully"
Sep 9 04:59:27.732066 containerd[1534]: time="2025-09-09T04:59:27.732023574Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-9cvm2,Uid:89fd41ec-922c-458d-86d4-5dbc902881ed,Namespace:tigera-operator,Attempt:0,}"
Sep 9 04:59:27.752995 containerd[1534]: time="2025-09-09T04:59:27.752946201Z" level=info msg="connecting to shim 30e7f94726daf92ed12aa65c595be4d58739066686678bf262e5c161cf1ef277" address="unix:///run/containerd/s/c6c515a03e139ac6abcd6f1da564c97fdabc4837e37a4c711c78bf0b21bc324d" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:59:27.780641 systemd[1]: Started cri-containerd-30e7f94726daf92ed12aa65c595be4d58739066686678bf262e5c161cf1ef277.scope - libcontainer container 30e7f94726daf92ed12aa65c595be4d58739066686678bf262e5c161cf1ef277.
Sep 9 04:59:27.813440 containerd[1534]: time="2025-09-09T04:59:27.813385039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-9cvm2,Uid:89fd41ec-922c-458d-86d4-5dbc902881ed,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"30e7f94726daf92ed12aa65c595be4d58739066686678bf262e5c161cf1ef277\""
Sep 9 04:59:27.816386 containerd[1534]: time="2025-09-09T04:59:27.816365996Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 9 04:59:28.368530 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4293372442.mount: Deactivated successfully.
Sep 9 04:59:28.557795 kubelet[2665]: I0909 04:59:28.557604 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-x29vp" podStartSLOduration=1.5575860750000001 podStartE2EDuration="1.557586075s" podCreationTimestamp="2025-09-09 04:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:59:28.555022339 +0000 UTC m=+8.135090472" watchObservedRunningTime="2025-09-09 04:59:28.557586075 +0000 UTC m=+8.137654208"
Sep 9 04:59:29.060437 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2600357733.mount: Deactivated successfully.
Sep 9 04:59:29.359926 containerd[1534]: time="2025-09-09T04:59:29.359813121Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:59:29.361146 containerd[1534]: time="2025-09-09T04:59:29.361105278Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365"
Sep 9 04:59:29.362572 containerd[1534]: time="2025-09-09T04:59:29.362534976Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:59:29.364860 containerd[1534]: time="2025-09-09T04:59:29.364823564Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:59:29.365577 containerd[1534]: time="2025-09-09T04:59:29.365556076Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 1.549162276s"
Sep 9 04:59:29.365629 containerd[1534]: time="2025-09-09T04:59:29.365580560Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\""
Sep 9 04:59:29.368120 containerd[1534]: time="2025-09-09T04:59:29.368094143Z" level=info msg="CreateContainer within sandbox \"30e7f94726daf92ed12aa65c595be4d58739066686678bf262e5c161cf1ef277\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 9 04:59:29.374609 containerd[1534]: time="2025-09-09T04:59:29.374581651Z" level=info msg="Container 86e23844ae3a033dac50418004bea0b903be987360c15611b815d6f747db5d8d: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:59:29.380537 containerd[1534]: time="2025-09-09T04:59:29.380469548Z" level=info msg="CreateContainer within sandbox \"30e7f94726daf92ed12aa65c595be4d58739066686678bf262e5c161cf1ef277\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"86e23844ae3a033dac50418004bea0b903be987360c15611b815d6f747db5d8d\""
Sep 9 04:59:29.380939 containerd[1534]: time="2025-09-09T04:59:29.380914136Z" level=info msg="StartContainer for \"86e23844ae3a033dac50418004bea0b903be987360c15611b815d6f747db5d8d\""
Sep 9 04:59:29.383665 containerd[1534]: time="2025-09-09T04:59:29.383623429Z" level=info msg="connecting to shim 86e23844ae3a033dac50418004bea0b903be987360c15611b815d6f747db5d8d" address="unix:///run/containerd/s/c6c515a03e139ac6abcd6f1da564c97fdabc4837e37a4c711c78bf0b21bc324d" protocol=ttrpc version=3
Sep 9 04:59:29.408653 systemd[1]: Started cri-containerd-86e23844ae3a033dac50418004bea0b903be987360c15611b815d6f747db5d8d.scope - libcontainer container 86e23844ae3a033dac50418004bea0b903be987360c15611b815d6f747db5d8d.
Sep 9 04:59:29.436036 containerd[1534]: time="2025-09-09T04:59:29.435990489Z" level=info msg="StartContainer for \"86e23844ae3a033dac50418004bea0b903be987360c15611b815d6f747db5d8d\" returns successfully"
Sep 9 04:59:29.566955 kubelet[2665]: I0909 04:59:29.566893 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-9cvm2" podStartSLOduration=1.015306963 podStartE2EDuration="2.566734171s" podCreationTimestamp="2025-09-09 04:59:27 +0000 UTC" firstStartedPulling="2025-09-09 04:59:27.815084534 +0000 UTC m=+7.395152667" lastFinishedPulling="2025-09-09 04:59:29.366511782 +0000 UTC m=+8.946579875" observedRunningTime="2025-09-09 04:59:29.558404622 +0000 UTC m=+9.138472755" watchObservedRunningTime="2025-09-09 04:59:29.566734171 +0000 UTC m=+9.146802304"
Sep 9 04:59:34.561501 update_engine[1510]: I20250909 04:59:34.561424 1510 update_attempter.cc:509] Updating boot flags...
Sep 9 04:59:34.916063 sudo[1740]: pam_unix(sudo:session): session closed for user root
Sep 9 04:59:34.918216 sshd[1739]: Connection closed by 10.0.0.1 port 41920
Sep 9 04:59:34.918825 sshd-session[1736]: pam_unix(sshd:session): session closed for user core
Sep 9 04:59:34.922115 systemd[1]: sshd@6-10.0.0.72:22-10.0.0.1:41920.service: Deactivated successfully.
Sep 9 04:59:34.923902 systemd[1]: session-7.scope: Deactivated successfully.
Sep 9 04:59:34.924069 systemd[1]: session-7.scope: Consumed 7.650s CPU time, 221.1M memory peak.
Sep 9 04:59:34.925926 systemd-logind[1504]: Session 7 logged out. Waiting for processes to exit.
Sep 9 04:59:34.927422 systemd-logind[1504]: Removed session 7.
Sep 9 04:59:39.968300 systemd[1]: Created slice kubepods-besteffort-podc215c12a_1c6c_442d_a584_7ba9e8403260.slice - libcontainer container kubepods-besteffort-podc215c12a_1c6c_442d_a584_7ba9e8403260.slice.
Sep 9 04:59:40.039181 kubelet[2665]: I0909 04:59:40.039100 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c215c12a-1c6c-442d-a584-7ba9e8403260-tigera-ca-bundle\") pod \"calico-typha-758d6b7598-m9xx6\" (UID: \"c215c12a-1c6c-442d-a584-7ba9e8403260\") " pod="calico-system/calico-typha-758d6b7598-m9xx6"
Sep 9 04:59:40.039181 kubelet[2665]: I0909 04:59:40.039182 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c215c12a-1c6c-442d-a584-7ba9e8403260-typha-certs\") pod \"calico-typha-758d6b7598-m9xx6\" (UID: \"c215c12a-1c6c-442d-a584-7ba9e8403260\") " pod="calico-system/calico-typha-758d6b7598-m9xx6"
Sep 9 04:59:40.039578 kubelet[2665]: I0909 04:59:40.039210 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt5nf\" (UniqueName: \"kubernetes.io/projected/c215c12a-1c6c-442d-a584-7ba9e8403260-kube-api-access-jt5nf\") pod \"calico-typha-758d6b7598-m9xx6\" (UID: \"c215c12a-1c6c-442d-a584-7ba9e8403260\") " pod="calico-system/calico-typha-758d6b7598-m9xx6"
Sep 9 04:59:40.138633 systemd[1]: Created slice kubepods-besteffort-pod5d5bc9d2_ccd5_40fa_891a_a60cf268d8e9.slice - libcontainer container kubepods-besteffort-pod5d5bc9d2_ccd5_40fa_891a_a60cf268d8e9.slice.
Sep 9 04:59:40.241309 kubelet[2665]: I0909 04:59:40.241206 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9-node-certs\") pod \"calico-node-hllcp\" (UID: \"5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9\") " pod="calico-system/calico-node-hllcp" Sep 9 04:59:40.241515 kubelet[2665]: I0909 04:59:40.241498 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9-cni-bin-dir\") pod \"calico-node-hllcp\" (UID: \"5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9\") " pod="calico-system/calico-node-hllcp" Sep 9 04:59:40.241826 kubelet[2665]: I0909 04:59:40.241691 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9-lib-modules\") pod \"calico-node-hllcp\" (UID: \"5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9\") " pod="calico-system/calico-node-hllcp" Sep 9 04:59:40.241826 kubelet[2665]: I0909 04:59:40.241716 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwtbj\" (UniqueName: \"kubernetes.io/projected/5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9-kube-api-access-fwtbj\") pod \"calico-node-hllcp\" (UID: \"5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9\") " pod="calico-system/calico-node-hllcp" Sep 9 04:59:40.242555 kubelet[2665]: I0909 04:59:40.242534 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9-tigera-ca-bundle\") pod \"calico-node-hllcp\" (UID: \"5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9\") " pod="calico-system/calico-node-hllcp" Sep 9 04:59:40.242708 kubelet[2665]: I0909 04:59:40.242601 2665 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9-cni-log-dir\") pod \"calico-node-hllcp\" (UID: \"5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9\") " pod="calico-system/calico-node-hllcp" Sep 9 04:59:40.242708 kubelet[2665]: I0909 04:59:40.242619 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9-flexvol-driver-host\") pod \"calico-node-hllcp\" (UID: \"5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9\") " pod="calico-system/calico-node-hllcp" Sep 9 04:59:40.242708 kubelet[2665]: I0909 04:59:40.242644 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9-cni-net-dir\") pod \"calico-node-hllcp\" (UID: \"5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9\") " pod="calico-system/calico-node-hllcp" Sep 9 04:59:40.242869 kubelet[2665]: I0909 04:59:40.242815 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9-var-lib-calico\") pod \"calico-node-hllcp\" (UID: \"5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9\") " pod="calico-system/calico-node-hllcp" Sep 9 04:59:40.242869 kubelet[2665]: I0909 04:59:40.242841 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9-var-run-calico\") pod \"calico-node-hllcp\" (UID: \"5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9\") " pod="calico-system/calico-node-hllcp" Sep 9 04:59:40.243027 kubelet[2665]: I0909 04:59:40.242954 2665 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9-policysync\") pod \"calico-node-hllcp\" (UID: \"5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9\") " pod="calico-system/calico-node-hllcp" Sep 9 04:59:40.243027 kubelet[2665]: I0909 04:59:40.242990 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9-xtables-lock\") pod \"calico-node-hllcp\" (UID: \"5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9\") " pod="calico-system/calico-node-hllcp" Sep 9 04:59:40.287314 containerd[1534]: time="2025-09-09T04:59:40.287275902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-758d6b7598-m9xx6,Uid:c215c12a-1c6c-442d-a584-7ba9e8403260,Namespace:calico-system,Attempt:0,}" Sep 9 04:59:40.332499 kubelet[2665]: E0909 04:59:40.332429 2665 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kgwfz" podUID="ff1d9b5f-1e61-411f-bd7e-fee7b5332631" Sep 9 04:59:40.336823 containerd[1534]: time="2025-09-09T04:59:40.336675723Z" level=info msg="connecting to shim a2667e54c54b4931fcfa2427ea95d3a6b5c43ea1f0dab0678f9e3f08e1cad516" address="unix:///run/containerd/s/129926f20902dc73f1bec00d882302d6656350daa049dd57806a74871a7a8e87" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:59:40.361874 kubelet[2665]: E0909 04:59:40.361843 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.362514 kubelet[2665]: W0909 04:59:40.362002 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.362514 kubelet[2665]: E0909 04:59:40.362033 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.398935 systemd[1]: Started cri-containerd-a2667e54c54b4931fcfa2427ea95d3a6b5c43ea1f0dab0678f9e3f08e1cad516.scope - libcontainer container a2667e54c54b4931fcfa2427ea95d3a6b5c43ea1f0dab0678f9e3f08e1cad516. Sep 9 04:59:40.429159 kubelet[2665]: E0909 04:59:40.429125 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.429469 kubelet[2665]: W0909 04:59:40.429433 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.430071 kubelet[2665]: E0909 04:59:40.429916 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.432642 kubelet[2665]: E0909 04:59:40.432560 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.432642 kubelet[2665]: W0909 04:59:40.432580 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.432642 kubelet[2665]: E0909 04:59:40.432596 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:59:40.433561 kubelet[2665]: E0909 04:59:40.433541 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.433713 kubelet[2665]: W0909 04:59:40.433643 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.433713 kubelet[2665]: E0909 04:59:40.433663 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.433971 kubelet[2665]: E0909 04:59:40.433957 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.434478 kubelet[2665]: W0909 04:59:40.434040 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.434478 kubelet[2665]: E0909 04:59:40.434058 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:59:40.434895 kubelet[2665]: E0909 04:59:40.434878 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.435185 kubelet[2665]: W0909 04:59:40.434997 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.435185 kubelet[2665]: E0909 04:59:40.435016 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.436235 kubelet[2665]: E0909 04:59:40.435453 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.436235 kubelet[2665]: W0909 04:59:40.435818 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.436235 kubelet[2665]: E0909 04:59:40.435839 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:59:40.436733 kubelet[2665]: E0909 04:59:40.436680 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.437078 kubelet[2665]: W0909 04:59:40.437057 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.437374 kubelet[2665]: E0909 04:59:40.437355 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.439787 kubelet[2665]: E0909 04:59:40.439662 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.439787 kubelet[2665]: W0909 04:59:40.439678 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.439787 kubelet[2665]: E0909 04:59:40.439689 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:59:40.447815 containerd[1534]: time="2025-09-09T04:59:40.447777127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hllcp,Uid:5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9,Namespace:calico-system,Attempt:0,}" Sep 9 04:59:40.447929 kubelet[2665]: E0909 04:59:40.447913 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.447997 kubelet[2665]: W0909 04:59:40.447984 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.448055 kubelet[2665]: E0909 04:59:40.448045 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.448846 kubelet[2665]: I0909 04:59:40.448470 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff1d9b5f-1e61-411f-bd7e-fee7b5332631-kubelet-dir\") pod \"csi-node-driver-kgwfz\" (UID: \"ff1d9b5f-1e61-411f-bd7e-fee7b5332631\") " pod="calico-system/csi-node-driver-kgwfz" Sep 9 04:59:40.449296 kubelet[2665]: E0909 04:59:40.449102 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.449296 kubelet[2665]: W0909 04:59:40.449142 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.449296 kubelet[2665]: E0909 04:59:40.449159 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:59:40.451611 kubelet[2665]: I0909 04:59:40.451587 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ff1d9b5f-1e61-411f-bd7e-fee7b5332631-registration-dir\") pod \"csi-node-driver-kgwfz\" (UID: \"ff1d9b5f-1e61-411f-bd7e-fee7b5332631\") " pod="calico-system/csi-node-driver-kgwfz" Sep 9 04:59:40.452578 kubelet[2665]: E0909 04:59:40.452551 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.452578 kubelet[2665]: W0909 04:59:40.452565 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.452710 kubelet[2665]: E0909 04:59:40.452666 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.452909 kubelet[2665]: E0909 04:59:40.452887 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.453007 kubelet[2665]: W0909 04:59:40.452987 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.453064 kubelet[2665]: E0909 04:59:40.453053 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:59:40.455565 kubelet[2665]: E0909 04:59:40.455508 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.455565 kubelet[2665]: W0909 04:59:40.455530 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.455565 kubelet[2665]: E0909 04:59:40.455545 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.455795 kubelet[2665]: I0909 04:59:40.455678 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/ff1d9b5f-1e61-411f-bd7e-fee7b5332631-varrun\") pod \"csi-node-driver-kgwfz\" (UID: \"ff1d9b5f-1e61-411f-bd7e-fee7b5332631\") " pod="calico-system/csi-node-driver-kgwfz" Sep 9 04:59:40.455984 kubelet[2665]: E0909 04:59:40.455954 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.455984 kubelet[2665]: W0909 04:59:40.455971 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.456079 kubelet[2665]: E0909 04:59:40.456067 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:59:40.456299 kubelet[2665]: I0909 04:59:40.456275 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svc7g\" (UniqueName: \"kubernetes.io/projected/ff1d9b5f-1e61-411f-bd7e-fee7b5332631-kube-api-access-svc7g\") pod \"csi-node-driver-kgwfz\" (UID: \"ff1d9b5f-1e61-411f-bd7e-fee7b5332631\") " pod="calico-system/csi-node-driver-kgwfz" Sep 9 04:59:40.456441 kubelet[2665]: E0909 04:59:40.456431 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.456535 kubelet[2665]: W0909 04:59:40.456521 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.456597 kubelet[2665]: E0909 04:59:40.456587 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.456930 kubelet[2665]: E0909 04:59:40.456912 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.457010 kubelet[2665]: W0909 04:59:40.456996 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.457102 kubelet[2665]: E0909 04:59:40.457060 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:59:40.457609 kubelet[2665]: E0909 04:59:40.457594 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.457869 kubelet[2665]: W0909 04:59:40.457731 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.457869 kubelet[2665]: E0909 04:59:40.457754 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.457869 kubelet[2665]: I0909 04:59:40.457773 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ff1d9b5f-1e61-411f-bd7e-fee7b5332631-socket-dir\") pod \"csi-node-driver-kgwfz\" (UID: \"ff1d9b5f-1e61-411f-bd7e-fee7b5332631\") " pod="calico-system/csi-node-driver-kgwfz" Sep 9 04:59:40.458055 kubelet[2665]: E0909 04:59:40.458040 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.458108 kubelet[2665]: W0909 04:59:40.458097 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.458163 kubelet[2665]: E0909 04:59:40.458153 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:59:40.459645 kubelet[2665]: E0909 04:59:40.459454 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.459645 kubelet[2665]: W0909 04:59:40.459602 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.459645 kubelet[2665]: E0909 04:59:40.459622 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.467277 containerd[1534]: time="2025-09-09T04:59:40.467237426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-758d6b7598-m9xx6,Uid:c215c12a-1c6c-442d-a584-7ba9e8403260,Namespace:calico-system,Attempt:0,} returns sandbox id \"a2667e54c54b4931fcfa2427ea95d3a6b5c43ea1f0dab0678f9e3f08e1cad516\"" Sep 9 04:59:40.475865 containerd[1534]: time="2025-09-09T04:59:40.475825229Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 04:59:40.480954 containerd[1534]: time="2025-09-09T04:59:40.480916010Z" level=info msg="connecting to shim ba3e7f25ad6d3a3b742adae2e74449b027b15007f419a910e966d39bfe88bebb" address="unix:///run/containerd/s/b848f5f2a9c59ee4b9eb3c54cc9757a7411a7d985550db872103fad5b9792d1b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:59:40.515655 systemd[1]: Started cri-containerd-ba3e7f25ad6d3a3b742adae2e74449b027b15007f419a910e966d39bfe88bebb.scope - libcontainer container ba3e7f25ad6d3a3b742adae2e74449b027b15007f419a910e966d39bfe88bebb. 
Sep 9 04:59:40.560749 kubelet[2665]: E0909 04:59:40.560717 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.560749 kubelet[2665]: W0909 04:59:40.560743 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.560917 kubelet[2665]: E0909 04:59:40.560761 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.561203 kubelet[2665]: E0909 04:59:40.561191 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.561250 kubelet[2665]: W0909 04:59:40.561203 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.561250 kubelet[2665]: E0909 04:59:40.561221 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.561430 kubelet[2665]: E0909 04:59:40.561415 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.561430 kubelet[2665]: W0909 04:59:40.561428 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.561504 kubelet[2665]: E0909 04:59:40.561439 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:59:40.567417 kubelet[2665]: E0909 04:59:40.567404 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.567453 kubelet[2665]: W0909 04:59:40.567417 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.567453 kubelet[2665]: E0909 04:59:40.567432 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:40.567692 kubelet[2665]: E0909 04:59:40.567666 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.567692 kubelet[2665]: W0909 04:59:40.567678 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.567692 kubelet[2665]: E0909 04:59:40.567688 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:59:40.569231 containerd[1534]: time="2025-09-09T04:59:40.569196785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hllcp,Uid:5d5bc9d2-ccd5-40fa-891a-a60cf268d8e9,Namespace:calico-system,Attempt:0,} returns sandbox id \"ba3e7f25ad6d3a3b742adae2e74449b027b15007f419a910e966d39bfe88bebb\"" Sep 9 04:59:40.577995 kubelet[2665]: E0909 04:59:40.577974 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:59:40.577995 kubelet[2665]: W0909 04:59:40.577992 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:59:40.578093 kubelet[2665]: E0909 04:59:40.578006 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:59:41.702708 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3976304713.mount: Deactivated successfully. 
Sep 9 04:59:42.518172 kubelet[2665]: E0909 04:59:42.518120 2665 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kgwfz" podUID="ff1d9b5f-1e61-411f-bd7e-fee7b5332631"
Sep 9 04:59:42.932179 containerd[1534]: time="2025-09-09T04:59:42.932123937Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:59:42.933268 containerd[1534]: time="2025-09-09T04:59:42.933233450Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775"
Sep 9 04:59:42.934508 containerd[1534]: time="2025-09-09T04:59:42.934240556Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:59:42.936543 containerd[1534]: time="2025-09-09T04:59:42.936511026Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:59:42.937054 containerd[1534]: time="2025-09-09T04:59:42.937024860Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.46106346s"
Sep 9 04:59:42.937115 containerd[1534]: time="2025-09-09T04:59:42.937057862Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\""
Sep 9 04:59:42.938656 containerd[1534]: time="2025-09-09T04:59:42.938633766Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 9 04:59:42.965072 containerd[1534]: time="2025-09-09T04:59:42.964865853Z" level=info msg="CreateContainer within sandbox \"a2667e54c54b4931fcfa2427ea95d3a6b5c43ea1f0dab0678f9e3f08e1cad516\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 9 04:59:42.979551 containerd[1534]: time="2025-09-09T04:59:42.979501177Z" level=info msg="Container b0a1f7ee4e48638aad3588e3eef74bad48a65348ae1eb5b18e39c42569ae49f6: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:59:42.993802 containerd[1534]: time="2025-09-09T04:59:42.993756636Z" level=info msg="CreateContainer within sandbox \"a2667e54c54b4931fcfa2427ea95d3a6b5c43ea1f0dab0678f9e3f08e1cad516\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"b0a1f7ee4e48638aad3588e3eef74bad48a65348ae1eb5b18e39c42569ae49f6\""
Sep 9 04:59:42.995519 containerd[1534]: time="2025-09-09T04:59:42.995143887Z" level=info msg="StartContainer for \"b0a1f7ee4e48638aad3588e3eef74bad48a65348ae1eb5b18e39c42569ae49f6\""
Sep 9 04:59:42.997097 containerd[1534]: time="2025-09-09T04:59:42.997053413Z" level=info msg="connecting to shim b0a1f7ee4e48638aad3588e3eef74bad48a65348ae1eb5b18e39c42569ae49f6" address="unix:///run/containerd/s/129926f20902dc73f1bec00d882302d6656350daa049dd57806a74871a7a8e87" protocol=ttrpc version=3
Sep 9 04:59:43.018651 systemd[1]: Started cri-containerd-b0a1f7ee4e48638aad3588e3eef74bad48a65348ae1eb5b18e39c42569ae49f6.scope - libcontainer container b0a1f7ee4e48638aad3588e3eef74bad48a65348ae1eb5b18e39c42569ae49f6.
Sep 9 04:59:43.056763 containerd[1534]: time="2025-09-09T04:59:43.056522418Z" level=info msg="StartContainer for \"b0a1f7ee4e48638aad3588e3eef74bad48a65348ae1eb5b18e39c42569ae49f6\" returns successfully"
Sep 9 04:59:43.672743 kubelet[2665]: E0909 04:59:43.672717 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.672743 kubelet[2665]: W0909 04:59:43.672737 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.673149 kubelet[2665]: E0909 04:59:43.672755 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.673149 kubelet[2665]: E0909 04:59:43.672915 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.673149 kubelet[2665]: W0909 04:59:43.672922 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.673149 kubelet[2665]: E0909 04:59:43.672930 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.673149 kubelet[2665]: E0909 04:59:43.673045 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.673149 kubelet[2665]: W0909 04:59:43.673052 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.673149 kubelet[2665]: E0909 04:59:43.673059 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.673295 kubelet[2665]: E0909 04:59:43.673186 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.673295 kubelet[2665]: W0909 04:59:43.673194 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.673295 kubelet[2665]: E0909 04:59:43.673201 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.673361 kubelet[2665]: E0909 04:59:43.673322 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.673361 kubelet[2665]: W0909 04:59:43.673330 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.673361 kubelet[2665]: E0909 04:59:43.673337 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.673464 kubelet[2665]: E0909 04:59:43.673448 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.673464 kubelet[2665]: W0909 04:59:43.673458 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.673543 kubelet[2665]: E0909 04:59:43.673466 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.673604 kubelet[2665]: E0909 04:59:43.673593 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.673604 kubelet[2665]: W0909 04:59:43.673602 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.673667 kubelet[2665]: E0909 04:59:43.673610 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.673743 kubelet[2665]: E0909 04:59:43.673728 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.673743 kubelet[2665]: W0909 04:59:43.673739 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.673796 kubelet[2665]: E0909 04:59:43.673746 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.673881 kubelet[2665]: E0909 04:59:43.673871 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.673881 kubelet[2665]: W0909 04:59:43.673880 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.673946 kubelet[2665]: E0909 04:59:43.673888 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.674014 kubelet[2665]: E0909 04:59:43.674000 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.674014 kubelet[2665]: W0909 04:59:43.674010 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.674074 kubelet[2665]: E0909 04:59:43.674020 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.674171 kubelet[2665]: E0909 04:59:43.674139 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.674171 kubelet[2665]: W0909 04:59:43.674149 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.674171 kubelet[2665]: E0909 04:59:43.674156 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.674306 kubelet[2665]: E0909 04:59:43.674279 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.674306 kubelet[2665]: W0909 04:59:43.674286 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.674306 kubelet[2665]: E0909 04:59:43.674294 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.674573 kubelet[2665]: E0909 04:59:43.674415 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.674573 kubelet[2665]: W0909 04:59:43.674421 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.674573 kubelet[2665]: E0909 04:59:43.674429 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.674573 kubelet[2665]: E0909 04:59:43.674560 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.674573 kubelet[2665]: W0909 04:59:43.674571 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.674682 kubelet[2665]: E0909 04:59:43.674579 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.674753 kubelet[2665]: E0909 04:59:43.674716 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.674753 kubelet[2665]: W0909 04:59:43.674726 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.674753 kubelet[2665]: E0909 04:59:43.674734 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.684548 kubelet[2665]: E0909 04:59:43.684522 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.684548 kubelet[2665]: W0909 04:59:43.684541 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.684667 kubelet[2665]: E0909 04:59:43.684556 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.684755 kubelet[2665]: E0909 04:59:43.684740 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.684755 kubelet[2665]: W0909 04:59:43.684751 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.684822 kubelet[2665]: E0909 04:59:43.684766 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.684934 kubelet[2665]: E0909 04:59:43.684918 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.684934 kubelet[2665]: W0909 04:59:43.684929 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.684985 kubelet[2665]: E0909 04:59:43.684942 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.685224 kubelet[2665]: E0909 04:59:43.685209 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.685268 kubelet[2665]: W0909 04:59:43.685224 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.685268 kubelet[2665]: E0909 04:59:43.685241 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.685399 kubelet[2665]: E0909 04:59:43.685384 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.685399 kubelet[2665]: W0909 04:59:43.685395 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.685454 kubelet[2665]: E0909 04:59:43.685408 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.685821 kubelet[2665]: E0909 04:59:43.685805 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.685858 kubelet[2665]: W0909 04:59:43.685822 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.685858 kubelet[2665]: E0909 04:59:43.685839 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.686536 kubelet[2665]: E0909 04:59:43.686518 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.686573 kubelet[2665]: W0909 04:59:43.686537 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.686595 kubelet[2665]: E0909 04:59:43.686571 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.686731 kubelet[2665]: E0909 04:59:43.686717 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.686731 kubelet[2665]: W0909 04:59:43.686730 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.686784 kubelet[2665]: E0909 04:59:43.686743 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.686913 kubelet[2665]: E0909 04:59:43.686899 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.686913 kubelet[2665]: W0909 04:59:43.686912 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.686970 kubelet[2665]: E0909 04:59:43.686938 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.687100 kubelet[2665]: E0909 04:59:43.687087 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.687100 kubelet[2665]: W0909 04:59:43.687098 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.687157 kubelet[2665]: E0909 04:59:43.687121 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.687292 kubelet[2665]: E0909 04:59:43.687279 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.687292 kubelet[2665]: W0909 04:59:43.687290 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.687341 kubelet[2665]: E0909 04:59:43.687304 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.687434 kubelet[2665]: E0909 04:59:43.687424 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.687459 kubelet[2665]: W0909 04:59:43.687434 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.687459 kubelet[2665]: E0909 04:59:43.687446 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.687633 kubelet[2665]: E0909 04:59:43.687621 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.687633 kubelet[2665]: W0909 04:59:43.687632 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.687680 kubelet[2665]: E0909 04:59:43.687645 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.687862 kubelet[2665]: E0909 04:59:43.687846 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.687890 kubelet[2665]: W0909 04:59:43.687861 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.687890 kubelet[2665]: E0909 04:59:43.687872 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.688005 kubelet[2665]: E0909 04:59:43.687994 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.688005 kubelet[2665]: W0909 04:59:43.688004 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.688059 kubelet[2665]: E0909 04:59:43.688018 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.688182 kubelet[2665]: E0909 04:59:43.688169 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.688182 kubelet[2665]: W0909 04:59:43.688180 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.688232 kubelet[2665]: E0909 04:59:43.688193 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.688410 kubelet[2665]: E0909 04:59:43.688392 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.688438 kubelet[2665]: W0909 04:59:43.688409 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.688438 kubelet[2665]: E0909 04:59:43.688424 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:43.688825 kubelet[2665]: E0909 04:59:43.688808 2665 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:59:43.688863 kubelet[2665]: W0909 04:59:43.688823 2665 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:59:43.688863 kubelet[2665]: E0909 04:59:43.688841 2665 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:59:44.012738 containerd[1534]: time="2025-09-09T04:59:44.012636441Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:59:44.014594 containerd[1534]: time="2025-09-09T04:59:44.014565472Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814"
Sep 9 04:59:44.015548 containerd[1534]: time="2025-09-09T04:59:44.015520088Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:59:44.017452 containerd[1534]: time="2025-09-09T04:59:44.017401877Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:59:44.017902 containerd[1534]: time="2025-09-09T04:59:44.017865263Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.079068847s"
Sep 9 04:59:44.017902 containerd[1534]: time="2025-09-09T04:59:44.017899545Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\""
Sep 9 04:59:44.023705 containerd[1534]: time="2025-09-09T04:59:44.023661799Z" level=info msg="CreateContainer within sandbox \"ba3e7f25ad6d3a3b742adae2e74449b027b15007f419a910e966d39bfe88bebb\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Sep 9 04:59:44.030998 containerd[1534]: time="2025-09-09T04:59:44.030837774Z" level=info msg="Container 795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:59:44.039296 containerd[1534]: time="2025-09-09T04:59:44.039256501Z" level=info msg="CreateContainer within sandbox \"ba3e7f25ad6d3a3b742adae2e74449b027b15007f419a910e966d39bfe88bebb\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1\""
Sep 9 04:59:44.040077 containerd[1534]: time="2025-09-09T04:59:44.040049907Z" level=info msg="StartContainer for \"795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1\""
Sep 9 04:59:44.041474 containerd[1534]: time="2025-09-09T04:59:44.041411946Z" level=info msg="connecting to shim 795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1" address="unix:///run/containerd/s/b848f5f2a9c59ee4b9eb3c54cc9757a7411a7d985550db872103fad5b9792d1b" protocol=ttrpc version=3
Sep 9 04:59:44.065648 systemd[1]: Started cri-containerd-795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1.scope - libcontainer container 795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1.
Sep 9 04:59:44.100997 containerd[1534]: time="2025-09-09T04:59:44.100891629Z" level=info msg="StartContainer for \"795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1\" returns successfully" Sep 9 04:59:44.118662 systemd[1]: cri-containerd-795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1.scope: Deactivated successfully. Sep 9 04:59:44.118937 systemd[1]: cri-containerd-795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1.scope: Consumed 29ms CPU time, 6.3M memory peak, 4.1M written to disk. Sep 9 04:59:44.139505 containerd[1534]: time="2025-09-09T04:59:44.139402938Z" level=info msg="received exit event container_id:\"795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1\" id:\"795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1\" pid:3348 exited_at:{seconds:1757393984 nanos:134354566}" Sep 9 04:59:44.139608 containerd[1534]: time="2025-09-09T04:59:44.139515144Z" level=info msg="TaskExit event in podsandbox handler container_id:\"795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1\" id:\"795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1\" pid:3348 exited_at:{seconds:1757393984 nanos:134354566}" Sep 9 04:59:44.175091 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-795449b52c257500f93070898dd2e940e752d9ce0c297826cd38321878e89fa1-rootfs.mount: Deactivated successfully. 
Sep 9 04:59:44.518181 kubelet[2665]: E0909 04:59:44.518127 2665 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kgwfz" podUID="ff1d9b5f-1e61-411f-bd7e-fee7b5332631" Sep 9 04:59:44.587667 kubelet[2665]: I0909 04:59:44.587638 2665 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:59:44.589176 containerd[1534]: time="2025-09-09T04:59:44.589126488Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 04:59:44.605351 kubelet[2665]: I0909 04:59:44.605082 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-758d6b7598-m9xx6" podStartSLOduration=3.141652665 podStartE2EDuration="5.60506209s" podCreationTimestamp="2025-09-09 04:59:39 +0000 UTC" firstStartedPulling="2025-09-09 04:59:40.474810833 +0000 UTC m=+20.054878966" lastFinishedPulling="2025-09-09 04:59:42.938220258 +0000 UTC m=+22.518288391" observedRunningTime="2025-09-09 04:59:43.59722252 +0000 UTC m=+23.177290693" watchObservedRunningTime="2025-09-09 04:59:44.60506209 +0000 UTC m=+24.185130223" Sep 9 04:59:46.477654 kubelet[2665]: I0909 04:59:46.477620 2665 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:59:46.518504 kubelet[2665]: E0909 04:59:46.518069 2665 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kgwfz" podUID="ff1d9b5f-1e61-411f-bd7e-fee7b5332631" Sep 9 04:59:47.362103 containerd[1534]: time="2025-09-09T04:59:47.362053116Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Sep 9 04:59:47.363185 containerd[1534]: time="2025-09-09T04:59:47.363158729Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 9 04:59:47.367016 containerd[1534]: time="2025-09-09T04:59:47.366818783Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:47.368738 containerd[1534]: time="2025-09-09T04:59:47.368707713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:47.374779 containerd[1534]: time="2025-09-09T04:59:47.374741641Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.785566191s" Sep 9 04:59:47.374907 containerd[1534]: time="2025-09-09T04:59:47.374873408Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 9 04:59:47.380060 containerd[1534]: time="2025-09-09T04:59:47.380023933Z" level=info msg="CreateContainer within sandbox \"ba3e7f25ad6d3a3b742adae2e74449b027b15007f419a910e966d39bfe88bebb\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 04:59:47.398741 containerd[1534]: time="2025-09-09T04:59:47.398690103Z" level=info msg="Container f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:59:47.407161 containerd[1534]: time="2025-09-09T04:59:47.407112865Z" level=info 
msg="CreateContainer within sandbox \"ba3e7f25ad6d3a3b742adae2e74449b027b15007f419a910e966d39bfe88bebb\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a\"" Sep 9 04:59:47.407877 containerd[1534]: time="2025-09-09T04:59:47.407850700Z" level=info msg="StartContainer for \"f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a\"" Sep 9 04:59:47.409344 containerd[1534]: time="2025-09-09T04:59:47.409304930Z" level=info msg="connecting to shim f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a" address="unix:///run/containerd/s/b848f5f2a9c59ee4b9eb3c54cc9757a7411a7d985550db872103fad5b9792d1b" protocol=ttrpc version=3 Sep 9 04:59:47.435651 systemd[1]: Started cri-containerd-f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a.scope - libcontainer container f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a. Sep 9 04:59:47.496576 containerd[1534]: time="2025-09-09T04:59:47.496533610Z" level=info msg="StartContainer for \"f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a\" returns successfully" Sep 9 04:59:47.973399 systemd[1]: cri-containerd-f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a.scope: Deactivated successfully. Sep 9 04:59:47.973746 systemd[1]: cri-containerd-f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a.scope: Consumed 453ms CPU time, 173.3M memory peak, 2.2M read from disk, 165.8M written to disk. 
Sep 9 04:59:47.976158 containerd[1534]: time="2025-09-09T04:59:47.976030318Z" level=info msg="received exit event container_id:\"f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a\" id:\"f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a\" pid:3409 exited_at:{seconds:1757393987 nanos:975694102}" Sep 9 04:59:47.976158 containerd[1534]: time="2025-09-09T04:59:47.976122883Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a\" id:\"f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a\" pid:3409 exited_at:{seconds:1757393987 nanos:975694102}" Sep 9 04:59:47.985737 containerd[1534]: time="2025-09-09T04:59:47.985276599Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 04:59:48.000362 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f32e72dd35cb55668712ed7683967e029b91d2c9e6c1603a75b068b87d56c27a-rootfs.mount: Deactivated successfully. Sep 9 04:59:48.088513 kubelet[2665]: I0909 04:59:48.088081 2665 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 9 04:59:48.149080 systemd[1]: Created slice kubepods-burstable-poda78e19f7_e135_4e75_9a5d_fc71b1732548.slice - libcontainer container kubepods-burstable-poda78e19f7_e135_4e75_9a5d_fc71b1732548.slice. Sep 9 04:59:48.157292 systemd[1]: Created slice kubepods-besteffort-pod33b022e7_d521_48f6_96bd_a9569f5800c6.slice - libcontainer container kubepods-besteffort-pod33b022e7_d521_48f6_96bd_a9569f5800c6.slice. Sep 9 04:59:48.162512 systemd[1]: Created slice kubepods-burstable-pod4a205d0f_8b34_4ade_831f_621b0059d51f.slice - libcontainer container kubepods-burstable-pod4a205d0f_8b34_4ade_831f_621b0059d51f.slice. 
Sep 9 04:59:48.171966 systemd[1]: Created slice kubepods-besteffort-podebb4a83d_5f9f_41ce_8042_1eac984fa7d5.slice - libcontainer container kubepods-besteffort-podebb4a83d_5f9f_41ce_8042_1eac984fa7d5.slice. Sep 9 04:59:48.179010 systemd[1]: Created slice kubepods-besteffort-pod4c0b831d_41fe_4470_964b_e4d28117b02b.slice - libcontainer container kubepods-besteffort-pod4c0b831d_41fe_4470_964b_e4d28117b02b.slice. Sep 9 04:59:48.185985 systemd[1]: Created slice kubepods-besteffort-pod0aaa75f8_a1cd_46bc_9799_0bb8fa9008b1.slice - libcontainer container kubepods-besteffort-pod0aaa75f8_a1cd_46bc_9799_0bb8fa9008b1.slice. Sep 9 04:59:48.190862 systemd[1]: Created slice kubepods-besteffort-podb1b0c4cf_4acb_4ff4_aa58_79ffc93c562a.slice - libcontainer container kubepods-besteffort-podb1b0c4cf_4acb_4ff4_aa58_79ffc93c562a.slice. Sep 9 04:59:48.216201 kubelet[2665]: I0909 04:59:48.216165 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebb4a83d-5f9f-41ce-8042-1eac984fa7d5-config\") pod \"goldmane-7988f88666-mrj5t\" (UID: \"ebb4a83d-5f9f-41ce-8042-1eac984fa7d5\") " pod="calico-system/goldmane-7988f88666-mrj5t" Sep 9 04:59:48.216201 kubelet[2665]: I0909 04:59:48.216207 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5rt9\" (UniqueName: \"kubernetes.io/projected/33b022e7-d521-48f6-96bd-a9569f5800c6-kube-api-access-w5rt9\") pod \"calico-kube-controllers-5f86fdbd-xnjnq\" (UID: \"33b022e7-d521-48f6-96bd-a9569f5800c6\") " pod="calico-system/calico-kube-controllers-5f86fdbd-xnjnq" Sep 9 04:59:48.216357 kubelet[2665]: I0909 04:59:48.216231 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx2sm\" (UniqueName: \"kubernetes.io/projected/a78e19f7-e135-4e75-9a5d-fc71b1732548-kube-api-access-kx2sm\") pod \"coredns-7c65d6cfc9-m56jh\" (UID: 
\"a78e19f7-e135-4e75-9a5d-fc71b1732548\") " pod="kube-system/coredns-7c65d6cfc9-m56jh" Sep 9 04:59:48.216357 kubelet[2665]: I0909 04:59:48.216249 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8572\" (UniqueName: \"kubernetes.io/projected/ebb4a83d-5f9f-41ce-8042-1eac984fa7d5-kube-api-access-k8572\") pod \"goldmane-7988f88666-mrj5t\" (UID: \"ebb4a83d-5f9f-41ce-8042-1eac984fa7d5\") " pod="calico-system/goldmane-7988f88666-mrj5t" Sep 9 04:59:48.216357 kubelet[2665]: I0909 04:59:48.216270 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a78e19f7-e135-4e75-9a5d-fc71b1732548-config-volume\") pod \"coredns-7c65d6cfc9-m56jh\" (UID: \"a78e19f7-e135-4e75-9a5d-fc71b1732548\") " pod="kube-system/coredns-7c65d6cfc9-m56jh" Sep 9 04:59:48.216357 kubelet[2665]: I0909 04:59:48.216303 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33b022e7-d521-48f6-96bd-a9569f5800c6-tigera-ca-bundle\") pod \"calico-kube-controllers-5f86fdbd-xnjnq\" (UID: \"33b022e7-d521-48f6-96bd-a9569f5800c6\") " pod="calico-system/calico-kube-controllers-5f86fdbd-xnjnq" Sep 9 04:59:48.216357 kubelet[2665]: I0909 04:59:48.216330 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a-whisker-backend-key-pair\") pod \"whisker-6c6d659cc-r5jzx\" (UID: \"b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a\") " pod="calico-system/whisker-6c6d659cc-r5jzx" Sep 9 04:59:48.216563 kubelet[2665]: I0909 04:59:48.216347 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwztg\" (UniqueName: 
\"kubernetes.io/projected/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a-kube-api-access-kwztg\") pod \"whisker-6c6d659cc-r5jzx\" (UID: \"b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a\") " pod="calico-system/whisker-6c6d659cc-r5jzx" Sep 9 04:59:48.216563 kubelet[2665]: I0909 04:59:48.216365 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5sxj\" (UniqueName: \"kubernetes.io/projected/4c0b831d-41fe-4470-964b-e4d28117b02b-kube-api-access-p5sxj\") pod \"calico-apiserver-6d964bcddb-t475x\" (UID: \"4c0b831d-41fe-4470-964b-e4d28117b02b\") " pod="calico-apiserver/calico-apiserver-6d964bcddb-t475x" Sep 9 04:59:48.216563 kubelet[2665]: I0909 04:59:48.216414 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a-whisker-ca-bundle\") pod \"whisker-6c6d659cc-r5jzx\" (UID: \"b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a\") " pod="calico-system/whisker-6c6d659cc-r5jzx" Sep 9 04:59:48.216563 kubelet[2665]: I0909 04:59:48.216431 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fdft\" (UniqueName: \"kubernetes.io/projected/4a205d0f-8b34-4ade-831f-621b0059d51f-kube-api-access-2fdft\") pod \"coredns-7c65d6cfc9-6nkl6\" (UID: \"4a205d0f-8b34-4ade-831f-621b0059d51f\") " pod="kube-system/coredns-7c65d6cfc9-6nkl6" Sep 9 04:59:48.216563 kubelet[2665]: I0909 04:59:48.216449 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4c0b831d-41fe-4470-964b-e4d28117b02b-calico-apiserver-certs\") pod \"calico-apiserver-6d964bcddb-t475x\" (UID: \"4c0b831d-41fe-4470-964b-e4d28117b02b\") " pod="calico-apiserver/calico-apiserver-6d964bcddb-t475x" Sep 9 04:59:48.216687 kubelet[2665]: I0909 04:59:48.216466 2665 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb4a83d-5f9f-41ce-8042-1eac984fa7d5-goldmane-ca-bundle\") pod \"goldmane-7988f88666-mrj5t\" (UID: \"ebb4a83d-5f9f-41ce-8042-1eac984fa7d5\") " pod="calico-system/goldmane-7988f88666-mrj5t" Sep 9 04:59:48.216687 kubelet[2665]: I0909 04:59:48.216504 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/ebb4a83d-5f9f-41ce-8042-1eac984fa7d5-goldmane-key-pair\") pod \"goldmane-7988f88666-mrj5t\" (UID: \"ebb4a83d-5f9f-41ce-8042-1eac984fa7d5\") " pod="calico-system/goldmane-7988f88666-mrj5t" Sep 9 04:59:48.216687 kubelet[2665]: I0909 04:59:48.216523 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1-calico-apiserver-certs\") pod \"calico-apiserver-6d964bcddb-nf8n5\" (UID: \"0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1\") " pod="calico-apiserver/calico-apiserver-6d964bcddb-nf8n5" Sep 9 04:59:48.216687 kubelet[2665]: I0909 04:59:48.216539 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a205d0f-8b34-4ade-831f-621b0059d51f-config-volume\") pod \"coredns-7c65d6cfc9-6nkl6\" (UID: \"4a205d0f-8b34-4ade-831f-621b0059d51f\") " pod="kube-system/coredns-7c65d6cfc9-6nkl6" Sep 9 04:59:48.216687 kubelet[2665]: I0909 04:59:48.216555 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqpv7\" (UniqueName: \"kubernetes.io/projected/0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1-kube-api-access-xqpv7\") pod \"calico-apiserver-6d964bcddb-nf8n5\" (UID: \"0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1\") " pod="calico-apiserver/calico-apiserver-6d964bcddb-nf8n5" Sep 9 
04:59:48.458810 containerd[1534]: time="2025-09-09T04:59:48.458695292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-m56jh,Uid:a78e19f7-e135-4e75-9a5d-fc71b1732548,Namespace:kube-system,Attempt:0,}" Sep 9 04:59:48.462505 containerd[1534]: time="2025-09-09T04:59:48.462298653Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f86fdbd-xnjnq,Uid:33b022e7-d521-48f6-96bd-a9569f5800c6,Namespace:calico-system,Attempt:0,}" Sep 9 04:59:48.469613 containerd[1534]: time="2025-09-09T04:59:48.469580018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-6nkl6,Uid:4a205d0f-8b34-4ade-831f-621b0059d51f,Namespace:kube-system,Attempt:0,}" Sep 9 04:59:48.477294 containerd[1534]: time="2025-09-09T04:59:48.477186199Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-mrj5t,Uid:ebb4a83d-5f9f-41ce-8042-1eac984fa7d5,Namespace:calico-system,Attempt:0,}" Sep 9 04:59:48.485386 containerd[1534]: time="2025-09-09T04:59:48.485345363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d964bcddb-t475x,Uid:4c0b831d-41fe-4470-964b-e4d28117b02b,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:59:48.490687 containerd[1534]: time="2025-09-09T04:59:48.490498874Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d964bcddb-nf8n5,Uid:0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:59:48.495628 containerd[1534]: time="2025-09-09T04:59:48.495592261Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c6d659cc-r5jzx,Uid:b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a,Namespace:calico-system,Attempt:0,}" Sep 9 04:59:48.532935 systemd[1]: Created slice kubepods-besteffort-podff1d9b5f_1e61_411f_bd7e_fee7b5332631.slice - libcontainer container kubepods-besteffort-podff1d9b5f_1e61_411f_bd7e_fee7b5332631.slice. 
Sep 9 04:59:48.537412 containerd[1534]: time="2025-09-09T04:59:48.537360529Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kgwfz,Uid:ff1d9b5f-1e61-411f-bd7e-fee7b5332631,Namespace:calico-system,Attempt:0,}" Sep 9 04:59:48.602173 containerd[1534]: time="2025-09-09T04:59:48.602115784Z" level=error msg="Failed to destroy network for sandbox \"3b55f02376c8afdb3009f074615d4c0d367c48d37688ac007e5c8dab0d0a7ae8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.604140 containerd[1534]: time="2025-09-09T04:59:48.604071512Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d964bcddb-nf8n5,Uid:0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b55f02376c8afdb3009f074615d4c0d367c48d37688ac007e5c8dab0d0a7ae8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.604411 kubelet[2665]: E0909 04:59:48.604277 2665 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b55f02376c8afdb3009f074615d4c0d367c48d37688ac007e5c8dab0d0a7ae8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.606198 kubelet[2665]: E0909 04:59:48.606149 2665 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b55f02376c8afdb3009f074615d4c0d367c48d37688ac007e5c8dab0d0a7ae8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6d964bcddb-nf8n5" Sep 9 04:59:48.606302 kubelet[2665]: E0909 04:59:48.606203 2665 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b55f02376c8afdb3009f074615d4c0d367c48d37688ac007e5c8dab0d0a7ae8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6d964bcddb-nf8n5" Sep 9 04:59:48.606302 kubelet[2665]: E0909 04:59:48.606254 2665 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6d964bcddb-nf8n5_calico-apiserver(0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6d964bcddb-nf8n5_calico-apiserver(0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3b55f02376c8afdb3009f074615d4c0d367c48d37688ac007e5c8dab0d0a7ae8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6d964bcddb-nf8n5" podUID="0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1" Sep 9 04:59:48.612949 containerd[1534]: time="2025-09-09T04:59:48.612866825Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 04:59:48.624026 containerd[1534]: time="2025-09-09T04:59:48.623886158Z" level=error msg="Failed to destroy network for sandbox \"c685b5922df32051a3429abfcf51d5806280a1b6e322f25f511b9111a93823fc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Sep 9 04:59:48.625528 containerd[1534]: time="2025-09-09T04:59:48.625427827Z" level=error msg="Failed to destroy network for sandbox \"56ca1318b7935b83bd874dc82bbf0e8cf9e7a8927dfb3cc3ba5e6f145d7f7ad3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.626045 containerd[1534]: time="2025-09-09T04:59:48.626010053Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d964bcddb-t475x,Uid:4c0b831d-41fe-4470-964b-e4d28117b02b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c685b5922df32051a3429abfcf51d5806280a1b6e322f25f511b9111a93823fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.626586 kubelet[2665]: E0909 04:59:48.626537 2665 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c685b5922df32051a3429abfcf51d5806280a1b6e322f25f511b9111a93823fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.626688 kubelet[2665]: E0909 04:59:48.626608 2665 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c685b5922df32051a3429abfcf51d5806280a1b6e322f25f511b9111a93823fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6d964bcddb-t475x" Sep 9 04:59:48.626688 
kubelet[2665]: E0909 04:59:48.626627 2665 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c685b5922df32051a3429abfcf51d5806280a1b6e322f25f511b9111a93823fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6d964bcddb-t475x" Sep 9 04:59:48.626688 kubelet[2665]: E0909 04:59:48.626671 2665 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6d964bcddb-t475x_calico-apiserver(4c0b831d-41fe-4470-964b-e4d28117b02b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6d964bcddb-t475x_calico-apiserver(4c0b831d-41fe-4470-964b-e4d28117b02b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c685b5922df32051a3429abfcf51d5806280a1b6e322f25f511b9111a93823fc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6d964bcddb-t475x" podUID="4c0b831d-41fe-4470-964b-e4d28117b02b" Sep 9 04:59:48.628153 containerd[1534]: time="2025-09-09T04:59:48.628102586Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-m56jh,Uid:a78e19f7-e135-4e75-9a5d-fc71b1732548,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"56ca1318b7935b83bd874dc82bbf0e8cf9e7a8927dfb3cc3ba5e6f145d7f7ad3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.628362 kubelet[2665]: E0909 04:59:48.628313 2665 log.go:32] "RunPodSandbox from runtime 
service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56ca1318b7935b83bd874dc82bbf0e8cf9e7a8927dfb3cc3ba5e6f145d7f7ad3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.628528 kubelet[2665]: E0909 04:59:48.628373 2665 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56ca1318b7935b83bd874dc82bbf0e8cf9e7a8927dfb3cc3ba5e6f145d7f7ad3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-m56jh" Sep 9 04:59:48.628528 kubelet[2665]: E0909 04:59:48.628392 2665 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56ca1318b7935b83bd874dc82bbf0e8cf9e7a8927dfb3cc3ba5e6f145d7f7ad3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-m56jh" Sep 9 04:59:48.628528 kubelet[2665]: E0909 04:59:48.628430 2665 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-m56jh_kube-system(a78e19f7-e135-4e75-9a5d-fc71b1732548)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-m56jh_kube-system(a78e19f7-e135-4e75-9a5d-fc71b1732548)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"56ca1318b7935b83bd874dc82bbf0e8cf9e7a8927dfb3cc3ba5e6f145d7f7ad3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-m56jh" podUID="a78e19f7-e135-4e75-9a5d-fc71b1732548" Sep 9 04:59:48.631418 containerd[1534]: time="2025-09-09T04:59:48.631096960Z" level=error msg="Failed to destroy network for sandbox \"4e1eab02a2e1e0c3d8955fa041a3a3393ddbef3b385a12b00a8dd912decfbfe3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.632367 containerd[1534]: time="2025-09-09T04:59:48.632296054Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-mrj5t,Uid:ebb4a83d-5f9f-41ce-8042-1eac984fa7d5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e1eab02a2e1e0c3d8955fa041a3a3393ddbef3b385a12b00a8dd912decfbfe3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.632744 kubelet[2665]: E0909 04:59:48.632698 2665 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e1eab02a2e1e0c3d8955fa041a3a3393ddbef3b385a12b00a8dd912decfbfe3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.632821 kubelet[2665]: E0909 04:59:48.632761 2665 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e1eab02a2e1e0c3d8955fa041a3a3393ddbef3b385a12b00a8dd912decfbfe3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-7988f88666-mrj5t" Sep 9 04:59:48.632821 kubelet[2665]: E0909 04:59:48.632779 2665 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e1eab02a2e1e0c3d8955fa041a3a3393ddbef3b385a12b00a8dd912decfbfe3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-mrj5t" Sep 9 04:59:48.633553 kubelet[2665]: E0909 04:59:48.632820 2665 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-mrj5t_calico-system(ebb4a83d-5f9f-41ce-8042-1eac984fa7d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-mrj5t_calico-system(ebb4a83d-5f9f-41ce-8042-1eac984fa7d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4e1eab02a2e1e0c3d8955fa041a3a3393ddbef3b385a12b00a8dd912decfbfe3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-mrj5t" podUID="ebb4a83d-5f9f-41ce-8042-1eac984fa7d5" Sep 9 04:59:48.640964 containerd[1534]: time="2025-09-09T04:59:48.640912719Z" level=error msg="Failed to destroy network for sandbox \"62e952eda9da2031716233e93c2e8452ecb6792349bfd81553c2b19d223ed196\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.642071 containerd[1534]: time="2025-09-09T04:59:48.642028889Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-6nkl6,Uid:4a205d0f-8b34-4ade-831f-621b0059d51f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"62e952eda9da2031716233e93c2e8452ecb6792349bfd81553c2b19d223ed196\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.642568 kubelet[2665]: E0909 04:59:48.642229 2665 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62e952eda9da2031716233e93c2e8452ecb6792349bfd81553c2b19d223ed196\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.642568 kubelet[2665]: E0909 04:59:48.642279 2665 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62e952eda9da2031716233e93c2e8452ecb6792349bfd81553c2b19d223ed196\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-6nkl6" Sep 9 04:59:48.642568 kubelet[2665]: E0909 04:59:48.642298 2665 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62e952eda9da2031716233e93c2e8452ecb6792349bfd81553c2b19d223ed196\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-6nkl6" Sep 9 04:59:48.642740 kubelet[2665]: E0909 04:59:48.642348 2665 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-6nkl6_kube-system(4a205d0f-8b34-4ade-831f-621b0059d51f)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-6nkl6_kube-system(4a205d0f-8b34-4ade-831f-621b0059d51f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"62e952eda9da2031716233e93c2e8452ecb6792349bfd81553c2b19d223ed196\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-6nkl6" podUID="4a205d0f-8b34-4ade-831f-621b0059d51f" Sep 9 04:59:48.649473 containerd[1534]: time="2025-09-09T04:59:48.649416739Z" level=error msg="Failed to destroy network for sandbox \"9eff64ae6be29b33f9ab02972feac41afb26a7ae41a3cb3d7bbfd183d97c23e6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.651360 containerd[1534]: time="2025-09-09T04:59:48.651305224Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f86fdbd-xnjnq,Uid:33b022e7-d521-48f6-96bd-a9569f5800c6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9eff64ae6be29b33f9ab02972feac41afb26a7ae41a3cb3d7bbfd183d97c23e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.651594 kubelet[2665]: E0909 04:59:48.651547 2665 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9eff64ae6be29b33f9ab02972feac41afb26a7ae41a3cb3d7bbfd183d97c23e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.651681 kubelet[2665]: E0909 
04:59:48.651612 2665 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9eff64ae6be29b33f9ab02972feac41afb26a7ae41a3cb3d7bbfd183d97c23e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5f86fdbd-xnjnq" Sep 9 04:59:48.651681 kubelet[2665]: E0909 04:59:48.651635 2665 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9eff64ae6be29b33f9ab02972feac41afb26a7ae41a3cb3d7bbfd183d97c23e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5f86fdbd-xnjnq" Sep 9 04:59:48.651759 kubelet[2665]: E0909 04:59:48.651679 2665 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5f86fdbd-xnjnq_calico-system(33b022e7-d521-48f6-96bd-a9569f5800c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5f86fdbd-xnjnq_calico-system(33b022e7-d521-48f6-96bd-a9569f5800c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9eff64ae6be29b33f9ab02972feac41afb26a7ae41a3cb3d7bbfd183d97c23e6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5f86fdbd-xnjnq" podUID="33b022e7-d521-48f6-96bd-a9569f5800c6" Sep 9 04:59:48.653364 containerd[1534]: time="2025-09-09T04:59:48.653316274Z" level=error msg="Failed to destroy network for sandbox 
\"8801ee2f4a6c41dd1b6c5debcd2e0fbed9720d5df4e579d5cda2d70737da4869\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.654807 containerd[1534]: time="2025-09-09T04:59:48.654751818Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c6d659cc-r5jzx,Uid:b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8801ee2f4a6c41dd1b6c5debcd2e0fbed9720d5df4e579d5cda2d70737da4869\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.655357 kubelet[2665]: E0909 04:59:48.655302 2665 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8801ee2f4a6c41dd1b6c5debcd2e0fbed9720d5df4e579d5cda2d70737da4869\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.655427 kubelet[2665]: E0909 04:59:48.655372 2665 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8801ee2f4a6c41dd1b6c5debcd2e0fbed9720d5df4e579d5cda2d70737da4869\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6c6d659cc-r5jzx" Sep 9 04:59:48.655427 kubelet[2665]: E0909 04:59:48.655395 2665 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"8801ee2f4a6c41dd1b6c5debcd2e0fbed9720d5df4e579d5cda2d70737da4869\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6c6d659cc-r5jzx" Sep 9 04:59:48.655527 kubelet[2665]: E0909 04:59:48.655431 2665 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6c6d659cc-r5jzx_calico-system(b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6c6d659cc-r5jzx_calico-system(b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8801ee2f4a6c41dd1b6c5debcd2e0fbed9720d5df4e579d5cda2d70737da4869\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6c6d659cc-r5jzx" podUID="b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a" Sep 9 04:59:48.663948 containerd[1534]: time="2025-09-09T04:59:48.663899667Z" level=error msg="Failed to destroy network for sandbox \"e8bad74478e418c0beb724d16ab99e3d2eeb24f7658487465bfc8bf7a70c947b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.664916 containerd[1534]: time="2025-09-09T04:59:48.664882551Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kgwfz,Uid:ff1d9b5f-1e61-411f-bd7e-fee7b5332631,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e8bad74478e418c0beb724d16ab99e3d2eeb24f7658487465bfc8bf7a70c947b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.665165 kubelet[2665]: E0909 04:59:48.665114 2665 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e8bad74478e418c0beb724d16ab99e3d2eeb24f7658487465bfc8bf7a70c947b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:59:48.665231 kubelet[2665]: E0909 04:59:48.665172 2665 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e8bad74478e418c0beb724d16ab99e3d2eeb24f7658487465bfc8bf7a70c947b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kgwfz" Sep 9 04:59:48.665231 kubelet[2665]: E0909 04:59:48.665191 2665 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e8bad74478e418c0beb724d16ab99e3d2eeb24f7658487465bfc8bf7a70c947b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kgwfz" Sep 9 04:59:48.665292 kubelet[2665]: E0909 04:59:48.665235 2665 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kgwfz_calico-system(ff1d9b5f-1e61-411f-bd7e-fee7b5332631)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kgwfz_calico-system(ff1d9b5f-1e61-411f-bd7e-fee7b5332631)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e8bad74478e418c0beb724d16ab99e3d2eeb24f7658487465bfc8bf7a70c947b\\\": 
plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kgwfz" podUID="ff1d9b5f-1e61-411f-bd7e-fee7b5332631" Sep 9 04:59:49.399475 systemd[1]: run-netns-cni\x2d3dc21a50\x2d64dc\x2de1c9\x2da63b\x2d64c56e2d7088.mount: Deactivated successfully. Sep 9 04:59:49.399578 systemd[1]: run-netns-cni\x2d8b9db245\x2dcef2\x2d5e27\x2d049f\x2daa1bb10a8b1b.mount: Deactivated successfully. Sep 9 04:59:49.399629 systemd[1]: run-netns-cni\x2d43d24e0a\x2de6f1\x2d77a6\x2d86cf\x2d3019ee4c4e09.mount: Deactivated successfully. Sep 9 04:59:49.399670 systemd[1]: run-netns-cni\x2d46e6f575\x2d6baa\x2dd5bc\x2d7381\x2d3cc272c5a8cb.mount: Deactivated successfully. Sep 9 04:59:51.453296 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3883741720.mount: Deactivated successfully. Sep 9 04:59:51.727452 containerd[1534]: time="2025-09-09T04:59:51.727307156Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:51.728330 containerd[1534]: time="2025-09-09T04:59:51.728138707Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 9 04:59:51.729213 containerd[1534]: time="2025-09-09T04:59:51.729180345Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:51.731267 containerd[1534]: time="2025-09-09T04:59:51.731207740Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:51.731962 containerd[1534]: time="2025-09-09T04:59:51.731924686Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 3.119001099s" Sep 9 04:59:51.732074 containerd[1534]: time="2025-09-09T04:59:51.732056531Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 9 04:59:51.748911 containerd[1534]: time="2025-09-09T04:59:51.748861750Z" level=info msg="CreateContainer within sandbox \"ba3e7f25ad6d3a3b742adae2e74449b027b15007f419a910e966d39bfe88bebb\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 04:59:51.757117 containerd[1534]: time="2025-09-09T04:59:51.756569434Z" level=info msg="Container 49334378f04ee527e9489c73e02e299bd58bd59b1ce3e8d7dc79713673dc7f96: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:59:51.774131 containerd[1534]: time="2025-09-09T04:59:51.774076439Z" level=info msg="CreateContainer within sandbox \"ba3e7f25ad6d3a3b742adae2e74449b027b15007f419a910e966d39bfe88bebb\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"49334378f04ee527e9489c73e02e299bd58bd59b1ce3e8d7dc79713673dc7f96\"" Sep 9 04:59:51.774632 containerd[1534]: time="2025-09-09T04:59:51.774610259Z" level=info msg="StartContainer for \"49334378f04ee527e9489c73e02e299bd58bd59b1ce3e8d7dc79713673dc7f96\"" Sep 9 04:59:51.776564 containerd[1534]: time="2025-09-09T04:59:51.776537090Z" level=info msg="connecting to shim 49334378f04ee527e9489c73e02e299bd58bd59b1ce3e8d7dc79713673dc7f96" address="unix:///run/containerd/s/b848f5f2a9c59ee4b9eb3c54cc9757a7411a7d985550db872103fad5b9792d1b" protocol=ttrpc version=3 Sep 9 04:59:51.805672 systemd[1]: Started 
cri-containerd-49334378f04ee527e9489c73e02e299bd58bd59b1ce3e8d7dc79713673dc7f96.scope - libcontainer container 49334378f04ee527e9489c73e02e299bd58bd59b1ce3e8d7dc79713673dc7f96. Sep 9 04:59:51.848103 containerd[1534]: time="2025-09-09T04:59:51.848063405Z" level=info msg="StartContainer for \"49334378f04ee527e9489c73e02e299bd58bd59b1ce3e8d7dc79713673dc7f96\" returns successfully" Sep 9 04:59:51.964511 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 04:59:51.964608 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 9 04:59:52.143792 kubelet[2665]: I0909 04:59:52.143185 2665 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a-whisker-backend-key-pair\") pod \"b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a\" (UID: \"b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a\") " Sep 9 04:59:52.143792 kubelet[2665]: I0909 04:59:52.143240 2665 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a-whisker-ca-bundle\") pod \"b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a\" (UID: \"b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a\") " Sep 9 04:59:52.143792 kubelet[2665]: I0909 04:59:52.143265 2665 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwztg\" (UniqueName: \"kubernetes.io/projected/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a-kube-api-access-kwztg\") pod \"b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a\" (UID: \"b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a\") " Sep 9 04:59:52.144200 kubelet[2665]: I0909 04:59:52.144155 2665 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a" (UID: 
"b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 9 04:59:52.146212 kubelet[2665]: I0909 04:59:52.146185 2665 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a-kube-api-access-kwztg" (OuterVolumeSpecName: "kube-api-access-kwztg") pod "b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a" (UID: "b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a"). InnerVolumeSpecName "kube-api-access-kwztg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 9 04:59:52.152875 kubelet[2665]: I0909 04:59:52.152829 2665 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a" (UID: "b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 9 04:59:52.244732 kubelet[2665]: I0909 04:59:52.244696 2665 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwztg\" (UniqueName: \"kubernetes.io/projected/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a-kube-api-access-kwztg\") on node \"localhost\" DevicePath \"\"" Sep 9 04:59:52.244732 kubelet[2665]: I0909 04:59:52.244724 2665 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 9 04:59:52.244732 kubelet[2665]: I0909 04:59:52.244734 2665 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 9 04:59:52.454230 systemd[1]: var-lib-kubelet-pods-b1b0c4cf\x2d4acb\x2d4ff4\x2daa58\x2d79ffc93c562a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dkwztg.mount: Deactivated successfully. Sep 9 04:59:52.454330 systemd[1]: var-lib-kubelet-pods-b1b0c4cf\x2d4acb\x2d4ff4\x2daa58\x2d79ffc93c562a-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 04:59:52.526557 systemd[1]: Removed slice kubepods-besteffort-podb1b0c4cf_4acb_4ff4_aa58_79ffc93c562a.slice - libcontainer container kubepods-besteffort-podb1b0c4cf_4acb_4ff4_aa58_79ffc93c562a.slice. 
Sep 9 04:59:52.672766 kubelet[2665]: I0909 04:59:52.672652 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-hllcp" podStartSLOduration=1.510629744 podStartE2EDuration="12.672635158s" podCreationTimestamp="2025-09-09 04:59:40 +0000 UTC" firstStartedPulling="2025-09-09 04:59:40.570780984 +0000 UTC m=+20.150849117" lastFinishedPulling="2025-09-09 04:59:51.732786398 +0000 UTC m=+31.312854531" observedRunningTime="2025-09-09 04:59:52.672161262 +0000 UTC m=+32.252229395" watchObservedRunningTime="2025-09-09 04:59:52.672635158 +0000 UTC m=+32.252703251" Sep 9 04:59:52.724948 systemd[1]: Created slice kubepods-besteffort-pod61bd929d_42bf_496c_8f27_efa09484685a.slice - libcontainer container kubepods-besteffort-pod61bd929d_42bf_496c_8f27_efa09484685a.slice. Sep 9 04:59:52.747052 kubelet[2665]: I0909 04:59:52.746877 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/61bd929d-42bf-496c-8f27-efa09484685a-whisker-backend-key-pair\") pod \"whisker-79689dc87b-x5sj6\" (UID: \"61bd929d-42bf-496c-8f27-efa09484685a\") " pod="calico-system/whisker-79689dc87b-x5sj6" Sep 9 04:59:52.747052 kubelet[2665]: I0909 04:59:52.746924 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61bd929d-42bf-496c-8f27-efa09484685a-whisker-ca-bundle\") pod \"whisker-79689dc87b-x5sj6\" (UID: \"61bd929d-42bf-496c-8f27-efa09484685a\") " pod="calico-system/whisker-79689dc87b-x5sj6" Sep 9 04:59:52.747052 kubelet[2665]: I0909 04:59:52.746954 2665 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxngd\" (UniqueName: \"kubernetes.io/projected/61bd929d-42bf-496c-8f27-efa09484685a-kube-api-access-fxngd\") pod \"whisker-79689dc87b-x5sj6\" (UID: 
\"61bd929d-42bf-496c-8f27-efa09484685a\") " pod="calico-system/whisker-79689dc87b-x5sj6" Sep 9 04:59:52.794189 containerd[1534]: time="2025-09-09T04:59:52.794151275Z" level=info msg="TaskExit event in podsandbox handler container_id:\"49334378f04ee527e9489c73e02e299bd58bd59b1ce3e8d7dc79713673dc7f96\" id:\"cee374e2f6864b0a5ff07866c85423e2d4d282afcc71f9eb46e4224033221736\" pid:3795 exit_status:1 exited_at:{seconds:1757393992 nanos:793365768}" Sep 9 04:59:53.030476 containerd[1534]: time="2025-09-09T04:59:53.030441141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79689dc87b-x5sj6,Uid:61bd929d-42bf-496c-8f27-efa09484685a,Namespace:calico-system,Attempt:0,}" Sep 9 04:59:53.207258 systemd-networkd[1452]: cali990837b6278: Link UP Sep 9 04:59:53.207803 systemd-networkd[1452]: cali990837b6278: Gained carrier Sep 9 04:59:53.224021 containerd[1534]: 2025-09-09 04:59:53.054 [INFO][3812] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:59:53.224021 containerd[1534]: 2025-09-09 04:59:53.090 [INFO][3812] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--79689dc87b--x5sj6-eth0 whisker-79689dc87b- calico-system 61bd929d-42bf-496c-8f27-efa09484685a 889 0 2025-09-09 04:59:52 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:79689dc87b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-79689dc87b-x5sj6 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali990837b6278 [] [] }} ContainerID="27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" Namespace="calico-system" Pod="whisker-79689dc87b-x5sj6" WorkloadEndpoint="localhost-k8s-whisker--79689dc87b--x5sj6-" Sep 9 04:59:53.224021 containerd[1534]: 2025-09-09 04:59:53.090 [INFO][3812] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" Namespace="calico-system" Pod="whisker-79689dc87b-x5sj6" WorkloadEndpoint="localhost-k8s-whisker--79689dc87b--x5sj6-eth0" Sep 9 04:59:53.224021 containerd[1534]: 2025-09-09 04:59:53.159 [INFO][3825] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" HandleID="k8s-pod-network.27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" Workload="localhost-k8s-whisker--79689dc87b--x5sj6-eth0" Sep 9 04:59:53.224262 containerd[1534]: 2025-09-09 04:59:53.159 [INFO][3825] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" HandleID="k8s-pod-network.27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" Workload="localhost-k8s-whisker--79689dc87b--x5sj6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c250), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-79689dc87b-x5sj6", "timestamp":"2025-09-09 04:59:53.159308673 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:59:53.224262 containerd[1534]: 2025-09-09 04:59:53.159 [INFO][3825] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:59:53.224262 containerd[1534]: 2025-09-09 04:59:53.159 [INFO][3825] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:59:53.224262 containerd[1534]: 2025-09-09 04:59:53.159 [INFO][3825] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:59:53.224262 containerd[1534]: 2025-09-09 04:59:53.170 [INFO][3825] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" host="localhost" Sep 9 04:59:53.224262 containerd[1534]: 2025-09-09 04:59:53.176 [INFO][3825] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:59:53.224262 containerd[1534]: 2025-09-09 04:59:53.181 [INFO][3825] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:59:53.224262 containerd[1534]: 2025-09-09 04:59:53.183 [INFO][3825] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:59:53.224262 containerd[1534]: 2025-09-09 04:59:53.185 [INFO][3825] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:59:53.224262 containerd[1534]: 2025-09-09 04:59:53.185 [INFO][3825] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" host="localhost" Sep 9 04:59:53.224464 containerd[1534]: 2025-09-09 04:59:53.187 [INFO][3825] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca Sep 9 04:59:53.224464 containerd[1534]: 2025-09-09 04:59:53.192 [INFO][3825] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" host="localhost" Sep 9 04:59:53.224464 containerd[1534]: 2025-09-09 04:59:53.197 [INFO][3825] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" host="localhost" Sep 9 04:59:53.224464 containerd[1534]: 2025-09-09 04:59:53.197 [INFO][3825] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" host="localhost" Sep 9 04:59:53.224464 containerd[1534]: 2025-09-09 04:59:53.197 [INFO][3825] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:59:53.224464 containerd[1534]: 2025-09-09 04:59:53.197 [INFO][3825] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" HandleID="k8s-pod-network.27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" Workload="localhost-k8s-whisker--79689dc87b--x5sj6-eth0" Sep 9 04:59:53.224682 containerd[1534]: 2025-09-09 04:59:53.200 [INFO][3812] cni-plugin/k8s.go 418: Populated endpoint ContainerID="27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" Namespace="calico-system" Pod="whisker-79689dc87b-x5sj6" WorkloadEndpoint="localhost-k8s-whisker--79689dc87b--x5sj6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--79689dc87b--x5sj6-eth0", GenerateName:"whisker-79689dc87b-", Namespace:"calico-system", SelfLink:"", UID:"61bd929d-42bf-496c-8f27-efa09484685a", ResourceVersion:"889", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79689dc87b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-79689dc87b-x5sj6", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali990837b6278", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:59:53.224682 containerd[1534]: 2025-09-09 04:59:53.200 [INFO][3812] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" Namespace="calico-system" Pod="whisker-79689dc87b-x5sj6" WorkloadEndpoint="localhost-k8s-whisker--79689dc87b--x5sj6-eth0" Sep 9 04:59:53.224762 containerd[1534]: 2025-09-09 04:59:53.200 [INFO][3812] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali990837b6278 ContainerID="27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" Namespace="calico-system" Pod="whisker-79689dc87b-x5sj6" WorkloadEndpoint="localhost-k8s-whisker--79689dc87b--x5sj6-eth0" Sep 9 04:59:53.224762 containerd[1534]: 2025-09-09 04:59:53.209 [INFO][3812] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" Namespace="calico-system" Pod="whisker-79689dc87b-x5sj6" WorkloadEndpoint="localhost-k8s-whisker--79689dc87b--x5sj6-eth0" Sep 9 04:59:53.224801 containerd[1534]: 2025-09-09 04:59:53.210 [INFO][3812] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" Namespace="calico-system" Pod="whisker-79689dc87b-x5sj6" 
WorkloadEndpoint="localhost-k8s-whisker--79689dc87b--x5sj6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--79689dc87b--x5sj6-eth0", GenerateName:"whisker-79689dc87b-", Namespace:"calico-system", SelfLink:"", UID:"61bd929d-42bf-496c-8f27-efa09484685a", ResourceVersion:"889", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79689dc87b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca", Pod:"whisker-79689dc87b-x5sj6", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali990837b6278", MAC:"86:a7:c9:e4:d3:4b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:59:53.224853 containerd[1534]: 2025-09-09 04:59:53.221 [INFO][3812] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" Namespace="calico-system" Pod="whisker-79689dc87b-x5sj6" WorkloadEndpoint="localhost-k8s-whisker--79689dc87b--x5sj6-eth0" Sep 9 04:59:53.287435 containerd[1534]: time="2025-09-09T04:59:53.287144655Z" level=info msg="connecting to shim 
27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca" address="unix:///run/containerd/s/40a93d8c0fcfe42ebb6cc751a4fe20e8c3fac2e660c0b4b795a3b785cbc00c0c" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:59:53.344733 systemd[1]: Started cri-containerd-27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca.scope - libcontainer container 27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca. Sep 9 04:59:53.365048 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:59:53.457188 containerd[1534]: time="2025-09-09T04:59:53.454646279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79689dc87b-x5sj6,Uid:61bd929d-42bf-496c-8f27-efa09484685a,Namespace:calico-system,Attempt:0,} returns sandbox id \"27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca\"" Sep 9 04:59:53.457708 containerd[1534]: time="2025-09-09T04:59:53.457674307Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 04:59:53.683402 systemd-networkd[1452]: vxlan.calico: Link UP Sep 9 04:59:53.683412 systemd-networkd[1452]: vxlan.calico: Gained carrier Sep 9 04:59:53.752226 containerd[1534]: time="2025-09-09T04:59:53.752154066Z" level=info msg="TaskExit event in podsandbox handler container_id:\"49334378f04ee527e9489c73e02e299bd58bd59b1ce3e8d7dc79713673dc7f96\" id:\"82e91362c9cfe439cf7b0933a981f78b1d1cc89e21f1420317239d3230dc1e08\" pid:4049 exit_status:1 exited_at:{seconds:1757393993 nanos:751860536}" Sep 9 04:59:54.338911 containerd[1534]: time="2025-09-09T04:59:54.338844842Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:54.338911 containerd[1534]: time="2025-09-09T04:59:54.339217413Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 9 04:59:54.340157 containerd[1534]: 
time="2025-09-09T04:59:54.340120200Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:54.348108 containerd[1534]: time="2025-09-09T04:59:54.348063999Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:54.348767 containerd[1534]: time="2025-09-09T04:59:54.348738499Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 890.743461ms" Sep 9 04:59:54.348834 containerd[1534]: time="2025-09-09T04:59:54.348771900Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 9 04:59:54.352349 containerd[1534]: time="2025-09-09T04:59:54.352303846Z" level=info msg="CreateContainer within sandbox \"27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 04:59:54.363729 containerd[1534]: time="2025-09-09T04:59:54.363682549Z" level=info msg="Container ec9caa1df05d0d9e3980b839727b688c6cb50c17dfa293fd9c88b5f274d8f381: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:59:54.378968 containerd[1534]: time="2025-09-09T04:59:54.378930487Z" level=info msg="CreateContainer within sandbox \"27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"ec9caa1df05d0d9e3980b839727b688c6cb50c17dfa293fd9c88b5f274d8f381\"" Sep 9 
04:59:54.381025 containerd[1534]: time="2025-09-09T04:59:54.380986789Z" level=info msg="StartContainer for \"ec9caa1df05d0d9e3980b839727b688c6cb50c17dfa293fd9c88b5f274d8f381\"" Sep 9 04:59:54.382192 containerd[1534]: time="2025-09-09T04:59:54.382165985Z" level=info msg="connecting to shim ec9caa1df05d0d9e3980b839727b688c6cb50c17dfa293fd9c88b5f274d8f381" address="unix:///run/containerd/s/40a93d8c0fcfe42ebb6cc751a4fe20e8c3fac2e660c0b4b795a3b785cbc00c0c" protocol=ttrpc version=3 Sep 9 04:59:54.405651 systemd[1]: Started cri-containerd-ec9caa1df05d0d9e3980b839727b688c6cb50c17dfa293fd9c88b5f274d8f381.scope - libcontainer container ec9caa1df05d0d9e3980b839727b688c6cb50c17dfa293fd9c88b5f274d8f381. Sep 9 04:59:54.447533 containerd[1534]: time="2025-09-09T04:59:54.447317145Z" level=info msg="StartContainer for \"ec9caa1df05d0d9e3980b839727b688c6cb50c17dfa293fd9c88b5f274d8f381\" returns successfully" Sep 9 04:59:54.448431 containerd[1534]: time="2025-09-09T04:59:54.448395617Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 04:59:54.510619 systemd-networkd[1452]: cali990837b6278: Gained IPv6LL Sep 9 04:59:54.520973 kubelet[2665]: I0909 04:59:54.520930 2665 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a" path="/var/lib/kubelet/pods/b1b0c4cf-4acb-4ff4-aa58-79ffc93c562a/volumes" Sep 9 04:59:55.214668 systemd-networkd[1452]: vxlan.calico: Gained IPv6LL Sep 9 04:59:55.674985 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1460476218.mount: Deactivated successfully. 
Sep 9 04:59:55.781547 containerd[1534]: time="2025-09-09T04:59:55.781500827Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:55.782499 containerd[1534]: time="2025-09-09T04:59:55.782117165Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 9 04:59:55.783124 containerd[1534]: time="2025-09-09T04:59:55.783077074Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:55.785575 containerd[1534]: time="2025-09-09T04:59:55.785510185Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:59:55.786097 containerd[1534]: time="2025-09-09T04:59:55.786041560Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.337612862s" Sep 9 04:59:55.786097 containerd[1534]: time="2025-09-09T04:59:55.786093162Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 9 04:59:55.789515 containerd[1534]: time="2025-09-09T04:59:55.788960566Z" level=info msg="CreateContainer within sandbox \"27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 04:59:55.797667 
containerd[1534]: time="2025-09-09T04:59:55.797612659Z" level=info msg="Container 8b0791c78b1dbf140cacf70ec96ae091790c3950baabfa0526d7a9cd32313565: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:59:55.808093 containerd[1534]: time="2025-09-09T04:59:55.808037043Z" level=info msg="CreateContainer within sandbox \"27a7d976be80e799ddf6bcfacd3f2cd3a8f294cca352a9ec443b0f22b01f80ca\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"8b0791c78b1dbf140cacf70ec96ae091790c3950baabfa0526d7a9cd32313565\"" Sep 9 04:59:55.808672 containerd[1534]: time="2025-09-09T04:59:55.808631421Z" level=info msg="StartContainer for \"8b0791c78b1dbf140cacf70ec96ae091790c3950baabfa0526d7a9cd32313565\"" Sep 9 04:59:55.810867 containerd[1534]: time="2025-09-09T04:59:55.810830245Z" level=info msg="connecting to shim 8b0791c78b1dbf140cacf70ec96ae091790c3950baabfa0526d7a9cd32313565" address="unix:///run/containerd/s/40a93d8c0fcfe42ebb6cc751a4fe20e8c3fac2e660c0b4b795a3b785cbc00c0c" protocol=ttrpc version=3 Sep 9 04:59:55.839684 systemd[1]: Started cri-containerd-8b0791c78b1dbf140cacf70ec96ae091790c3950baabfa0526d7a9cd32313565.scope - libcontainer container 8b0791c78b1dbf140cacf70ec96ae091790c3950baabfa0526d7a9cd32313565. 
Sep 9 04:59:55.875351 containerd[1534]: time="2025-09-09T04:59:55.875258249Z" level=info msg="StartContainer for \"8b0791c78b1dbf140cacf70ec96ae091790c3950baabfa0526d7a9cd32313565\" returns successfully" Sep 9 04:59:56.701923 kubelet[2665]: I0909 04:59:56.700732 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-79689dc87b-x5sj6" podStartSLOduration=2.37038363 podStartE2EDuration="4.700714065s" podCreationTimestamp="2025-09-09 04:59:52 +0000 UTC" firstStartedPulling="2025-09-09 04:59:53.457008163 +0000 UTC m=+33.037076296" lastFinishedPulling="2025-09-09 04:59:55.787338638 +0000 UTC m=+35.367406731" observedRunningTime="2025-09-09 04:59:56.698838692 +0000 UTC m=+36.278906825" watchObservedRunningTime="2025-09-09 04:59:56.700714065 +0000 UTC m=+36.280782198" Sep 9 05:00:00.518955 containerd[1534]: time="2025-09-09T05:00:00.518894887Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kgwfz,Uid:ff1d9b5f-1e61-411f-bd7e-fee7b5332631,Namespace:calico-system,Attempt:0,}" Sep 9 05:00:00.520198 containerd[1534]: time="2025-09-09T05:00:00.520138839Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f86fdbd-xnjnq,Uid:33b022e7-d521-48f6-96bd-a9569f5800c6,Namespace:calico-system,Attempt:0,}" Sep 9 05:00:00.523171 containerd[1534]: time="2025-09-09T05:00:00.522768186Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-6nkl6,Uid:4a205d0f-8b34-4ade-831f-621b0059d51f,Namespace:kube-system,Attempt:0,}" Sep 9 05:00:00.659966 systemd-networkd[1452]: cali01f548124fb: Link UP Sep 9 05:00:00.660190 systemd-networkd[1452]: cali01f548124fb: Gained carrier Sep 9 05:00:00.670846 containerd[1534]: 2025-09-09 05:00:00.578 [INFO][4219] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0 calico-kube-controllers-5f86fdbd- calico-system 
33b022e7-d521-48f6-96bd-a9569f5800c6 824 0 2025-09-09 04:59:40 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5f86fdbd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-5f86fdbd-xnjnq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali01f548124fb [] [] }} ContainerID="65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" Namespace="calico-system" Pod="calico-kube-controllers-5f86fdbd-xnjnq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-" Sep 9 05:00:00.670846 containerd[1534]: 2025-09-09 05:00:00.578 [INFO][4219] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" Namespace="calico-system" Pod="calico-kube-controllers-5f86fdbd-xnjnq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0" Sep 9 05:00:00.670846 containerd[1534]: 2025-09-09 05:00:00.610 [INFO][4259] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" HandleID="k8s-pod-network.65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" Workload="localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0" Sep 9 05:00:00.671022 containerd[1534]: 2025-09-09 05:00:00.610 [INFO][4259] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" HandleID="k8s-pod-network.65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" Workload="localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000323480), Attrs:map[string]string{"namespace":"calico-system", 
"node":"localhost", "pod":"calico-kube-controllers-5f86fdbd-xnjnq", "timestamp":"2025-09-09 05:00:00.61056562 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:00:00.671022 containerd[1534]: 2025-09-09 05:00:00.610 [INFO][4259] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:00:00.671022 containerd[1534]: 2025-09-09 05:00:00.610 [INFO][4259] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:00:00.671022 containerd[1534]: 2025-09-09 05:00:00.610 [INFO][4259] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:00:00.671022 containerd[1534]: 2025-09-09 05:00:00.620 [INFO][4259] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" host="localhost" Sep 9 05:00:00.671022 containerd[1534]: 2025-09-09 05:00:00.628 [INFO][4259] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:00:00.671022 containerd[1534]: 2025-09-09 05:00:00.636 [INFO][4259] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:00:00.671022 containerd[1534]: 2025-09-09 05:00:00.638 [INFO][4259] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:00.671022 containerd[1534]: 2025-09-09 05:00:00.640 [INFO][4259] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:00.671022 containerd[1534]: 2025-09-09 05:00:00.640 [INFO][4259] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" host="localhost" Sep 9 05:00:00.671231 containerd[1534]: 2025-09-09 05:00:00.642 
[INFO][4259] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed Sep 9 05:00:00.671231 containerd[1534]: 2025-09-09 05:00:00.646 [INFO][4259] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" host="localhost" Sep 9 05:00:00.671231 containerd[1534]: 2025-09-09 05:00:00.651 [INFO][4259] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" host="localhost" Sep 9 05:00:00.671231 containerd[1534]: 2025-09-09 05:00:00.651 [INFO][4259] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" host="localhost" Sep 9 05:00:00.671231 containerd[1534]: 2025-09-09 05:00:00.651 [INFO][4259] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:00:00.671231 containerd[1534]: 2025-09-09 05:00:00.651 [INFO][4259] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" HandleID="k8s-pod-network.65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" Workload="localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0" Sep 9 05:00:00.671366 containerd[1534]: 2025-09-09 05:00:00.654 [INFO][4219] cni-plugin/k8s.go 418: Populated endpoint ContainerID="65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" Namespace="calico-system" Pod="calico-kube-controllers-5f86fdbd-xnjnq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0", GenerateName:"calico-kube-controllers-5f86fdbd-", Namespace:"calico-system", SelfLink:"", UID:"33b022e7-d521-48f6-96bd-a9569f5800c6", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5f86fdbd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-5f86fdbd-xnjnq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali01f548124fb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:00.671420 containerd[1534]: 2025-09-09 05:00:00.654 [INFO][4219] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" Namespace="calico-system" Pod="calico-kube-controllers-5f86fdbd-xnjnq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0" Sep 9 05:00:00.671420 containerd[1534]: 2025-09-09 05:00:00.655 [INFO][4219] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali01f548124fb ContainerID="65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" Namespace="calico-system" Pod="calico-kube-controllers-5f86fdbd-xnjnq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0" Sep 9 05:00:00.671420 containerd[1534]: 2025-09-09 05:00:00.656 [INFO][4219] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" Namespace="calico-system" Pod="calico-kube-controllers-5f86fdbd-xnjnq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0" Sep 9 05:00:00.671501 containerd[1534]: 2025-09-09 05:00:00.657 [INFO][4219] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" Namespace="calico-system" Pod="calico-kube-controllers-5f86fdbd-xnjnq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0", GenerateName:"calico-kube-controllers-5f86fdbd-", Namespace:"calico-system", SelfLink:"", UID:"33b022e7-d521-48f6-96bd-a9569f5800c6", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5f86fdbd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed", Pod:"calico-kube-controllers-5f86fdbd-xnjnq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali01f548124fb", MAC:"12:39:bb:6c:02:73", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:00.671564 containerd[1534]: 2025-09-09 05:00:00.668 [INFO][4219] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" Namespace="calico-system" Pod="calico-kube-controllers-5f86fdbd-xnjnq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5f86fdbd--xnjnq-eth0" Sep 9 05:00:00.698728 containerd[1534]: time="2025-09-09T05:00:00.698681342Z" level=info msg="connecting to shim 
65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed" address="unix:///run/containerd/s/1cea43ee75f6d71cb559e2ae1a0aaf023e69dfd62e9d7b81bda0fced61d2c0e7" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:00:00.730711 systemd[1]: Started cri-containerd-65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed.scope - libcontainer container 65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed. Sep 9 05:00:00.745135 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:00:00.774308 systemd-networkd[1452]: calia00c311149a: Link UP Sep 9 05:00:00.776900 systemd-networkd[1452]: calia00c311149a: Gained carrier Sep 9 05:00:00.780043 containerd[1534]: time="2025-09-09T05:00:00.780002331Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f86fdbd-xnjnq,Uid:33b022e7-d521-48f6-96bd-a9569f5800c6,Namespace:calico-system,Attempt:0,} returns sandbox id \"65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed\"" Sep 9 05:00:00.783640 containerd[1534]: time="2025-09-09T05:00:00.783607502Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 05:00:00.798246 containerd[1534]: 2025-09-09 05:00:00.571 [INFO][4208] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--kgwfz-eth0 csi-node-driver- calico-system ff1d9b5f-1e61-411f-bd7e-fee7b5332631 672 0 2025-09-09 04:59:40 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-kgwfz eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calia00c311149a [] [] }} 
ContainerID="cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" Namespace="calico-system" Pod="csi-node-driver-kgwfz" WorkloadEndpoint="localhost-k8s-csi--node--driver--kgwfz-" Sep 9 05:00:00.798246 containerd[1534]: 2025-09-09 05:00:00.571 [INFO][4208] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" Namespace="calico-system" Pod="csi-node-driver-kgwfz" WorkloadEndpoint="localhost-k8s-csi--node--driver--kgwfz-eth0" Sep 9 05:00:00.798246 containerd[1534]: 2025-09-09 05:00:00.612 [INFO][4251] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" HandleID="k8s-pod-network.cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" Workload="localhost-k8s-csi--node--driver--kgwfz-eth0" Sep 9 05:00:00.798450 containerd[1534]: 2025-09-09 05:00:00.612 [INFO][4251] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" HandleID="k8s-pod-network.cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" Workload="localhost-k8s-csi--node--driver--kgwfz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400042c0a0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-kgwfz", "timestamp":"2025-09-09 05:00:00.612769636 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:00:00.798450 containerd[1534]: 2025-09-09 05:00:00.612 [INFO][4251] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:00:00.798450 containerd[1534]: 2025-09-09 05:00:00.651 [INFO][4251] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:00:00.798450 containerd[1534]: 2025-09-09 05:00:00.651 [INFO][4251] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:00:00.798450 containerd[1534]: 2025-09-09 05:00:00.723 [INFO][4251] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" host="localhost" Sep 9 05:00:00.798450 containerd[1534]: 2025-09-09 05:00:00.730 [INFO][4251] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:00:00.798450 containerd[1534]: 2025-09-09 05:00:00.739 [INFO][4251] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:00:00.798450 containerd[1534]: 2025-09-09 05:00:00.741 [INFO][4251] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:00.798450 containerd[1534]: 2025-09-09 05:00:00.747 [INFO][4251] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:00.798450 containerd[1534]: 2025-09-09 05:00:00.747 [INFO][4251] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" host="localhost" Sep 9 05:00:00.798717 containerd[1534]: 2025-09-09 05:00:00.749 [INFO][4251] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d Sep 9 05:00:00.798717 containerd[1534]: 2025-09-09 05:00:00.754 [INFO][4251] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" host="localhost" Sep 9 05:00:00.798717 containerd[1534]: 2025-09-09 05:00:00.761 [INFO][4251] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" host="localhost" Sep 9 05:00:00.798717 containerd[1534]: 2025-09-09 05:00:00.762 [INFO][4251] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" host="localhost" Sep 9 05:00:00.798717 containerd[1534]: 2025-09-09 05:00:00.762 [INFO][4251] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:00:00.798717 containerd[1534]: 2025-09-09 05:00:00.763 [INFO][4251] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" HandleID="k8s-pod-network.cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" Workload="localhost-k8s-csi--node--driver--kgwfz-eth0" Sep 9 05:00:00.798840 containerd[1534]: 2025-09-09 05:00:00.768 [INFO][4208] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" Namespace="calico-system" Pod="csi-node-driver-kgwfz" WorkloadEndpoint="localhost-k8s-csi--node--driver--kgwfz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--kgwfz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ff1d9b5f-1e61-411f-bd7e-fee7b5332631", ResourceVersion:"672", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-kgwfz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia00c311149a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:00.798894 containerd[1534]: 2025-09-09 05:00:00.768 [INFO][4208] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" Namespace="calico-system" Pod="csi-node-driver-kgwfz" WorkloadEndpoint="localhost-k8s-csi--node--driver--kgwfz-eth0" Sep 9 05:00:00.798894 containerd[1534]: 2025-09-09 05:00:00.768 [INFO][4208] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia00c311149a ContainerID="cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" Namespace="calico-system" Pod="csi-node-driver-kgwfz" WorkloadEndpoint="localhost-k8s-csi--node--driver--kgwfz-eth0" Sep 9 05:00:00.798894 containerd[1534]: 2025-09-09 05:00:00.777 [INFO][4208] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" Namespace="calico-system" Pod="csi-node-driver-kgwfz" WorkloadEndpoint="localhost-k8s-csi--node--driver--kgwfz-eth0" Sep 9 05:00:00.798955 containerd[1534]: 2025-09-09 05:00:00.778 [INFO][4208] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" 
Namespace="calico-system" Pod="csi-node-driver-kgwfz" WorkloadEndpoint="localhost-k8s-csi--node--driver--kgwfz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--kgwfz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ff1d9b5f-1e61-411f-bd7e-fee7b5332631", ResourceVersion:"672", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d", Pod:"csi-node-driver-kgwfz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia00c311149a", MAC:"0a:44:1d:a8:e3:98", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:00.799007 containerd[1534]: 2025-09-09 05:00:00.795 [INFO][4208] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" Namespace="calico-system" Pod="csi-node-driver-kgwfz" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--kgwfz-eth0" Sep 9 05:00:00.820071 containerd[1534]: time="2025-09-09T05:00:00.820025109Z" level=info msg="connecting to shim cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d" address="unix:///run/containerd/s/6414b1417dd3dea7899285d19b8b9333b1908dfc68029b37338043fd2896d3d6" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:00:00.847689 systemd[1]: Started cri-containerd-cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d.scope - libcontainer container cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d. Sep 9 05:00:00.861389 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:00:00.863722 systemd-networkd[1452]: califc3165ae71e: Link UP Sep 9 05:00:00.864111 systemd-networkd[1452]: califc3165ae71e: Gained carrier Sep 9 05:00:00.879375 containerd[1534]: 2025-09-09 05:00:00.587 [INFO][4222] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0 coredns-7c65d6cfc9- kube-system 4a205d0f-8b34-4ade-831f-621b0059d51f 826 0 2025-09-09 04:59:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-6nkl6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califc3165ae71e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6nkl6" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--6nkl6-" Sep 9 05:00:00.879375 containerd[1534]: 2025-09-09 05:00:00.587 [INFO][4222] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6nkl6" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0" Sep 9 05:00:00.879375 containerd[1534]: 2025-09-09 05:00:00.616 [INFO][4265] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" HandleID="k8s-pod-network.ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" Workload="localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0" Sep 9 05:00:00.879608 containerd[1534]: 2025-09-09 05:00:00.616 [INFO][4265] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" HandleID="k8s-pod-network.ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" Workload="localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001a2440), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-6nkl6", "timestamp":"2025-09-09 05:00:00.616756857 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:00:00.879608 containerd[1534]: 2025-09-09 05:00:00.616 [INFO][4265] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:00:00.879608 containerd[1534]: 2025-09-09 05:00:00.762 [INFO][4265] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:00:00.879608 containerd[1534]: 2025-09-09 05:00:00.762 [INFO][4265] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:00:00.879608 containerd[1534]: 2025-09-09 05:00:00.823 [INFO][4265] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" host="localhost" Sep 9 05:00:00.879608 containerd[1534]: 2025-09-09 05:00:00.831 [INFO][4265] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:00:00.879608 containerd[1534]: 2025-09-09 05:00:00.841 [INFO][4265] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:00:00.879608 containerd[1534]: 2025-09-09 05:00:00.844 [INFO][4265] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:00.879608 containerd[1534]: 2025-09-09 05:00:00.846 [INFO][4265] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:00.879608 containerd[1534]: 2025-09-09 05:00:00.846 [INFO][4265] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" host="localhost" Sep 9 05:00:00.879854 containerd[1534]: 2025-09-09 05:00:00.848 [INFO][4265] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed Sep 9 05:00:00.879854 containerd[1534]: 2025-09-09 05:00:00.852 [INFO][4265] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" host="localhost" Sep 9 05:00:00.879854 containerd[1534]: 2025-09-09 05:00:00.858 [INFO][4265] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" host="localhost" Sep 9 05:00:00.879854 containerd[1534]: 2025-09-09 05:00:00.858 [INFO][4265] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" host="localhost" Sep 9 05:00:00.879854 containerd[1534]: 2025-09-09 05:00:00.858 [INFO][4265] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:00:00.879854 containerd[1534]: 2025-09-09 05:00:00.858 [INFO][4265] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" HandleID="k8s-pod-network.ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" Workload="localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0" Sep 9 05:00:00.879990 containerd[1534]: 2025-09-09 05:00:00.861 [INFO][4222] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6nkl6" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"4a205d0f-8b34-4ade-831f-621b0059d51f", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-6nkl6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califc3165ae71e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:00.880060 containerd[1534]: 2025-09-09 05:00:00.861 [INFO][4222] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6nkl6" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0" Sep 9 05:00:00.880060 containerd[1534]: 2025-09-09 05:00:00.862 [INFO][4222] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califc3165ae71e ContainerID="ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6nkl6" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0" Sep 9 05:00:00.880060 containerd[1534]: 2025-09-09 05:00:00.864 [INFO][4222] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6nkl6" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0" Sep 9 05:00:00.880127 containerd[1534]: 2025-09-09 05:00:00.865 [INFO][4222] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6nkl6" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"4a205d0f-8b34-4ade-831f-621b0059d51f", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed", Pod:"coredns-7c65d6cfc9-6nkl6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califc3165ae71e", MAC:"42:06:39:59:31:6f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:00.880127 containerd[1534]: 2025-09-09 05:00:00.877 [INFO][4222] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6nkl6" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--6nkl6-eth0" Sep 9 05:00:00.893203 containerd[1534]: time="2025-09-09T05:00:00.893082368Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kgwfz,Uid:ff1d9b5f-1e61-411f-bd7e-fee7b5332631,Namespace:calico-system,Attempt:0,} returns sandbox id \"cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d\"" Sep 9 05:00:00.900459 containerd[1534]: time="2025-09-09T05:00:00.900414195Z" level=info msg="connecting to shim ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed" address="unix:///run/containerd/s/ffe2fda47149b60d3b43963cbea1afa1f3e68e293294cf6333a0dc7eee7b2d78" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:00:00.926679 systemd[1]: Started cri-containerd-ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed.scope - libcontainer container ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed. 
Sep 9 05:00:00.938374 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:00:00.982955 containerd[1534]: time="2025-09-09T05:00:00.982914454Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-6nkl6,Uid:4a205d0f-8b34-4ade-831f-621b0059d51f,Namespace:kube-system,Attempt:0,} returns sandbox id \"ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed\"" Sep 9 05:00:00.986566 containerd[1534]: time="2025-09-09T05:00:00.986536946Z" level=info msg="CreateContainer within sandbox \"ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:00:00.996926 containerd[1534]: time="2025-09-09T05:00:00.996895369Z" level=info msg="Container c4bf086ceb8b136c04e507a4981a7e920043e623cada43084ce507470aa2ad4d: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:00:01.001871 containerd[1534]: time="2025-09-09T05:00:01.001834735Z" level=info msg="CreateContainer within sandbox \"ee006583d4bb7bcd221a43b50d420e01feec0728db51909bb014dfdaa82318ed\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c4bf086ceb8b136c04e507a4981a7e920043e623cada43084ce507470aa2ad4d\"" Sep 9 05:00:01.003511 containerd[1534]: time="2025-09-09T05:00:01.003467575Z" level=info msg="StartContainer for \"c4bf086ceb8b136c04e507a4981a7e920043e623cada43084ce507470aa2ad4d\"" Sep 9 05:00:01.004247 containerd[1534]: time="2025-09-09T05:00:01.004224034Z" level=info msg="connecting to shim c4bf086ceb8b136c04e507a4981a7e920043e623cada43084ce507470aa2ad4d" address="unix:///run/containerd/s/ffe2fda47149b60d3b43963cbea1afa1f3e68e293294cf6333a0dc7eee7b2d78" protocol=ttrpc version=3 Sep 9 05:00:01.024772 systemd[1]: Started cri-containerd-c4bf086ceb8b136c04e507a4981a7e920043e623cada43084ce507470aa2ad4d.scope - libcontainer container c4bf086ceb8b136c04e507a4981a7e920043e623cada43084ce507470aa2ad4d. 
Sep 9 05:00:01.055222 containerd[1534]: time="2025-09-09T05:00:01.055106453Z" level=info msg="StartContainer for \"c4bf086ceb8b136c04e507a4981a7e920043e623cada43084ce507470aa2ad4d\" returns successfully" Sep 9 05:00:01.517993 containerd[1534]: time="2025-09-09T05:00:01.517866788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-m56jh,Uid:a78e19f7-e135-4e75-9a5d-fc71b1732548,Namespace:kube-system,Attempt:0,}" Sep 9 05:00:01.636251 systemd-networkd[1452]: cali93da0bff2d8: Link UP Sep 9 05:00:01.638009 systemd-networkd[1452]: cali93da0bff2d8: Gained carrier Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.554 [INFO][4484] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0 coredns-7c65d6cfc9- kube-system a78e19f7-e135-4e75-9a5d-fc71b1732548 820 0 2025-09-09 04:59:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-m56jh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali93da0bff2d8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" Namespace="kube-system" Pod="coredns-7c65d6cfc9-m56jh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--m56jh-" Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.554 [INFO][4484] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" Namespace="kube-system" Pod="coredns-7c65d6cfc9-m56jh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0" Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.576 [INFO][4497] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" HandleID="k8s-pod-network.bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" Workload="localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0" Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.576 [INFO][4497] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" HandleID="k8s-pod-network.bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" Workload="localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dd630), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-m56jh", "timestamp":"2025-09-09 05:00:01.576416398 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.576 [INFO][4497] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.576 [INFO][4497] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.576 [INFO][4497] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.587 [INFO][4497] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" host="localhost" Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.595 [INFO][4497] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.601 [INFO][4497] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.605 [INFO][4497] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.609 [INFO][4497] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.609 [INFO][4497] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" host="localhost" Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.612 [INFO][4497] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.618 [INFO][4497] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" host="localhost" Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.628 [INFO][4497] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" host="localhost" Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.628 [INFO][4497] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" host="localhost" Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.628 [INFO][4497] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:00:01.688389 containerd[1534]: 2025-09-09 05:00:01.628 [INFO][4497] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" HandleID="k8s-pod-network.bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" Workload="localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0" Sep 9 05:00:01.689145 containerd[1534]: 2025-09-09 05:00:01.633 [INFO][4484] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" Namespace="kube-system" Pod="coredns-7c65d6cfc9-m56jh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"a78e19f7-e135-4e75-9a5d-fc71b1732548", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-m56jh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali93da0bff2d8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:01.689145 containerd[1534]: 2025-09-09 05:00:01.633 [INFO][4484] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" Namespace="kube-system" Pod="coredns-7c65d6cfc9-m56jh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0" Sep 9 05:00:01.689145 containerd[1534]: 2025-09-09 05:00:01.633 [INFO][4484] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali93da0bff2d8 ContainerID="bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" Namespace="kube-system" Pod="coredns-7c65d6cfc9-m56jh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0" Sep 9 05:00:01.689145 containerd[1534]: 2025-09-09 05:00:01.637 [INFO][4484] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" Namespace="kube-system" Pod="coredns-7c65d6cfc9-m56jh" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0" Sep 9 05:00:01.689145 containerd[1534]: 2025-09-09 05:00:01.640 [INFO][4484] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" Namespace="kube-system" Pod="coredns-7c65d6cfc9-m56jh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"a78e19f7-e135-4e75-9a5d-fc71b1732548", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb", Pod:"coredns-7c65d6cfc9-m56jh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali93da0bff2d8", MAC:"9e:51:bd:d3:bf:71", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:01.689145 containerd[1534]: 2025-09-09 05:00:01.685 [INFO][4484] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" Namespace="kube-system" Pod="coredns-7c65d6cfc9-m56jh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--m56jh-eth0" Sep 9 05:00:01.734125 kubelet[2665]: I0909 05:00:01.733893 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-6nkl6" podStartSLOduration=34.733874295 podStartE2EDuration="34.733874295s" podCreationTimestamp="2025-09-09 04:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:00:01.717281684 +0000 UTC m=+41.297349817" watchObservedRunningTime="2025-09-09 05:00:01.733874295 +0000 UTC m=+41.313942428" Sep 9 05:00:01.741279 containerd[1534]: time="2025-09-09T05:00:01.741228877Z" level=info msg="connecting to shim bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb" address="unix:///run/containerd/s/a6b4a2c3cc03c9f60addd464ffe935800bead4f42cf4f80e8fa5ca4b9db5379f" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:00:01.774646 systemd[1]: Started cri-containerd-bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb.scope - libcontainer container bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb. 
Sep 9 05:00:01.791295 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:00:01.812195 containerd[1534]: time="2025-09-09T05:00:01.812160433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-m56jh,Uid:a78e19f7-e135-4e75-9a5d-fc71b1732548,Namespace:kube-system,Attempt:0,} returns sandbox id \"bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb\"" Sep 9 05:00:01.816511 containerd[1534]: time="2025-09-09T05:00:01.815890885Z" level=info msg="CreateContainer within sandbox \"bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:00:01.824280 containerd[1534]: time="2025-09-09T05:00:01.824249252Z" level=info msg="Container 256673a6d943001c09391404101ffedc6b328380295d830a9d99ff1c91816a1f: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:00:01.828423 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1671564578.mount: Deactivated successfully. 
Sep 9 05:00:01.832002 containerd[1534]: time="2025-09-09T05:00:01.831966403Z" level=info msg="CreateContainer within sandbox \"bfbc1a81a298ad0163fd8cc6b226e3ec4db561d69bbc26c69a6a3fc15ad2affb\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"256673a6d943001c09391404101ffedc6b328380295d830a9d99ff1c91816a1f\"" Sep 9 05:00:01.832586 containerd[1534]: time="2025-09-09T05:00:01.832564818Z" level=info msg="StartContainer for \"256673a6d943001c09391404101ffedc6b328380295d830a9d99ff1c91816a1f\"" Sep 9 05:00:01.834074 containerd[1534]: time="2025-09-09T05:00:01.834047295Z" level=info msg="connecting to shim 256673a6d943001c09391404101ffedc6b328380295d830a9d99ff1c91816a1f" address="unix:///run/containerd/s/a6b4a2c3cc03c9f60addd464ffe935800bead4f42cf4f80e8fa5ca4b9db5379f" protocol=ttrpc version=3 Sep 9 05:00:01.859687 systemd[1]: Started cri-containerd-256673a6d943001c09391404101ffedc6b328380295d830a9d99ff1c91816a1f.scope - libcontainer container 256673a6d943001c09391404101ffedc6b328380295d830a9d99ff1c91816a1f. Sep 9 05:00:01.925514 containerd[1534]: time="2025-09-09T05:00:01.925469238Z" level=info msg="StartContainer for \"256673a6d943001c09391404101ffedc6b328380295d830a9d99ff1c91816a1f\" returns successfully" Sep 9 05:00:01.998619 systemd-networkd[1452]: cali01f548124fb: Gained IPv6LL Sep 9 05:00:02.113866 systemd[1]: Started sshd@7-10.0.0.72:22-10.0.0.1:47248.service - OpenSSH per-connection server daemon (10.0.0.1:47248). Sep 9 05:00:02.175670 sshd[4606]: Accepted publickey for core from 10.0.0.1 port 47248 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 05:00:02.176897 sshd-session[4606]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:00:02.181025 systemd-logind[1504]: New session 8 of user core. Sep 9 05:00:02.185630 systemd[1]: Started session-8.scope - Session 8 of User core. 
Sep 9 05:00:02.393656 sshd[4609]: Connection closed by 10.0.0.1 port 47248 Sep 9 05:00:02.393737 sshd-session[4606]: pam_unix(sshd:session): session closed for user core Sep 9 05:00:02.397475 systemd-logind[1504]: Session 8 logged out. Waiting for processes to exit. Sep 9 05:00:02.397719 systemd[1]: sshd@7-10.0.0.72:22-10.0.0.1:47248.service: Deactivated successfully. Sep 9 05:00:02.400041 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 05:00:02.401583 systemd-logind[1504]: Removed session 8. Sep 9 05:00:02.510672 systemd-networkd[1452]: califc3165ae71e: Gained IPv6LL Sep 9 05:00:02.518217 containerd[1534]: time="2025-09-09T05:00:02.518177044Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d964bcddb-t475x,Uid:4c0b831d-41fe-4470-964b-e4d28117b02b,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:00:02.574624 systemd-networkd[1452]: calia00c311149a: Gained IPv6LL Sep 9 05:00:02.614273 systemd-networkd[1452]: cali885022822ea: Link UP Sep 9 05:00:02.614438 systemd-networkd[1452]: cali885022822ea: Gained carrier Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.552 [INFO][4624] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0 calico-apiserver-6d964bcddb- calico-apiserver 4c0b831d-41fe-4470-964b-e4d28117b02b 830 0 2025-09-09 04:59:35 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6d964bcddb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6d964bcddb-t475x eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali885022822ea [] [] }} ContainerID="6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-t475x" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--t475x-" Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.552 [INFO][4624] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-t475x" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0" Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.575 [INFO][4638] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" HandleID="k8s-pod-network.6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" Workload="localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0" Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.575 [INFO][4638] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" HandleID="k8s-pod-network.6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" Workload="localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c790), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6d964bcddb-t475x", "timestamp":"2025-09-09 05:00:02.575028053 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.575 [INFO][4638] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.575 [INFO][4638] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.575 [INFO][4638] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.587 [INFO][4638] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" host="localhost" Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.591 [INFO][4638] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.595 [INFO][4638] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.597 [INFO][4638] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.599 [INFO][4638] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.599 [INFO][4638] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" host="localhost" Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.601 [INFO][4638] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0 Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.604 [INFO][4638] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" host="localhost" Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.610 [INFO][4638] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" host="localhost" Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.610 [INFO][4638] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" host="localhost" Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.610 [INFO][4638] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:00:02.627944 containerd[1534]: 2025-09-09 05:00:02.610 [INFO][4638] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" HandleID="k8s-pod-network.6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" Workload="localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0" Sep 9 05:00:02.628684 containerd[1534]: 2025-09-09 05:00:02.612 [INFO][4624] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-t475x" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0", GenerateName:"calico-apiserver-6d964bcddb-", Namespace:"calico-apiserver", SelfLink:"", UID:"4c0b831d-41fe-4470-964b-e4d28117b02b", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6d964bcddb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6d964bcddb-t475x", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali885022822ea", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:02.628684 containerd[1534]: 2025-09-09 05:00:02.612 [INFO][4624] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-t475x" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0" Sep 9 05:00:02.628684 containerd[1534]: 2025-09-09 05:00:02.612 [INFO][4624] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali885022822ea ContainerID="6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-t475x" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0" Sep 9 05:00:02.628684 containerd[1534]: 2025-09-09 05:00:02.615 [INFO][4624] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-t475x" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0" Sep 9 05:00:02.628684 containerd[1534]: 2025-09-09 05:00:02.615 [INFO][4624] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-t475x" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0", GenerateName:"calico-apiserver-6d964bcddb-", Namespace:"calico-apiserver", SelfLink:"", UID:"4c0b831d-41fe-4470-964b-e4d28117b02b", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6d964bcddb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0", Pod:"calico-apiserver-6d964bcddb-t475x", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali885022822ea", MAC:"aa:a5:ec:db:17:6a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:02.628684 containerd[1534]: 2025-09-09 05:00:02.624 [INFO][4624] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-t475x" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--t475x-eth0" Sep 9 05:00:02.665476 containerd[1534]: time="2025-09-09T05:00:02.665375589Z" level=info msg="connecting to shim 6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0" address="unix:///run/containerd/s/09c3a8b19afa4d7b2d10623edd38a8cb23456be9e83c5053f28d97e960fc992c" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:00:02.693751 systemd[1]: Started cri-containerd-6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0.scope - libcontainer container 6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0. Sep 9 05:00:02.707888 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:00:02.730117 kubelet[2665]: I0909 05:00:02.730058 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-m56jh" podStartSLOduration=35.730038226 podStartE2EDuration="35.730038226s" podCreationTimestamp="2025-09-09 04:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:00:02.718271783 +0000 UTC m=+42.298339916" watchObservedRunningTime="2025-09-09 05:00:02.730038226 +0000 UTC m=+42.310106359" Sep 9 05:00:02.745799 containerd[1534]: time="2025-09-09T05:00:02.745705003Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d964bcddb-t475x,Uid:4c0b831d-41fe-4470-964b-e4d28117b02b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0\"" Sep 9 05:00:02.894640 systemd-networkd[1452]: cali93da0bff2d8: Gained IPv6LL Sep 9 05:00:03.520021 containerd[1534]: time="2025-09-09T05:00:03.519982595Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-7988f88666-mrj5t,Uid:ebb4a83d-5f9f-41ce-8042-1eac984fa7d5,Namespace:calico-system,Attempt:0,}" Sep 9 05:00:03.533562 containerd[1534]: time="2025-09-09T05:00:03.533282467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d964bcddb-nf8n5,Uid:0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:00:03.633560 systemd-networkd[1452]: cali0c258f3d1c2: Link UP Sep 9 05:00:03.637194 systemd-networkd[1452]: cali0c258f3d1c2: Gained carrier Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.560 [INFO][4705] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--7988f88666--mrj5t-eth0 goldmane-7988f88666- calico-system ebb4a83d-5f9f-41ce-8042-1eac984fa7d5 827 0 2025-09-09 04:59:40 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-7988f88666-mrj5t eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali0c258f3d1c2 [] [] }} ContainerID="5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" Namespace="calico-system" Pod="goldmane-7988f88666-mrj5t" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mrj5t-" Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.560 [INFO][4705] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" Namespace="calico-system" Pod="goldmane-7988f88666-mrj5t" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mrj5t-eth0" Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.589 [INFO][4732] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" 
HandleID="k8s-pod-network.5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" Workload="localhost-k8s-goldmane--7988f88666--mrj5t-eth0" Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.589 [INFO][4732] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" HandleID="k8s-pod-network.5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" Workload="localhost-k8s-goldmane--7988f88666--mrj5t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000323490), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-7988f88666-mrj5t", "timestamp":"2025-09-09 05:00:03.589586107 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.589 [INFO][4732] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.589 [INFO][4732] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.589 [INFO][4732] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.599 [INFO][4732] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" host="localhost" Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.606 [INFO][4732] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.610 [INFO][4732] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.611 [INFO][4732] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.614 [INFO][4732] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.614 [INFO][4732] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" host="localhost" Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.616 [INFO][4732] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.620 [INFO][4732] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" host="localhost" Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.626 [INFO][4732] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" host="localhost" Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.626 [INFO][4732] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" host="localhost" Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.626 [INFO][4732] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:00:03.653471 containerd[1534]: 2025-09-09 05:00:03.626 [INFO][4732] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" HandleID="k8s-pod-network.5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" Workload="localhost-k8s-goldmane--7988f88666--mrj5t-eth0" Sep 9 05:00:03.654007 containerd[1534]: 2025-09-09 05:00:03.629 [INFO][4705] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" Namespace="calico-system" Pod="goldmane-7988f88666-mrj5t" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mrj5t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--mrj5t-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"ebb4a83d-5f9f-41ce-8042-1eac984fa7d5", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-7988f88666-mrj5t", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0c258f3d1c2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:03.654007 containerd[1534]: 2025-09-09 05:00:03.630 [INFO][4705] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" Namespace="calico-system" Pod="goldmane-7988f88666-mrj5t" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mrj5t-eth0" Sep 9 05:00:03.654007 containerd[1534]: 2025-09-09 05:00:03.630 [INFO][4705] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0c258f3d1c2 ContainerID="5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" Namespace="calico-system" Pod="goldmane-7988f88666-mrj5t" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mrj5t-eth0" Sep 9 05:00:03.654007 containerd[1534]: 2025-09-09 05:00:03.636 [INFO][4705] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" Namespace="calico-system" Pod="goldmane-7988f88666-mrj5t" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mrj5t-eth0" Sep 9 05:00:03.654007 containerd[1534]: 2025-09-09 05:00:03.637 [INFO][4705] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" Namespace="calico-system" Pod="goldmane-7988f88666-mrj5t" 
WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mrj5t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--mrj5t-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"ebb4a83d-5f9f-41ce-8042-1eac984fa7d5", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c", Pod:"goldmane-7988f88666-mrj5t", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0c258f3d1c2", MAC:"5e:96:37:85:a0:e9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:03.654007 containerd[1534]: 2025-09-09 05:00:03.650 [INFO][4705] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" Namespace="calico-system" Pod="goldmane-7988f88666-mrj5t" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mrj5t-eth0" Sep 9 05:00:03.687059 containerd[1534]: time="2025-09-09T05:00:03.687016310Z" level=info msg="connecting to shim 
5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c" address="unix:///run/containerd/s/d098e09d0d4717e38b9d77ea55642a63ca27858fb483f4751467beeb0a466e0e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:00:03.714628 systemd[1]: Started cri-containerd-5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c.scope - libcontainer container 5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c. Sep 9 05:00:03.727388 systemd-networkd[1452]: cali885022822ea: Gained IPv6LL Sep 9 05:00:03.732583 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:00:03.736798 systemd-networkd[1452]: cali7ae66af35ef: Link UP Sep 9 05:00:03.737542 systemd-networkd[1452]: cali7ae66af35ef: Gained carrier Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.589 [INFO][4719] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0 calico-apiserver-6d964bcddb- calico-apiserver 0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1 828 0 2025-09-09 04:59:35 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6d964bcddb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6d964bcddb-nf8n5 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7ae66af35ef [] [] }} ContainerID="5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-nf8n5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-" Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.590 [INFO][4719] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-nf8n5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0" Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.624 [INFO][4742] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" HandleID="k8s-pod-network.5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" Workload="localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0" Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.624 [INFO][4742] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" HandleID="k8s-pod-network.5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" Workload="localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001a1530), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6d964bcddb-nf8n5", "timestamp":"2025-09-09 05:00:03.624690609 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.624 [INFO][4742] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.626 [INFO][4742] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.626 [INFO][4742] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.701 [INFO][4742] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" host="localhost" Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.707 [INFO][4742] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.713 [INFO][4742] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.715 [INFO][4742] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.718 [INFO][4742] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.718 [INFO][4742] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" host="localhost" Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.719 [INFO][4742] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264 Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.723 [INFO][4742] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" host="localhost" Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.729 [INFO][4742] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" host="localhost" Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.729 [INFO][4742] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" host="localhost" Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.729 [INFO][4742] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:00:03.753045 containerd[1534]: 2025-09-09 05:00:03.729 [INFO][4742] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" HandleID="k8s-pod-network.5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" Workload="localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0" Sep 9 05:00:03.754469 containerd[1534]: 2025-09-09 05:00:03.733 [INFO][4719] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-nf8n5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0", GenerateName:"calico-apiserver-6d964bcddb-", Namespace:"calico-apiserver", SelfLink:"", UID:"0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6d964bcddb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6d964bcddb-nf8n5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7ae66af35ef", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:03.754469 containerd[1534]: 2025-09-09 05:00:03.733 [INFO][4719] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-nf8n5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0" Sep 9 05:00:03.754469 containerd[1534]: 2025-09-09 05:00:03.733 [INFO][4719] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7ae66af35ef ContainerID="5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-nf8n5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0" Sep 9 05:00:03.754469 containerd[1534]: 2025-09-09 05:00:03.737 [INFO][4719] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-nf8n5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0" Sep 9 05:00:03.754469 containerd[1534]: 2025-09-09 05:00:03.738 [INFO][4719] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-nf8n5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0", GenerateName:"calico-apiserver-6d964bcddb-", Namespace:"calico-apiserver", SelfLink:"", UID:"0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 59, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6d964bcddb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264", Pod:"calico-apiserver-6d964bcddb-nf8n5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7ae66af35ef", MAC:"8a:b9:24:1f:ad:39", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:00:03.754469 containerd[1534]: 2025-09-09 05:00:03.748 [INFO][4719] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" Namespace="calico-apiserver" Pod="calico-apiserver-6d964bcddb-nf8n5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6d964bcddb--nf8n5-eth0" Sep 9 05:00:03.777188 containerd[1534]: time="2025-09-09T05:00:03.777143382Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-mrj5t,Uid:ebb4a83d-5f9f-41ce-8042-1eac984fa7d5,Namespace:calico-system,Attempt:0,} returns sandbox id \"5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c\"" Sep 9 05:00:03.788514 containerd[1534]: time="2025-09-09T05:00:03.787904114Z" level=info msg="connecting to shim 5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264" address="unix:///run/containerd/s/241a2088594c75f33266251a8d7d8f51638e2bafc7a4f6829b8f7eb90546be9b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:00:03.818655 systemd[1]: Started cri-containerd-5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264.scope - libcontainer container 5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264. Sep 9 05:00:03.829439 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:00:03.856293 containerd[1534]: time="2025-09-09T05:00:03.856259356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6d964bcddb-nf8n5,Uid:0aaa75f8-a1cd-46bc-9799-0bb8fa9008b1,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264\"" Sep 9 05:00:05.071045 systemd-networkd[1452]: cali7ae66af35ef: Gained IPv6LL Sep 9 05:00:05.646688 systemd-networkd[1452]: cali0c258f3d1c2: Gained IPv6LL Sep 9 05:00:07.413913 systemd[1]: Started sshd@8-10.0.0.72:22-10.0.0.1:47250.service - OpenSSH per-connection server daemon (10.0.0.1:47250). 
Sep 9 05:00:07.475086 sshd[4868]: Accepted publickey for core from 10.0.0.1 port 47250 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 05:00:07.476541 sshd-session[4868]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:00:07.481036 systemd-logind[1504]: New session 9 of user core. Sep 9 05:00:07.488698 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 9 05:00:07.638537 sshd[4871]: Connection closed by 10.0.0.1 port 47250 Sep 9 05:00:07.639053 sshd-session[4868]: pam_unix(sshd:session): session closed for user core Sep 9 05:00:07.643590 systemd[1]: sshd@8-10.0.0.72:22-10.0.0.1:47250.service: Deactivated successfully. Sep 9 05:00:07.645456 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 05:00:07.650283 systemd-logind[1504]: Session 9 logged out. Waiting for processes to exit. Sep 9 05:00:07.651507 systemd-logind[1504]: Removed session 9. Sep 9 05:00:08.923975 containerd[1534]: time="2025-09-09T05:00:08.923922518Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:00:08.925063 containerd[1534]: time="2025-09-09T05:00:08.924805216Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 9 05:00:08.925992 containerd[1534]: time="2025-09-09T05:00:08.925953720Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:00:08.928331 containerd[1534]: time="2025-09-09T05:00:08.928299048Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:00:08.928946 containerd[1534]: time="2025-09-09T05:00:08.928910300Z" level=info 
msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 8.145262077s" Sep 9 05:00:08.929037 containerd[1534]: time="2025-09-09T05:00:08.929021983Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 9 05:00:08.930225 containerd[1534]: time="2025-09-09T05:00:08.930195527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 05:00:08.942639 containerd[1534]: time="2025-09-09T05:00:08.942597181Z" level=info msg="CreateContainer within sandbox \"65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 05:00:09.026946 containerd[1534]: time="2025-09-09T05:00:09.026679889Z" level=info msg="Container 2a98426f0b803ac849954f1098257f55b5d40f72d8c6b8b2a613d478620c8108: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:00:09.042540 containerd[1534]: time="2025-09-09T05:00:09.042462124Z" level=info msg="CreateContainer within sandbox \"65ee1e7f5b051538dd385cfc46930bd55a0ad386e5c7f5d27913e24493e6a2ed\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2a98426f0b803ac849954f1098257f55b5d40f72d8c6b8b2a613d478620c8108\"" Sep 9 05:00:09.043250 containerd[1534]: time="2025-09-09T05:00:09.043009815Z" level=info msg="StartContainer for \"2a98426f0b803ac849954f1098257f55b5d40f72d8c6b8b2a613d478620c8108\"" Sep 9 05:00:09.044377 containerd[1534]: time="2025-09-09T05:00:09.044347962Z" level=info msg="connecting to shim 2a98426f0b803ac849954f1098257f55b5d40f72d8c6b8b2a613d478620c8108" 
address="unix:///run/containerd/s/1cea43ee75f6d71cb559e2ae1a0aaf023e69dfd62e9d7b81bda0fced61d2c0e7" protocol=ttrpc version=3 Sep 9 05:00:09.062657 systemd[1]: Started cri-containerd-2a98426f0b803ac849954f1098257f55b5d40f72d8c6b8b2a613d478620c8108.scope - libcontainer container 2a98426f0b803ac849954f1098257f55b5d40f72d8c6b8b2a613d478620c8108. Sep 9 05:00:09.098862 containerd[1534]: time="2025-09-09T05:00:09.098822009Z" level=info msg="StartContainer for \"2a98426f0b803ac849954f1098257f55b5d40f72d8c6b8b2a613d478620c8108\" returns successfully" Sep 9 05:00:09.748330 kubelet[2665]: I0909 05:00:09.748269 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5f86fdbd-xnjnq" podStartSLOduration=21.601739501 podStartE2EDuration="29.748255645s" podCreationTimestamp="2025-09-09 04:59:40 +0000 UTC" firstStartedPulling="2025-09-09 05:00:00.783362976 +0000 UTC m=+40.363431109" lastFinishedPulling="2025-09-09 05:00:08.92987912 +0000 UTC m=+48.509947253" observedRunningTime="2025-09-09 05:00:09.747829477 +0000 UTC m=+49.327897610" watchObservedRunningTime="2025-09-09 05:00:09.748255645 +0000 UTC m=+49.328323778" Sep 9 05:00:09.806983 containerd[1534]: time="2025-09-09T05:00:09.806908616Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a98426f0b803ac849954f1098257f55b5d40f72d8c6b8b2a613d478620c8108\" id:\"ad7673cc01f690962413ba70bec391ddd1028b0d2a3dc2e593183d602230d44f\" pid:4950 exited_at:{seconds:1757394009 nanos:806415006}" Sep 9 05:00:12.654922 systemd[1]: Started sshd@9-10.0.0.72:22-10.0.0.1:41298.service - OpenSSH per-connection server daemon (10.0.0.1:41298). 
Sep 9 05:00:12.709821 sshd[4961]: Accepted publickey for core from 10.0.0.1 port 41298 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 05:00:12.711321 sshd-session[4961]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:00:12.715540 systemd-logind[1504]: New session 10 of user core. Sep 9 05:00:12.721659 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 9 05:00:12.921338 sshd[4964]: Connection closed by 10.0.0.1 port 41298 Sep 9 05:00:12.922676 sshd-session[4961]: pam_unix(sshd:session): session closed for user core Sep 9 05:00:12.926611 systemd[1]: sshd@9-10.0.0.72:22-10.0.0.1:41298.service: Deactivated successfully. Sep 9 05:00:12.928285 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 05:00:12.929651 systemd-logind[1504]: Session 10 logged out. Waiting for processes to exit. Sep 9 05:00:12.930684 systemd-logind[1504]: Removed session 10. Sep 9 05:00:14.388102 containerd[1534]: time="2025-09-09T05:00:14.387509070Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:00:14.388102 containerd[1534]: time="2025-09-09T05:00:14.388048999Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 9 05:00:14.389464 containerd[1534]: time="2025-09-09T05:00:14.389420263Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:00:14.391440 containerd[1534]: time="2025-09-09T05:00:14.391391617Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:00:14.392086 containerd[1534]: time="2025-09-09T05:00:14.392057549Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 5.461829702s" Sep 9 05:00:14.392145 containerd[1534]: time="2025-09-09T05:00:14.392092230Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 9 05:00:14.395406 containerd[1534]: time="2025-09-09T05:00:14.394645034Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:00:14.396183 containerd[1534]: time="2025-09-09T05:00:14.396080540Z" level=info msg="CreateContainer within sandbox \"cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 05:00:14.413352 containerd[1534]: time="2025-09-09T05:00:14.412681790Z" level=info msg="Container 9ef313094585dee08826c174d0dbb531571c1954a6e34ffb9d638a4fce5ccabd: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:00:14.427131 containerd[1534]: time="2025-09-09T05:00:14.427078002Z" level=info msg="CreateContainer within sandbox \"cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"9ef313094585dee08826c174d0dbb531571c1954a6e34ffb9d638a4fce5ccabd\"" Sep 9 05:00:14.427888 containerd[1534]: time="2025-09-09T05:00:14.427822096Z" level=info msg="StartContainer for \"9ef313094585dee08826c174d0dbb531571c1954a6e34ffb9d638a4fce5ccabd\"" Sep 9 05:00:14.429639 containerd[1534]: time="2025-09-09T05:00:14.429604767Z" level=info msg="connecting to shim 9ef313094585dee08826c174d0dbb531571c1954a6e34ffb9d638a4fce5ccabd" address="unix:///run/containerd/s/6414b1417dd3dea7899285d19b8b9333b1908dfc68029b37338043fd2896d3d6" 
protocol=ttrpc version=3 Sep 9 05:00:14.454712 systemd[1]: Started cri-containerd-9ef313094585dee08826c174d0dbb531571c1954a6e34ffb9d638a4fce5ccabd.scope - libcontainer container 9ef313094585dee08826c174d0dbb531571c1954a6e34ffb9d638a4fce5ccabd. Sep 9 05:00:14.492795 containerd[1534]: time="2025-09-09T05:00:14.492757753Z" level=info msg="StartContainer for \"9ef313094585dee08826c174d0dbb531571c1954a6e34ffb9d638a4fce5ccabd\" returns successfully" Sep 9 05:00:17.313467 containerd[1534]: time="2025-09-09T05:00:17.313420396Z" level=info msg="TaskExit event in podsandbox handler container_id:\"49334378f04ee527e9489c73e02e299bd58bd59b1ce3e8d7dc79713673dc7f96\" id:\"a5c47e954e15c3cd6da3b26fc157889d99987d056e3acc0ec95475d1c953319c\" pid:5035 exited_at:{seconds:1757394017 nanos:311748089}" Sep 9 05:00:17.936962 systemd[1]: Started sshd@10-10.0.0.72:22-10.0.0.1:41300.service - OpenSSH per-connection server daemon (10.0.0.1:41300). Sep 9 05:00:18.012530 sshd[5048]: Accepted publickey for core from 10.0.0.1 port 41300 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 05:00:18.015327 sshd-session[5048]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:00:18.022809 systemd-logind[1504]: New session 11 of user core. Sep 9 05:00:18.029635 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 9 05:00:18.199195 sshd[5051]: Connection closed by 10.0.0.1 port 41300 Sep 9 05:00:18.199433 sshd-session[5048]: pam_unix(sshd:session): session closed for user core Sep 9 05:00:18.203157 systemd[1]: sshd@10-10.0.0.72:22-10.0.0.1:41300.service: Deactivated successfully. Sep 9 05:00:18.204870 systemd[1]: session-11.scope: Deactivated successfully. Sep 9 05:00:18.205574 systemd-logind[1504]: Session 11 logged out. Waiting for processes to exit. Sep 9 05:00:18.207004 systemd-logind[1504]: Removed session 11. 
Sep 9 05:00:19.889903 containerd[1534]: time="2025-09-09T05:00:19.889845893Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:00:19.890522 containerd[1534]: time="2025-09-09T05:00:19.890472102Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 9 05:00:19.891436 containerd[1534]: time="2025-09-09T05:00:19.891407237Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:00:19.893708 containerd[1534]: time="2025-09-09T05:00:19.893667352Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:00:19.894826 containerd[1534]: time="2025-09-09T05:00:19.894789289Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 5.500107134s" Sep 9 05:00:19.894826 containerd[1534]: time="2025-09-09T05:00:19.894825369Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 9 05:00:19.896522 containerd[1534]: time="2025-09-09T05:00:19.896223391Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 05:00:19.897268 containerd[1534]: time="2025-09-09T05:00:19.897129285Z" level=info msg="CreateContainer within sandbox 
\"6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:00:19.909796 containerd[1534]: time="2025-09-09T05:00:19.909747240Z" level=info msg="Container 541b0fe853426d5e75e0cfc83312ead698ea6eba073d43bde576dff2a425676c: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:00:19.916247 containerd[1534]: time="2025-09-09T05:00:19.916214540Z" level=info msg="CreateContainer within sandbox \"6052473df235c22e48060907f19b590e9b67e6dc1756602ecc3d394ddc2034a0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"541b0fe853426d5e75e0cfc83312ead698ea6eba073d43bde576dff2a425676c\"" Sep 9 05:00:19.916745 containerd[1534]: time="2025-09-09T05:00:19.916722707Z" level=info msg="StartContainer for \"541b0fe853426d5e75e0cfc83312ead698ea6eba073d43bde576dff2a425676c\"" Sep 9 05:00:19.917743 containerd[1534]: time="2025-09-09T05:00:19.917717683Z" level=info msg="connecting to shim 541b0fe853426d5e75e0cfc83312ead698ea6eba073d43bde576dff2a425676c" address="unix:///run/containerd/s/09c3a8b19afa4d7b2d10623edd38a8cb23456be9e83c5053f28d97e960fc992c" protocol=ttrpc version=3 Sep 9 05:00:19.937669 systemd[1]: Started cri-containerd-541b0fe853426d5e75e0cfc83312ead698ea6eba073d43bde576dff2a425676c.scope - libcontainer container 541b0fe853426d5e75e0cfc83312ead698ea6eba073d43bde576dff2a425676c. 
Sep 9 05:00:19.972348 containerd[1534]: time="2025-09-09T05:00:19.972312045Z" level=info msg="StartContainer for \"541b0fe853426d5e75e0cfc83312ead698ea6eba073d43bde576dff2a425676c\" returns successfully" Sep 9 05:00:20.768503 kubelet[2665]: I0909 05:00:20.768434 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6d964bcddb-t475x" podStartSLOduration=28.620083821 podStartE2EDuration="45.768420722s" podCreationTimestamp="2025-09-09 04:59:35 +0000 UTC" firstStartedPulling="2025-09-09 05:00:02.747196439 +0000 UTC m=+42.327264572" lastFinishedPulling="2025-09-09 05:00:19.89553334 +0000 UTC m=+59.475601473" observedRunningTime="2025-09-09 05:00:20.767932194 +0000 UTC m=+60.348000327" watchObservedRunningTime="2025-09-09 05:00:20.768420722 +0000 UTC m=+60.348488855" Sep 9 05:00:23.225650 systemd[1]: Started sshd@11-10.0.0.72:22-10.0.0.1:51592.service - OpenSSH per-connection server daemon (10.0.0.1:51592). Sep 9 05:00:23.311246 sshd[5120]: Accepted publickey for core from 10.0.0.1 port 51592 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 05:00:23.313950 sshd-session[5120]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:00:23.322897 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3919519210.mount: Deactivated successfully. Sep 9 05:00:23.327144 systemd-logind[1504]: New session 12 of user core. Sep 9 05:00:23.331640 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 9 05:00:23.604119 sshd[5124]: Connection closed by 10.0.0.1 port 51592 Sep 9 05:00:23.605701 sshd-session[5120]: pam_unix(sshd:session): session closed for user core Sep 9 05:00:23.614627 systemd[1]: sshd@11-10.0.0.72:22-10.0.0.1:51592.service: Deactivated successfully. Sep 9 05:00:23.617937 systemd[1]: session-12.scope: Deactivated successfully. Sep 9 05:00:23.618879 systemd-logind[1504]: Session 12 logged out. Waiting for processes to exit. 
Sep 9 05:00:23.621138 systemd[1]: Started sshd@12-10.0.0.72:22-10.0.0.1:51606.service - OpenSSH per-connection server daemon (10.0.0.1:51606). Sep 9 05:00:23.622466 systemd-logind[1504]: Removed session 12. Sep 9 05:00:23.679110 sshd[5142]: Accepted publickey for core from 10.0.0.1 port 51606 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 05:00:23.680433 sshd-session[5142]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:00:23.689123 systemd-logind[1504]: New session 13 of user core. Sep 9 05:00:23.699674 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 9 05:00:23.866424 containerd[1534]: time="2025-09-09T05:00:23.866260993Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:00:23.868766 containerd[1534]: time="2025-09-09T05:00:23.868690507Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 9 05:00:23.873616 containerd[1534]: time="2025-09-09T05:00:23.873466974Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:00:23.876417 containerd[1534]: time="2025-09-09T05:00:23.875809647Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:00:23.877110 containerd[1534]: time="2025-09-09T05:00:23.876947262Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.980422106s" Sep 9 05:00:23.877110 containerd[1534]: time="2025-09-09T05:00:23.876982583Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 9 05:00:23.878369 containerd[1534]: time="2025-09-09T05:00:23.878338962Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:00:23.878861 containerd[1534]: time="2025-09-09T05:00:23.878813769Z" level=info msg="CreateContainer within sandbox \"5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 05:00:23.886524 containerd[1534]: time="2025-09-09T05:00:23.886242193Z" level=info msg="Container 50cc9cf02ab02e76bf316f145aa210974849b1895b9bc05b70ea9754e559bca2: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:00:23.896108 containerd[1534]: time="2025-09-09T05:00:23.896070690Z" level=info msg="CreateContainer within sandbox \"5feaba9ebed2b8eabc36e8c08fbd0b888eb68392ff636d0b8af4816507a8e15c\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"50cc9cf02ab02e76bf316f145aa210974849b1895b9bc05b70ea9754e559bca2\"" Sep 9 05:00:23.896712 containerd[1534]: time="2025-09-09T05:00:23.896686339Z" level=info msg="StartContainer for \"50cc9cf02ab02e76bf316f145aa210974849b1895b9bc05b70ea9754e559bca2\"" Sep 9 05:00:23.898565 containerd[1534]: time="2025-09-09T05:00:23.898470724Z" level=info msg="connecting to shim 50cc9cf02ab02e76bf316f145aa210974849b1895b9bc05b70ea9754e559bca2" address="unix:///run/containerd/s/d098e09d0d4717e38b9d77ea55642a63ca27858fb483f4751467beeb0a466e0e" protocol=ttrpc version=3 Sep 9 05:00:23.923931 systemd[1]: Started cri-containerd-50cc9cf02ab02e76bf316f145aa210974849b1895b9bc05b70ea9754e559bca2.scope - libcontainer container 
50cc9cf02ab02e76bf316f145aa210974849b1895b9bc05b70ea9754e559bca2. Sep 9 05:00:23.962467 containerd[1534]: time="2025-09-09T05:00:23.962364178Z" level=info msg="StartContainer for \"50cc9cf02ab02e76bf316f145aa210974849b1895b9bc05b70ea9754e559bca2\" returns successfully" Sep 9 05:00:24.039881 sshd[5145]: Connection closed by 10.0.0.1 port 51606 Sep 9 05:00:24.041305 sshd-session[5142]: pam_unix(sshd:session): session closed for user core Sep 9 05:00:24.053088 systemd[1]: sshd@12-10.0.0.72:22-10.0.0.1:51606.service: Deactivated successfully. Sep 9 05:00:24.055438 systemd[1]: session-13.scope: Deactivated successfully. Sep 9 05:00:24.058498 systemd-logind[1504]: Session 13 logged out. Waiting for processes to exit. Sep 9 05:00:24.063339 systemd[1]: Started sshd@13-10.0.0.72:22-10.0.0.1:51616.service - OpenSSH per-connection server daemon (10.0.0.1:51616). Sep 9 05:00:24.065471 systemd-logind[1504]: Removed session 13. Sep 9 05:00:24.119509 sshd[5196]: Accepted publickey for core from 10.0.0.1 port 51616 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 05:00:24.121016 sshd-session[5196]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:00:24.124830 systemd-logind[1504]: New session 14 of user core. Sep 9 05:00:24.139695 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 9 05:00:24.277315 sshd[5199]: Connection closed by 10.0.0.1 port 51616 Sep 9 05:00:24.277640 sshd-session[5196]: pam_unix(sshd:session): session closed for user core Sep 9 05:00:24.280540 systemd[1]: sshd@13-10.0.0.72:22-10.0.0.1:51616.service: Deactivated successfully. Sep 9 05:00:24.282599 systemd[1]: session-14.scope: Deactivated successfully. Sep 9 05:00:24.284144 systemd-logind[1504]: Session 14 logged out. Waiting for processes to exit. Sep 9 05:00:24.286692 systemd-logind[1504]: Removed session 14. 
Sep 9 05:00:24.795207 kubelet[2665]: I0909 05:00:24.795084 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-mrj5t" podStartSLOduration=24.695451541 podStartE2EDuration="44.794417715s" podCreationTimestamp="2025-09-09 04:59:40 +0000 UTC" firstStartedPulling="2025-09-09 05:00:03.778678298 +0000 UTC m=+43.358746391" lastFinishedPulling="2025-09-09 05:00:23.877644472 +0000 UTC m=+63.457712565" observedRunningTime="2025-09-09 05:00:24.794217312 +0000 UTC m=+64.374285445" watchObservedRunningTime="2025-09-09 05:00:24.794417715 +0000 UTC m=+64.374485808"
Sep 9 05:00:24.870351 containerd[1534]: time="2025-09-09T05:00:24.870279911Z" level=info msg="TaskExit event in podsandbox handler container_id:\"50cc9cf02ab02e76bf316f145aa210974849b1895b9bc05b70ea9754e559bca2\" id:\"ec516749efe52d47e4a4c816d6e361f8262694f9a3198b15846c18e565984410\" pid:5224 exit_status:1 exited_at:{seconds:1757394024 nanos:869587101}"
Sep 9 05:00:25.359784 containerd[1534]: time="2025-09-09T05:00:25.359746441Z" level=info msg="TaskExit event in podsandbox handler container_id:\"50cc9cf02ab02e76bf316f145aa210974849b1895b9bc05b70ea9754e559bca2\" id:\"0a406f3aaeace7684107d4985017768709227736fbc6300bbc9f1f03c11b3982\" pid:5250 exited_at:{seconds:1757394025 nanos:359448997}"
Sep 9 05:00:25.718818 containerd[1534]: time="2025-09-09T05:00:25.717849696Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:00:25.718818 containerd[1534]: time="2025-09-09T05:00:25.718472184Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 9 05:00:25.720553 containerd[1534]: time="2025-09-09T05:00:25.720317289Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 1.841945847s"
Sep 9 05:00:25.720553 containerd[1534]: time="2025-09-09T05:00:25.720386410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 9 05:00:25.722604 containerd[1534]: time="2025-09-09T05:00:25.722567239Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 9 05:00:25.725797 containerd[1534]: time="2025-09-09T05:00:25.725674641Z" level=info msg="CreateContainer within sandbox \"5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 9 05:00:25.740468 containerd[1534]: time="2025-09-09T05:00:25.740199394Z" level=info msg="Container c40d5f2eeff267d6bbe9203dc543d4f94dbc3f40eba8002e24603e83bf1fa90c: CDI devices from CRI Config.CDIDevices: []"
Sep 9 05:00:25.751323 containerd[1534]: time="2025-09-09T05:00:25.751271102Z" level=info msg="CreateContainer within sandbox \"5ca5d583a023a4e1c6b1764c37acb1be60f3c462a9376f7efe4ef3752da2a264\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c40d5f2eeff267d6bbe9203dc543d4f94dbc3f40eba8002e24603e83bf1fa90c\""
Sep 9 05:00:25.752488 containerd[1534]: time="2025-09-09T05:00:25.752444478Z" level=info msg="StartContainer for \"c40d5f2eeff267d6bbe9203dc543d4f94dbc3f40eba8002e24603e83bf1fa90c\""
Sep 9 05:00:25.755170 containerd[1534]: time="2025-09-09T05:00:25.755131273Z" level=info msg="connecting to shim c40d5f2eeff267d6bbe9203dc543d4f94dbc3f40eba8002e24603e83bf1fa90c" address="unix:///run/containerd/s/241a2088594c75f33266251a8d7d8f51638e2bafc7a4f6829b8f7eb90546be9b" protocol=ttrpc version=3
Sep 9 05:00:25.780693 systemd[1]: Started cri-containerd-c40d5f2eeff267d6bbe9203dc543d4f94dbc3f40eba8002e24603e83bf1fa90c.scope - libcontainer container c40d5f2eeff267d6bbe9203dc543d4f94dbc3f40eba8002e24603e83bf1fa90c.
Sep 9 05:00:25.857887 containerd[1534]: time="2025-09-09T05:00:25.857848683Z" level=info msg="TaskExit event in podsandbox handler container_id:\"50cc9cf02ab02e76bf316f145aa210974849b1895b9bc05b70ea9754e559bca2\" id:\"7968184d42802f097fc4bf0bd69f0e0578da4e03d8c46c1ea6888248abb344fd\" pid:5290 exit_status:1 exited_at:{seconds:1757394025 nanos:857561159}"
Sep 9 05:00:25.904233 containerd[1534]: time="2025-09-09T05:00:25.899569960Z" level=info msg="StartContainer for \"c40d5f2eeff267d6bbe9203dc543d4f94dbc3f40eba8002e24603e83bf1fa90c\" returns successfully"
Sep 9 05:00:26.795857 kubelet[2665]: I0909 05:00:26.795772 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6d964bcddb-nf8n5" podStartSLOduration=29.931393977 podStartE2EDuration="51.795757303s" podCreationTimestamp="2025-09-09 04:59:35 +0000 UTC" firstStartedPulling="2025-09-09 05:00:03.857329101 +0000 UTC m=+43.437397234" lastFinishedPulling="2025-09-09 05:00:25.721692427 +0000 UTC m=+65.301760560" observedRunningTime="2025-09-09 05:00:26.795253096 +0000 UTC m=+66.375321229" watchObservedRunningTime="2025-09-09 05:00:26.795757303 +0000 UTC m=+66.375825516"
Sep 9 05:00:27.783870 kubelet[2665]: I0909 05:00:27.783822 2665 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 05:00:29.294635 systemd[1]: Started sshd@14-10.0.0.72:22-10.0.0.1:51624.service - OpenSSH per-connection server daemon (10.0.0.1:51624).
Sep 9 05:00:29.391993 sshd[5330]: Accepted publickey for core from 10.0.0.1 port 51624 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:00:29.394310 sshd-session[5330]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:00:29.398773 systemd-logind[1504]: New session 15 of user core.
Sep 9 05:00:29.410661 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 9 05:00:29.556420 containerd[1534]: time="2025-09-09T05:00:29.556298108Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:00:29.557910 containerd[1534]: time="2025-09-09T05:00:29.557877248Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208"
Sep 9 05:00:29.558895 containerd[1534]: time="2025-09-09T05:00:29.558863180Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:00:29.563225 containerd[1534]: time="2025-09-09T05:00:29.563025710Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:00:29.563987 containerd[1534]: time="2025-09-09T05:00:29.563954121Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 3.841349762s"
Sep 9 05:00:29.564039 containerd[1534]: time="2025-09-09T05:00:29.563989042Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\""
Sep 9 05:00:29.566707 containerd[1534]: time="2025-09-09T05:00:29.566673714Z" level=info msg="CreateContainer within sandbox \"cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 9 05:00:29.575589 containerd[1534]: time="2025-09-09T05:00:29.574916895Z" level=info msg="Container c6c2a8f8798896540128de9b5c8726b7f3339e96b1a0a09a9ab765d03548e589: CDI devices from CRI Config.CDIDevices: []"
Sep 9 05:00:29.586690 containerd[1534]: time="2025-09-09T05:00:29.586649037Z" level=info msg="CreateContainer within sandbox \"cffe12506bf10e41f59e215275479b5f4ab5b8cdbd9c02e3ecbbbdfb4734f19d\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"c6c2a8f8798896540128de9b5c8726b7f3339e96b1a0a09a9ab765d03548e589\""
Sep 9 05:00:29.587438 containerd[1534]: time="2025-09-09T05:00:29.587413646Z" level=info msg="StartContainer for \"c6c2a8f8798896540128de9b5c8726b7f3339e96b1a0a09a9ab765d03548e589\""
Sep 9 05:00:29.588784 containerd[1534]: time="2025-09-09T05:00:29.588761143Z" level=info msg="connecting to shim c6c2a8f8798896540128de9b5c8726b7f3339e96b1a0a09a9ab765d03548e589" address="unix:///run/containerd/s/6414b1417dd3dea7899285d19b8b9333b1908dfc68029b37338043fd2896d3d6" protocol=ttrpc version=3
Sep 9 05:00:29.611673 systemd[1]: Started cri-containerd-c6c2a8f8798896540128de9b5c8726b7f3339e96b1a0a09a9ab765d03548e589.scope - libcontainer container c6c2a8f8798896540128de9b5c8726b7f3339e96b1a0a09a9ab765d03548e589.
Sep 9 05:00:29.626642 sshd[5337]: Connection closed by 10.0.0.1 port 51624
Sep 9 05:00:29.628207 sshd-session[5330]: pam_unix(sshd:session): session closed for user core
Sep 9 05:00:29.632017 systemd[1]: sshd@14-10.0.0.72:22-10.0.0.1:51624.service: Deactivated successfully.
Sep 9 05:00:29.634145 systemd[1]: session-15.scope: Deactivated successfully.
Sep 9 05:00:29.635172 systemd-logind[1504]: Session 15 logged out. Waiting for processes to exit.
Sep 9 05:00:29.636786 systemd-logind[1504]: Removed session 15.
Sep 9 05:00:29.663063 containerd[1534]: time="2025-09-09T05:00:29.663025405Z" level=info msg="StartContainer for \"c6c2a8f8798896540128de9b5c8726b7f3339e96b1a0a09a9ab765d03548e589\" returns successfully"
Sep 9 05:00:30.603830 kubelet[2665]: I0909 05:00:30.603775 2665 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 9 05:00:30.606699 kubelet[2665]: I0909 05:00:30.606672 2665 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 9 05:00:34.643187 systemd[1]: Started sshd@15-10.0.0.72:22-10.0.0.1:51292.service - OpenSSH per-connection server daemon (10.0.0.1:51292).
Sep 9 05:00:34.708440 sshd[5395]: Accepted publickey for core from 10.0.0.1 port 51292 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:00:34.710030 sshd-session[5395]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:00:34.714884 systemd-logind[1504]: New session 16 of user core.
Sep 9 05:00:34.724682 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 9 05:00:34.881805 sshd[5398]: Connection closed by 10.0.0.1 port 51292
Sep 9 05:00:34.882354 sshd-session[5395]: pam_unix(sshd:session): session closed for user core
Sep 9 05:00:34.885939 systemd[1]: sshd@15-10.0.0.72:22-10.0.0.1:51292.service: Deactivated successfully.
Sep 9 05:00:34.887996 systemd[1]: session-16.scope: Deactivated successfully.
Sep 9 05:00:34.889378 systemd-logind[1504]: Session 16 logged out. Waiting for processes to exit.
Sep 9 05:00:34.890602 systemd-logind[1504]: Removed session 16.
Sep 9 05:00:35.910140 containerd[1534]: time="2025-09-09T05:00:35.909953125Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a98426f0b803ac849954f1098257f55b5d40f72d8c6b8b2a613d478620c8108\" id:\"1a0dd317e680905e8eb60e2dfaa3e9b3fdd48b9d902cd247bd63392dbd0cb969\" pid:5422 exited_at:{seconds:1757394035 nanos:909604122}"
Sep 9 05:00:39.051540 containerd[1534]: time="2025-09-09T05:00:39.051479662Z" level=info msg="TaskExit event in podsandbox handler container_id:\"50cc9cf02ab02e76bf316f145aa210974849b1895b9bc05b70ea9754e559bca2\" id:\"1ab82a156728bc1cf0952885359c8d4f0a14bc7343b6056b2282bf49f5359b19\" pid:5444 exited_at:{seconds:1757394039 nanos:50925056}"
Sep 9 05:00:39.066307 kubelet[2665]: I0909 05:00:39.066118 2665 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-kgwfz" podStartSLOduration=30.395713315 podStartE2EDuration="59.066100324s" podCreationTimestamp="2025-09-09 04:59:40 +0000 UTC" firstStartedPulling="2025-09-09 05:00:00.894512244 +0000 UTC m=+40.474580377" lastFinishedPulling="2025-09-09 05:00:29.564899253 +0000 UTC m=+69.144967386" observedRunningTime="2025-09-09 05:00:29.809389622 +0000 UTC m=+69.389457795" watchObservedRunningTime="2025-09-09 05:00:39.066100324 +0000 UTC m=+78.646168457"
Sep 9 05:00:39.895234 systemd[1]: Started sshd@16-10.0.0.72:22-10.0.0.1:51298.service - OpenSSH per-connection server daemon (10.0.0.1:51298).
Sep 9 05:00:40.007607 sshd[5458]: Accepted publickey for core from 10.0.0.1 port 51298 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:00:40.008842 sshd-session[5458]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:00:40.013375 systemd-logind[1504]: New session 17 of user core.
Sep 9 05:00:40.020627 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 9 05:00:40.156702 sshd[5461]: Connection closed by 10.0.0.1 port 51298
Sep 9 05:00:40.156784 sshd-session[5458]: pam_unix(sshd:session): session closed for user core
Sep 9 05:00:40.160414 systemd[1]: sshd@16-10.0.0.72:22-10.0.0.1:51298.service: Deactivated successfully.
Sep 9 05:00:40.164081 systemd[1]: session-17.scope: Deactivated successfully.
Sep 9 05:00:40.164985 systemd-logind[1504]: Session 17 logged out. Waiting for processes to exit.
Sep 9 05:00:40.166454 systemd-logind[1504]: Removed session 17.
Sep 9 05:00:45.180051 systemd[1]: Started sshd@17-10.0.0.72:22-10.0.0.1:55410.service - OpenSSH per-connection server daemon (10.0.0.1:55410).
Sep 9 05:00:45.253911 sshd[5476]: Accepted publickey for core from 10.0.0.1 port 55410 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:00:45.256406 sshd-session[5476]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:00:45.261268 systemd-logind[1504]: New session 18 of user core.
Sep 9 05:00:45.277689 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 05:00:45.453110 sshd[5479]: Connection closed by 10.0.0.1 port 55410
Sep 9 05:00:45.453626 sshd-session[5476]: pam_unix(sshd:session): session closed for user core
Sep 9 05:00:45.458791 systemd-logind[1504]: Session 18 logged out. Waiting for processes to exit.
Sep 9 05:00:45.458937 systemd[1]: sshd@17-10.0.0.72:22-10.0.0.1:55410.service: Deactivated successfully.
Sep 9 05:00:45.460873 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 05:00:45.462989 systemd-logind[1504]: Removed session 18.
Sep 9 05:00:47.306260 containerd[1534]: time="2025-09-09T05:00:47.306219958Z" level=info msg="TaskExit event in podsandbox handler container_id:\"49334378f04ee527e9489c73e02e299bd58bd59b1ce3e8d7dc79713673dc7f96\" id:\"8682f1183e15dd7e152216e452fbeb088ef6635674b38f31b7da69a685027f9e\" pid:5503 exited_at:{seconds:1757394047 nanos:305822275}"
Sep 9 05:00:48.343838 kubelet[2665]: I0909 05:00:48.343777 2665 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 05:00:50.469466 systemd[1]: Started sshd@18-10.0.0.72:22-10.0.0.1:49180.service - OpenSSH per-connection server daemon (10.0.0.1:49180).
Sep 9 05:00:50.545310 sshd[5519]: Accepted publickey for core from 10.0.0.1 port 49180 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:00:50.547691 sshd-session[5519]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:00:50.557695 systemd-logind[1504]: New session 19 of user core.
Sep 9 05:00:50.563664 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 05:00:50.762490 sshd[5522]: Connection closed by 10.0.0.1 port 49180
Sep 9 05:00:50.763914 sshd-session[5519]: pam_unix(sshd:session): session closed for user core
Sep 9 05:00:50.775956 systemd[1]: sshd@18-10.0.0.72:22-10.0.0.1:49180.service: Deactivated successfully.
Sep 9 05:00:50.778001 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 05:00:50.779063 systemd-logind[1504]: Session 19 logged out. Waiting for processes to exit.
Sep 9 05:00:50.783337 systemd[1]: Started sshd@19-10.0.0.72:22-10.0.0.1:49194.service - OpenSSH per-connection server daemon (10.0.0.1:49194).
Sep 9 05:00:50.784602 systemd-logind[1504]: Removed session 19.
Sep 9 05:00:50.856061 sshd[5536]: Accepted publickey for core from 10.0.0.1 port 49194 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:00:50.856930 sshd-session[5536]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:00:50.862980 systemd-logind[1504]: New session 20 of user core.
Sep 9 05:00:50.871700 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 05:00:51.115718 sshd[5539]: Connection closed by 10.0.0.1 port 49194
Sep 9 05:00:51.116501 sshd-session[5536]: pam_unix(sshd:session): session closed for user core
Sep 9 05:00:51.126167 systemd[1]: sshd@19-10.0.0.72:22-10.0.0.1:49194.service: Deactivated successfully.
Sep 9 05:00:51.128451 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 05:00:51.129357 systemd-logind[1504]: Session 20 logged out. Waiting for processes to exit.
Sep 9 05:00:51.133164 systemd[1]: Started sshd@20-10.0.0.72:22-10.0.0.1:49208.service - OpenSSH per-connection server daemon (10.0.0.1:49208).
Sep 9 05:00:51.134594 systemd-logind[1504]: Removed session 20.
Sep 9 05:00:51.195406 sshd[5551]: Accepted publickey for core from 10.0.0.1 port 49208 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:00:51.196918 sshd-session[5551]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:00:51.201686 systemd-logind[1504]: New session 21 of user core.
Sep 9 05:00:51.214839 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 9 05:00:53.007796 sshd[5554]: Connection closed by 10.0.0.1 port 49208
Sep 9 05:00:53.007231 sshd-session[5551]: pam_unix(sshd:session): session closed for user core
Sep 9 05:00:53.017442 systemd[1]: sshd@20-10.0.0.72:22-10.0.0.1:49208.service: Deactivated successfully.
Sep 9 05:00:53.023381 systemd[1]: session-21.scope: Deactivated successfully.
Sep 9 05:00:53.024723 systemd[1]: session-21.scope: Consumed 557ms CPU time, 76M memory peak.
Sep 9 05:00:53.026964 systemd-logind[1504]: Session 21 logged out. Waiting for processes to exit.
Sep 9 05:00:53.031730 systemd-logind[1504]: Removed session 21.
Sep 9 05:00:53.036912 systemd[1]: Started sshd@21-10.0.0.72:22-10.0.0.1:49214.service - OpenSSH per-connection server daemon (10.0.0.1:49214).
Sep 9 05:00:53.104738 sshd[5571]: Accepted publickey for core from 10.0.0.1 port 49214 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:00:53.106554 sshd-session[5571]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:00:53.111767 systemd-logind[1504]: New session 22 of user core.
Sep 9 05:00:53.116658 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 9 05:00:53.586897 sshd[5576]: Connection closed by 10.0.0.1 port 49214
Sep 9 05:00:53.587347 sshd-session[5571]: pam_unix(sshd:session): session closed for user core
Sep 9 05:00:53.602704 systemd[1]: sshd@21-10.0.0.72:22-10.0.0.1:49214.service: Deactivated successfully.
Sep 9 05:00:53.605127 systemd[1]: session-22.scope: Deactivated successfully.
Sep 9 05:00:53.606013 systemd-logind[1504]: Session 22 logged out. Waiting for processes to exit.
Sep 9 05:00:53.608376 systemd[1]: Started sshd@22-10.0.0.72:22-10.0.0.1:49220.service - OpenSSH per-connection server daemon (10.0.0.1:49220).
Sep 9 05:00:53.609879 systemd-logind[1504]: Removed session 22.
Sep 9 05:00:53.675075 sshd[5587]: Accepted publickey for core from 10.0.0.1 port 49220 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:00:53.676422 sshd-session[5587]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:00:53.681568 systemd-logind[1504]: New session 23 of user core.
Sep 9 05:00:53.695720 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 9 05:00:53.851530 sshd[5590]: Connection closed by 10.0.0.1 port 49220
Sep 9 05:00:53.851764 sshd-session[5587]: pam_unix(sshd:session): session closed for user core
Sep 9 05:00:53.855937 systemd[1]: sshd@22-10.0.0.72:22-10.0.0.1:49220.service: Deactivated successfully.
Sep 9 05:00:53.858749 systemd[1]: session-23.scope: Deactivated successfully.
Sep 9 05:00:53.859681 systemd-logind[1504]: Session 23 logged out. Waiting for processes to exit.
Sep 9 05:00:53.860646 systemd-logind[1504]: Removed session 23.
Sep 9 05:00:57.896245 containerd[1534]: time="2025-09-09T05:00:57.896075782Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a98426f0b803ac849954f1098257f55b5d40f72d8c6b8b2a613d478620c8108\" id:\"04b426ff274a7917ff594f97fb997fe5bcb1aab2a1db24622ad3156fa1a839c1\" pid:5617 exited_at:{seconds:1757394057 nanos:895688939}"
Sep 9 05:00:58.863064 systemd[1]: Started sshd@23-10.0.0.72:22-10.0.0.1:49226.service - OpenSSH per-connection server daemon (10.0.0.1:49226).
Sep 9 05:00:58.916166 sshd[5632]: Accepted publickey for core from 10.0.0.1 port 49226 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:00:58.917580 sshd-session[5632]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:00:58.922671 systemd-logind[1504]: New session 24 of user core.
Sep 9 05:00:58.931645 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 9 05:00:59.072813 sshd[5635]: Connection closed by 10.0.0.1 port 49226
Sep 9 05:00:59.073155 sshd-session[5632]: pam_unix(sshd:session): session closed for user core
Sep 9 05:00:59.076330 systemd[1]: sshd@23-10.0.0.72:22-10.0.0.1:49226.service: Deactivated successfully.
Sep 9 05:00:59.078076 systemd[1]: session-24.scope: Deactivated successfully.
Sep 9 05:00:59.080105 systemd-logind[1504]: Session 24 logged out. Waiting for processes to exit.
Sep 9 05:00:59.082536 systemd-logind[1504]: Removed session 24.
Sep 9 05:01:04.090762 systemd[1]: Started sshd@24-10.0.0.72:22-10.0.0.1:41670.service - OpenSSH per-connection server daemon (10.0.0.1:41670).
Sep 9 05:01:04.169072 sshd[5648]: Accepted publickey for core from 10.0.0.1 port 41670 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:01:04.170639 sshd-session[5648]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:01:04.175312 systemd-logind[1504]: New session 25 of user core.
Sep 9 05:01:04.180679 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 9 05:01:04.298262 sshd[5651]: Connection closed by 10.0.0.1 port 41670
Sep 9 05:01:04.298631 sshd-session[5648]: pam_unix(sshd:session): session closed for user core
Sep 9 05:01:04.304225 systemd[1]: sshd@24-10.0.0.72:22-10.0.0.1:41670.service: Deactivated successfully.
Sep 9 05:01:04.306016 systemd[1]: session-25.scope: Deactivated successfully.
Sep 9 05:01:04.308716 systemd-logind[1504]: Session 25 logged out. Waiting for processes to exit.
Sep 9 05:01:04.309645 systemd-logind[1504]: Removed session 25.
Sep 9 05:01:05.939137 containerd[1534]: time="2025-09-09T05:01:05.939039471Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a98426f0b803ac849954f1098257f55b5d40f72d8c6b8b2a613d478620c8108\" id:\"8da715535f92c6dc34b349453f9cc1cc59f3f5a0d68975450fd7ad8b025ed0a9\" pid:5675 exited_at:{seconds:1757394065 nanos:938781229}"
Sep 9 05:01:09.043243 containerd[1534]: time="2025-09-09T05:01:09.043195469Z" level=info msg="TaskExit event in podsandbox handler container_id:\"50cc9cf02ab02e76bf316f145aa210974849b1895b9bc05b70ea9754e559bca2\" id:\"4d5609f4c25d675011b0fa33f6e8e69b2fb58d1b6bf0a73bde98a14e28a23d2e\" pid:5698 exited_at:{seconds:1757394069 nanos:42901508}"
Sep 9 05:01:09.312725 systemd[1]: Started sshd@25-10.0.0.72:22-10.0.0.1:41684.service - OpenSSH per-connection server daemon (10.0.0.1:41684).
Sep 9 05:01:09.363506 sshd[5709]: Accepted publickey for core from 10.0.0.1 port 41684 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:01:09.363976 sshd-session[5709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:01:09.368914 systemd-logind[1504]: New session 26 of user core.
Sep 9 05:01:09.378637 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 9 05:01:09.502614 sshd[5712]: Connection closed by 10.0.0.1 port 41684
Sep 9 05:01:09.502960 sshd-session[5709]: pam_unix(sshd:session): session closed for user core
Sep 9 05:01:09.509222 systemd[1]: sshd@25-10.0.0.72:22-10.0.0.1:41684.service: Deactivated successfully.
Sep 9 05:01:09.511431 systemd[1]: session-26.scope: Deactivated successfully.
Sep 9 05:01:09.512324 systemd-logind[1504]: Session 26 logged out. Waiting for processes to exit.
Sep 9 05:01:09.513768 systemd-logind[1504]: Removed session 26.
Sep 9 05:01:14.515418 systemd[1]: Started sshd@26-10.0.0.72:22-10.0.0.1:53566.service - OpenSSH per-connection server daemon (10.0.0.1:53566).
Sep 9 05:01:14.570160 sshd[5732]: Accepted publickey for core from 10.0.0.1 port 53566 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 05:01:14.572209 sshd-session[5732]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:01:14.576321 systemd-logind[1504]: New session 27 of user core.
Sep 9 05:01:14.583650 systemd[1]: Started session-27.scope - Session 27 of User core.
Sep 9 05:01:14.711500 sshd[5735]: Connection closed by 10.0.0.1 port 53566
Sep 9 05:01:14.711423 sshd-session[5732]: pam_unix(sshd:session): session closed for user core
Sep 9 05:01:14.714866 systemd-logind[1504]: Session 27 logged out. Waiting for processes to exit.
Sep 9 05:01:14.715310 systemd[1]: sshd@26-10.0.0.72:22-10.0.0.1:53566.service: Deactivated successfully.
Sep 9 05:01:14.717017 systemd[1]: session-27.scope: Deactivated successfully.
Sep 9 05:01:14.719704 systemd-logind[1504]: Removed session 27.