Sep 9 05:02:14.762868 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Sep 9 05:02:14.762889 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 9 03:38:34 -00 2025 Sep 9 05:02:14.762899 kernel: KASLR enabled Sep 9 05:02:14.762905 kernel: efi: EFI v2.7 by EDK II Sep 9 05:02:14.762910 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb832018 ACPI 2.0=0xdbfd0018 RNG=0xdbfd0a18 MEMRESERVE=0xdb838218 Sep 9 05:02:14.762919 kernel: random: crng init done Sep 9 05:02:14.762926 kernel: secureboot: Secure boot disabled Sep 9 05:02:14.762932 kernel: ACPI: Early table checksum verification disabled Sep 9 05:02:14.762938 kernel: ACPI: RSDP 0x00000000DBFD0018 000024 (v02 BOCHS ) Sep 9 05:02:14.762945 kernel: ACPI: XSDT 0x00000000DBFD0F18 000064 (v01 BOCHS BXPC 00000001 01000013) Sep 9 05:02:14.762951 kernel: ACPI: FACP 0x00000000DBFD0B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:02:14.762957 kernel: ACPI: DSDT 0x00000000DBF0E018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:02:14.762962 kernel: ACPI: APIC 0x00000000DBFD0C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:02:14.762968 kernel: ACPI: PPTT 0x00000000DBFD0098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:02:14.762975 kernel: ACPI: GTDT 0x00000000DBFD0818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:02:14.762982 kernel: ACPI: MCFG 0x00000000DBFD0A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:02:14.762989 kernel: ACPI: SPCR 0x00000000DBFD0918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:02:14.762995 kernel: ACPI: DBG2 0x00000000DBFD0998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:02:14.763003 kernel: ACPI: IORT 0x00000000DBFD0198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Sep 9 05:02:14.763009 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600 Sep 9 05:02:14.763015 kernel: ACPI: Use ACPI SPCR as default console: No Sep 9 05:02:14.763021 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff] Sep 9 05:02:14.763027 kernel: NODE_DATA(0) allocated [mem 0xdc965a00-0xdc96cfff] Sep 9 05:02:14.763033 kernel: Zone ranges: Sep 9 05:02:14.763039 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff] Sep 9 05:02:14.763046 kernel: DMA32 empty Sep 9 05:02:14.763052 kernel: Normal empty Sep 9 05:02:14.763058 kernel: Device empty Sep 9 05:02:14.763063 kernel: Movable zone start for each node Sep 9 05:02:14.763069 kernel: Early memory node ranges Sep 9 05:02:14.763075 kernel: node 0: [mem 0x0000000040000000-0x00000000db81ffff] Sep 9 05:02:14.763081 kernel: node 0: [mem 0x00000000db820000-0x00000000db82ffff] Sep 9 05:02:14.763089 kernel: node 0: [mem 0x00000000db830000-0x00000000dc09ffff] Sep 9 05:02:14.763095 kernel: node 0: [mem 0x00000000dc0a0000-0x00000000dc2dffff] Sep 9 05:02:14.763101 kernel: node 0: [mem 0x00000000dc2e0000-0x00000000dc36ffff] Sep 9 05:02:14.763107 kernel: node 0: [mem 0x00000000dc370000-0x00000000dc45ffff] Sep 9 05:02:14.763113 kernel: node 0: [mem 0x00000000dc460000-0x00000000dc52ffff] Sep 9 05:02:14.763121 kernel: node 0: [mem 0x00000000dc530000-0x00000000dc5cffff] Sep 9 05:02:14.763127 kernel: node 0: [mem 0x00000000dc5d0000-0x00000000dce1ffff] Sep 9 05:02:14.763133 kernel: node 0: [mem 0x00000000dce20000-0x00000000dceaffff] Sep 9 05:02:14.763141 kernel: node 0: [mem 0x00000000dceb0000-0x00000000dcebffff] Sep 9 05:02:14.763147 kernel: node 0: [mem 
0x00000000dcec0000-0x00000000dcfdffff] Sep 9 05:02:14.763154 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff] Sep 9 05:02:14.763162 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff] Sep 9 05:02:14.763170 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges Sep 9 05:02:14.763177 kernel: cma: Reserved 16 MiB at 0x00000000d8000000 on node -1 Sep 9 05:02:14.763183 kernel: psci: probing for conduit method from ACPI. Sep 9 05:02:14.763189 kernel: psci: PSCIv1.1 detected in firmware. Sep 9 05:02:14.763217 kernel: psci: Using standard PSCI v0.2 function IDs Sep 9 05:02:14.763224 kernel: psci: Trusted OS migration not required Sep 9 05:02:14.763230 kernel: psci: SMC Calling Convention v1.1 Sep 9 05:02:14.763237 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Sep 9 05:02:14.763244 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Sep 9 05:02:14.763252 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Sep 9 05:02:14.763261 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 Sep 9 05:02:14.763268 kernel: Detected PIPT I-cache on CPU0 Sep 9 05:02:14.763274 kernel: CPU features: detected: GIC system register CPU interface Sep 9 05:02:14.763280 kernel: CPU features: detected: Spectre-v4 Sep 9 05:02:14.763287 kernel: CPU features: detected: Spectre-BHB Sep 9 05:02:14.763293 kernel: CPU features: kernel page table isolation forced ON by KASLR Sep 9 05:02:14.763299 kernel: CPU features: detected: Kernel page table isolation (KPTI) Sep 9 05:02:14.763305 kernel: CPU features: detected: ARM erratum 1418040 Sep 9 05:02:14.763312 kernel: CPU features: detected: SSBS not fully self-synchronizing Sep 9 05:02:14.763318 kernel: alternatives: applying boot alternatives Sep 9 05:02:14.763325 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90 Sep 9 05:02:14.763333 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 9 05:02:14.763342 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Sep 9 05:02:14.763348 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 9 05:02:14.763355 kernel: Fallback order for Node 0: 0 Sep 9 05:02:14.763361 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072 Sep 9 05:02:14.763367 kernel: Policy zone: DMA Sep 9 05:02:14.763373 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 9 05:02:14.763380 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB Sep 9 05:02:14.763386 kernel: software IO TLB: area num 4. Sep 9 05:02:14.763392 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB Sep 9 05:02:14.763399 kernel: software IO TLB: mapped [mem 0x00000000d7c00000-0x00000000d8000000] (4MB) Sep 9 05:02:14.763406 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Sep 9 05:02:14.763413 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 9 05:02:14.763420 kernel: rcu: RCU event tracing is enabled. Sep 9 05:02:14.763429 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Sep 9 05:02:14.763435 kernel: Trampoline variant of Tasks RCU enabled. 
Sep 9 05:02:14.763442 kernel: Tracing variant of Tasks RCU enabled. Sep 9 05:02:14.763449 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 9 05:02:14.763455 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Sep 9 05:02:14.763462 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 9 05:02:14.763469 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 9 05:02:14.763475 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Sep 9 05:02:14.763483 kernel: GICv3: 256 SPIs implemented Sep 9 05:02:14.763489 kernel: GICv3: 0 Extended SPIs implemented Sep 9 05:02:14.763495 kernel: Root IRQ handler: gic_handle_irq Sep 9 05:02:14.763502 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Sep 9 05:02:14.763510 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Sep 9 05:02:14.763517 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Sep 9 05:02:14.763523 kernel: ITS [mem 0x08080000-0x0809ffff] Sep 9 05:02:14.763530 kernel: ITS@0x0000000008080000: allocated 8192 Devices @40110000 (indirect, esz 8, psz 64K, shr 1) Sep 9 05:02:14.763537 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @40120000 (flat, esz 8, psz 64K, shr 1) Sep 9 05:02:14.763543 kernel: GICv3: using LPI property table @0x0000000040130000 Sep 9 05:02:14.763550 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040140000 Sep 9 05:02:14.763556 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 9 05:02:14.763564 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 9 05:02:14.763571 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Sep 9 05:02:14.763578 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Sep 9 05:02:14.763585 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Sep 9 05:02:14.763591 kernel: arm-pv: using stolen time PV Sep 9 05:02:14.763600 kernel: Console: colour dummy device 80x25 Sep 9 05:02:14.763607 kernel: ACPI: Core revision 20240827 Sep 9 05:02:14.763614 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Sep 9 05:02:14.763620 kernel: pid_max: default: 32768 minimum: 301 Sep 9 05:02:14.763627 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 9 05:02:14.763635 kernel: landlock: Up and running. Sep 9 05:02:14.763641 kernel: SELinux: Initializing. Sep 9 05:02:14.763648 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 9 05:02:14.763655 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 9 05:02:14.763662 kernel: rcu: Hierarchical SRCU implementation. Sep 9 05:02:14.763676 kernel: rcu: Max phase no-delay instances is 400. Sep 9 05:02:14.763685 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Sep 9 05:02:14.763692 kernel: Remapping and enabling EFI services. Sep 9 05:02:14.763699 kernel: smp: Bringing up secondary CPUs ... 
Sep 9 05:02:14.763712 kernel: Detected PIPT I-cache on CPU1 Sep 9 05:02:14.763722 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Sep 9 05:02:14.763733 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040150000 Sep 9 05:02:14.763747 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 9 05:02:14.763756 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Sep 9 05:02:14.763768 kernel: Detected PIPT I-cache on CPU2 Sep 9 05:02:14.763776 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Sep 9 05:02:14.763783 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040160000 Sep 9 05:02:14.763792 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 9 05:02:14.763799 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Sep 9 05:02:14.763806 kernel: Detected PIPT I-cache on CPU3 Sep 9 05:02:14.763813 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Sep 9 05:02:14.763820 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040170000 Sep 9 05:02:14.763827 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 9 05:02:14.763834 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Sep 9 05:02:14.763841 kernel: smp: Brought up 1 node, 4 CPUs Sep 9 05:02:14.763848 kernel: SMP: Total of 4 processors activated. Sep 9 05:02:14.763859 kernel: CPU: All CPU(s) started at EL1 Sep 9 05:02:14.763866 kernel: CPU features: detected: 32-bit EL0 Support Sep 9 05:02:14.763873 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Sep 9 05:02:14.763880 kernel: CPU features: detected: Common not Private translations Sep 9 05:02:14.763887 kernel: CPU features: detected: CRC32 instructions Sep 9 05:02:14.763894 kernel: CPU features: detected: Enhanced Virtualization Traps Sep 9 05:02:14.763901 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Sep 9 05:02:14.763908 kernel: CPU features: detected: LSE atomic instructions Sep 9 05:02:14.763915 kernel: CPU features: detected: Privileged Access Never Sep 9 05:02:14.763922 kernel: CPU features: detected: RAS Extension Support Sep 9 05:02:14.763930 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Sep 9 05:02:14.763939 kernel: alternatives: applying system-wide alternatives Sep 9 05:02:14.763947 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3 Sep 9 05:02:14.763954 kernel: Memory: 2424480K/2572288K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38976K init, 1038K bss, 125472K reserved, 16384K cma-reserved) Sep 9 05:02:14.763961 kernel: devtmpfs: initialized Sep 9 05:02:14.763968 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 9 05:02:14.763975 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Sep 9 05:02:14.763984 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Sep 9 05:02:14.763993 kernel: 0 pages in range for non-PLT usage Sep 9 05:02:14.763999 kernel: 508560 pages in range for PLT usage Sep 9 05:02:14.764006 kernel: pinctrl core: initialized pinctrl subsystem Sep 9 05:02:14.764013 kernel: SMBIOS 3.0.0 present. 
Sep 9 05:02:14.764020 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022 Sep 9 05:02:14.764027 kernel: DMI: Memory slots populated: 1/1 Sep 9 05:02:14.764034 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 9 05:02:14.764041 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Sep 9 05:02:14.764048 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Sep 9 05:02:14.764056 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Sep 9 05:02:14.764063 kernel: audit: initializing netlink subsys (disabled) Sep 9 05:02:14.764072 kernel: audit: type=2000 audit(0.020:1): state=initialized audit_enabled=0 res=1 Sep 9 05:02:14.764079 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 9 05:02:14.764086 kernel: cpuidle: using governor menu Sep 9 05:02:14.764093 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Sep 9 05:02:14.764100 kernel: ASID allocator initialised with 32768 entries Sep 9 05:02:14.764107 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 9 05:02:14.764114 kernel: Serial: AMBA PL011 UART driver Sep 9 05:02:14.764122 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 9 05:02:14.764129 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Sep 9 05:02:14.764136 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Sep 9 05:02:14.764143 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Sep 9 05:02:14.764150 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 9 05:02:14.764160 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Sep 9 05:02:14.764167 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Sep 9 05:02:14.764174 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Sep 9 05:02:14.764180 kernel: ACPI: Added _OSI(Module Device) Sep 9 05:02:14.764187 kernel: ACPI: Added _OSI(Processor Device) Sep 9 05:02:14.764204 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 9 05:02:14.764211 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 9 05:02:14.764218 kernel: ACPI: Interpreter enabled Sep 9 05:02:14.764225 kernel: ACPI: Using GIC for interrupt routing Sep 9 05:02:14.764232 kernel: ACPI: MCFG table detected, 1 entries Sep 9 05:02:14.764241 kernel: ACPI: CPU0 has been hot-added Sep 9 05:02:14.764248 kernel: ACPI: CPU1 has been hot-added Sep 9 05:02:14.764255 kernel: ACPI: CPU2 has been hot-added Sep 9 05:02:14.764262 kernel: ACPI: CPU3 has been hot-added Sep 9 05:02:14.764271 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Sep 9 05:02:14.764278 kernel: printk: legacy console [ttyAMA0] enabled Sep 9 05:02:14.764285 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 9 05:02:14.764427 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 9 05:02:14.764497 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Sep 9 05:02:14.764556 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Sep 9 05:02:14.764618 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Sep 9 05:02:14.764688 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Sep 9 05:02:14.764698 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Sep 9 05:02:14.764705 kernel: PCI host bridge to bus 0000:00 Sep 9 
05:02:14.764777 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Sep 9 05:02:14.764832 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Sep 9 05:02:14.764891 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Sep 9 05:02:14.764944 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 9 05:02:14.765182 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Sep 9 05:02:14.765294 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint Sep 9 05:02:14.765362 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f] Sep 9 05:02:14.765422 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff] Sep 9 05:02:14.765484 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Sep 9 05:02:14.765547 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Sep 9 05:02:14.765606 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned Sep 9 05:02:14.765680 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned Sep 9 05:02:14.765742 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Sep 9 05:02:14.765799 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Sep 9 05:02:14.765854 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Sep 9 05:02:14.765866 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Sep 9 05:02:14.765873 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Sep 9 05:02:14.765881 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Sep 9 05:02:14.765890 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Sep 9 05:02:14.765897 kernel: iommu: Default domain type: Translated Sep 9 05:02:14.765904 kernel: iommu: DMA domain TLB invalidation policy: strict mode Sep 9 05:02:14.765911 kernel: efivars: Registered efivars operations Sep 9 05:02:14.765919 kernel: vgaarb: loaded Sep 9 05:02:14.765926 kernel: clocksource: Switched to clocksource arch_sys_counter Sep 9 05:02:14.765933 kernel: VFS: Disk quotas dquot_6.6.0 Sep 9 05:02:14.765940 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 9 05:02:14.765948 kernel: pnp: PnP ACPI init Sep 9 05:02:14.766027 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Sep 9 05:02:14.766038 kernel: pnp: PnP ACPI: found 1 devices Sep 9 05:02:14.766046 kernel: NET: Registered PF_INET protocol family Sep 9 05:02:14.766053 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 9 05:02:14.766060 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 9 05:02:14.766067 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 9 05:02:14.766074 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 9 05:02:14.766084 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 9 05:02:14.766091 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 9 05:02:14.766100 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 9 05:02:14.766107 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 9 05:02:14.766114 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 9 05:02:14.766121 kernel: PCI: CLS 0 bytes, default 64 Sep 9 05:02:14.766128 kernel: kvm [1]: HYP mode not available Sep 9 05:02:14.766135 kernel: Initialise system 
trusted keyrings Sep 9 05:02:14.766142 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 9 05:02:14.766149 kernel: Key type asymmetric registered Sep 9 05:02:14.766156 kernel: Asymmetric key parser 'x509' registered Sep 9 05:02:14.766164 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Sep 9 05:02:14.766171 kernel: io scheduler mq-deadline registered Sep 9 05:02:14.766178 kernel: io scheduler kyber registered Sep 9 05:02:14.766185 kernel: io scheduler bfq registered Sep 9 05:02:14.766206 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Sep 9 05:02:14.766213 kernel: ACPI: button: Power Button [PWRB] Sep 9 05:02:14.766223 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Sep 9 05:02:14.766288 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007) Sep 9 05:02:14.766298 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 9 05:02:14.766309 kernel: thunder_xcv, ver 1.0 Sep 9 05:02:14.766316 kernel: thunder_bgx, ver 1.0 Sep 9 05:02:14.766323 kernel: nicpf, ver 1.0 Sep 9 05:02:14.766330 kernel: nicvf, ver 1.0 Sep 9 05:02:14.766402 kernel: rtc-efi rtc-efi.0: registered as rtc0 Sep 9 05:02:14.766459 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-09T05:02:14 UTC (1757394134) Sep 9 05:02:14.766469 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 9 05:02:14.766479 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Sep 9 05:02:14.766488 kernel: watchdog: NMI not fully supported Sep 9 05:02:14.766495 kernel: watchdog: Hard watchdog permanently disabled Sep 9 05:02:14.766502 kernel: NET: Registered PF_INET6 protocol family Sep 9 05:02:14.766509 kernel: Segment Routing with IPv6 Sep 9 05:02:14.766516 kernel: In-situ OAM (IOAM) with IPv6 Sep 9 05:02:14.766523 kernel: NET: Registered PF_PACKET protocol family Sep 9 05:02:14.766530 kernel: Key type dns_resolver registered Sep 9 05:02:14.766537 kernel: registered taskstats version 1 Sep 9 05:02:14.766544 kernel: Loading compiled-in X.509 certificates Sep 9 05:02:14.766552 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 44d1e8b5c5ffbaa3cedd99c03d41580671fabec5' Sep 9 05:02:14.766561 kernel: Demotion targets for Node 0: null Sep 9 05:02:14.766568 kernel: Key type .fscrypt registered Sep 9 05:02:14.766575 kernel: Key type fscrypt-provisioning registered Sep 9 05:02:14.766582 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 9 05:02:14.766589 kernel: ima: Allocated hash algorithm: sha1 Sep 9 05:02:14.766596 kernel: ima: No architecture policies found Sep 9 05:02:14.766603 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Sep 9 05:02:14.766610 kernel: clk: Disabling unused clocks Sep 9 05:02:14.766618 kernel: PM: genpd: Disabling unused power domains Sep 9 05:02:14.766625 kernel: Warning: unable to open an initial console. Sep 9 05:02:14.766632 kernel: Freeing unused kernel memory: 38976K Sep 9 05:02:14.766639 kernel: Run /init as init process Sep 9 05:02:14.766648 kernel: with arguments: Sep 9 05:02:14.766655 kernel: /init Sep 9 05:02:14.766662 kernel: with environment: Sep 9 05:02:14.766675 kernel: HOME=/ Sep 9 05:02:14.766682 kernel: TERM=linux Sep 9 05:02:14.766692 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 9 05:02:14.766700 systemd[1]: Successfully made /usr/ read-only. 
Sep 9 05:02:14.766710 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 05:02:14.766718 systemd[1]: Detected virtualization kvm. Sep 9 05:02:14.766725 systemd[1]: Detected architecture arm64. Sep 9 05:02:14.766734 systemd[1]: Running in initrd. Sep 9 05:02:14.766742 systemd[1]: No hostname configured, using default hostname. Sep 9 05:02:14.766751 systemd[1]: Hostname set to . Sep 9 05:02:14.766758 systemd[1]: Initializing machine ID from VM UUID. Sep 9 05:02:14.766766 systemd[1]: Queued start job for default target initrd.target. Sep 9 05:02:14.766773 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 05:02:14.766781 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 05:02:14.766789 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 9 05:02:14.766797 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 05:02:14.766804 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 9 05:02:14.766816 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 9 05:02:14.766824 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 9 05:02:14.766832 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 9 05:02:14.766839 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 05:02:14.766847 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 05:02:14.766854 systemd[1]: Reached target paths.target - Path Units. Sep 9 05:02:14.766861 systemd[1]: Reached target slices.target - Slice Units. Sep 9 05:02:14.766870 systemd[1]: Reached target swap.target - Swaps. Sep 9 05:02:14.766878 systemd[1]: Reached target timers.target - Timer Units. Sep 9 05:02:14.766885 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 05:02:14.766893 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 05:02:14.766902 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 9 05:02:14.766909 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 9 05:02:14.766917 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 05:02:14.766925 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 05:02:14.766933 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 05:02:14.766941 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 05:02:14.766949 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 9 05:02:14.766956 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 05:02:14.766964 systemd[1]: Finished network-cleanup.service - Network Cleanup. 
Sep 9 05:02:14.766971 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 9 05:02:14.766979 systemd[1]: Starting systemd-fsck-usr.service... Sep 9 05:02:14.766988 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 05:02:14.766996 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 05:02:14.767005 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:02:14.767012 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 9 05:02:14.767020 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 05:02:14.767028 systemd[1]: Finished systemd-fsck-usr.service. Sep 9 05:02:14.767037 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 05:02:14.767064 systemd-journald[245]: Collecting audit messages is disabled. Sep 9 05:02:14.767086 systemd-journald[245]: Journal started Sep 9 05:02:14.767106 systemd-journald[245]: Runtime Journal (/run/log/journal/d1c4b0b26a9b4c368013162b31827925) is 6M, max 48.5M, 42.4M free. Sep 9 05:02:14.761827 systemd-modules-load[247]: Inserted module 'overlay' Sep 9 05:02:14.772935 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:02:14.776215 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 9 05:02:14.776252 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 05:02:14.777519 systemd-modules-load[247]: Inserted module 'br_netfilter' Sep 9 05:02:14.778494 kernel: Bridge firewalling registered Sep 9 05:02:14.779052 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 9 05:02:14.780834 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 05:02:14.782503 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 05:02:14.790566 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 05:02:14.793276 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 05:02:14.796032 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 05:02:14.796524 systemd-tmpfiles[264]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 9 05:02:14.806709 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 05:02:14.809745 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 05:02:14.812100 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 9 05:02:14.815393 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 05:02:14.816601 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 05:02:14.829516 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Sep 9 05:02:14.844426 dracut-cmdline[282]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90 Sep 9 05:02:14.858624 systemd-resolved[286]: Positive Trust Anchors: Sep 9 05:02:14.858644 systemd-resolved[286]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 05:02:14.858682 systemd-resolved[286]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 05:02:14.863383 systemd-resolved[286]: Defaulting to hostname 'linux'. Sep 9 05:02:14.864350 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 05:02:14.867957 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 05:02:14.926225 kernel: SCSI subsystem initialized Sep 9 05:02:14.930213 kernel: Loading iSCSI transport class v2.0-870. Sep 9 05:02:14.938238 kernel: iscsi: registered transport (tcp) Sep 9 05:02:14.950217 kernel: iscsi: registered transport (qla4xxx) Sep 9 05:02:14.950238 kernel: QLogic iSCSI HBA Driver Sep 9 05:02:14.966410 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 05:02:14.985969 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 05:02:14.988162 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 05:02:15.031713 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 9 05:02:15.033994 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 9 05:02:15.091217 kernel: raid6: neonx8 gen() 15777 MB/s Sep 9 05:02:15.108209 kernel: raid6: neonx4 gen() 15818 MB/s Sep 9 05:02:15.125210 kernel: raid6: neonx2 gen() 13291 MB/s Sep 9 05:02:15.142207 kernel: raid6: neonx1 gen() 10439 MB/s Sep 9 05:02:15.159214 kernel: raid6: int64x8 gen() 6903 MB/s Sep 9 05:02:15.176207 kernel: raid6: int64x4 gen() 7356 MB/s Sep 9 05:02:15.193208 kernel: raid6: int64x2 gen() 6102 MB/s Sep 9 05:02:15.210214 kernel: raid6: int64x1 gen() 5047 MB/s Sep 9 05:02:15.210237 kernel: raid6: using algorithm neonx4 gen() 15818 MB/s Sep 9 05:02:15.227219 kernel: raid6: .... xor() 12375 MB/s, rmw enabled Sep 9 05:02:15.227243 kernel: raid6: using neon recovery algorithm Sep 9 05:02:15.232216 kernel: xor: measuring software checksum speed Sep 9 05:02:15.232242 kernel: 8regs : 21590 MB/sec Sep 9 05:02:15.233343 kernel: 32regs : 21681 MB/sec Sep 9 05:02:15.233359 kernel: arm64_neon : 28032 MB/sec Sep 9 05:02:15.233368 kernel: xor: using function: arm64_neon (28032 MB/sec) Sep 9 05:02:15.285540 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 9 05:02:15.292040 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. 
Sep 9 05:02:15.297773 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 05:02:15.331254 systemd-udevd[495]: Using default interface naming scheme 'v255'. Sep 9 05:02:15.335461 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:02:15.339100 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 9 05:02:15.363704 dracut-pre-trigger[506]: rd.md=0: removing MD RAID activation Sep 9 05:02:15.391177 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 05:02:15.394545 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 05:02:15.450615 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 05:02:15.459548 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 9 05:02:15.514255 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues Sep 9 05:02:15.517059 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 9 05:02:15.517170 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 9 05:02:15.517181 kernel: GPT:9289727 != 19775487 Sep 9 05:02:15.528075 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 9 05:02:15.530610 kernel: GPT:9289727 != 19775487 Sep 9 05:02:15.530648 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 9 05:02:15.530659 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 05:02:15.531314 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:02:15.531433 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:02:15.535094 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:02:15.537519 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:02:15.557702 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 9 05:02:15.565237 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 9 05:02:15.566477 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:02:15.575700 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 9 05:02:15.584037 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 05:02:15.594624 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 9 05:02:15.595947 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 9 05:02:15.599145 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 05:02:15.601379 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:02:15.603498 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 05:02:15.606288 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 9 05:02:15.608020 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 05:02:15.633944 disk-uuid[590]: Primary Header is updated. Sep 9 05:02:15.633944 disk-uuid[590]: Secondary Entries is updated. Sep 9 05:02:15.633944 disk-uuid[590]: Secondary Header is updated. 
Sep 9 05:02:15.638221 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 05:02:15.638499 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 9 05:02:15.643209 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 05:02:16.645714 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 9 05:02:16.645771 disk-uuid[595]: The operation has completed successfully. Sep 9 05:02:16.674146 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 05:02:16.674253 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 05:02:16.696251 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 05:02:16.721294 sh[610]: Success Sep 9 05:02:16.735669 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 9 05:02:16.735711 kernel: device-mapper: uevent: version 1.0.3 Sep 9 05:02:16.736860 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 05:02:16.746244 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Sep 9 05:02:16.774702 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 05:02:16.777460 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 05:02:16.790453 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 9 05:02:16.797435 kernel: BTRFS: device fsid 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (622) Sep 9 05:02:16.797478 kernel: BTRFS info (device dm-0): first mount of filesystem 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364 Sep 9 05:02:16.797488 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 9 05:02:16.802209 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 05:02:16.802249 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 05:02:16.803355 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 9 05:02:16.804591 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 9 05:02:16.806254 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 9 05:02:16.811966 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 05:02:16.821464 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 9 05:02:16.836227 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (651) Sep 9 05:02:16.836281 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 05:02:16.837249 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 9 05:02:16.839622 kernel: BTRFS info (device vda6): turning on async discard Sep 9 05:02:16.839672 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 05:02:16.844580 kernel: BTRFS info (device vda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 05:02:16.845384 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 9 05:02:16.847503 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 9 05:02:16.914646 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 05:02:16.918424 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Sep 9 05:02:16.948037 ignition[696]: Ignition 2.22.0 Sep 9 05:02:16.948052 ignition[696]: Stage: fetch-offline Sep 9 05:02:16.948140 ignition[696]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:02:16.948159 ignition[696]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 05:02:16.948271 ignition[696]: parsed url from cmdline: "" Sep 9 05:02:16.948274 ignition[696]: no config URL provided Sep 9 05:02:16.948279 ignition[696]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 05:02:16.948285 ignition[696]: no config at "/usr/lib/ignition/user.ign" Sep 9 05:02:16.948307 ignition[696]: op(1): [started] loading QEMU firmware config module Sep 9 05:02:16.954226 systemd-networkd[804]: lo: Link UP Sep 9 05:02:16.948311 ignition[696]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 9 05:02:16.954230 systemd-networkd[804]: lo: Gained carrier Sep 9 05:02:16.953613 ignition[696]: op(1): [finished] loading QEMU firmware config module Sep 9 05:02:16.954909 systemd-networkd[804]: Enumeration completed Sep 9 05:02:16.955723 systemd-networkd[804]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:02:16.955726 systemd-networkd[804]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 05:02:16.955896 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 05:02:16.956552 systemd-networkd[804]: eth0: Link UP Sep 9 05:02:16.956671 systemd-networkd[804]: eth0: Gained carrier Sep 9 05:02:16.956680 systemd-networkd[804]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:02:16.957418 systemd[1]: Reached target network.target - Network. Sep 9 05:02:16.971237 systemd-networkd[804]: eth0: DHCPv4 address 10.0.0.90/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 9 05:02:17.006477 ignition[696]: parsing config with SHA512: b6cb530edd365cce5151a8a1f1961a477852f14e4e845bf25ce108065138219a86f6622a981c51bfe8bf90b6f24f3c7bdea8b875e85972b115d6e377f6190bf6 Sep 9 05:02:17.010107 unknown[696]: fetched base config from "system" Sep 9 05:02:17.010119 unknown[696]: fetched user config from "qemu" Sep 9 05:02:17.010464 ignition[696]: fetch-offline: fetch-offline passed Sep 9 05:02:17.012856 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 05:02:17.010514 ignition[696]: Ignition finished successfully Sep 9 05:02:17.014086 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 9 05:02:17.014890 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 9 05:02:17.045132 ignition[812]: Ignition 2.22.0 Sep 9 05:02:17.045152 ignition[812]: Stage: kargs Sep 9 05:02:17.045300 ignition[812]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:02:17.045309 ignition[812]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 05:02:17.046049 ignition[812]: kargs: kargs passed Sep 9 05:02:17.046093 ignition[812]: Ignition finished successfully Sep 9 05:02:17.050529 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 9 05:02:17.053026 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Sep 9 05:02:17.078852 ignition[820]: Ignition 2.22.0 Sep 9 05:02:17.078868 ignition[820]: Stage: disks Sep 9 05:02:17.079007 ignition[820]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:02:17.079015 ignition[820]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 05:02:17.079803 ignition[820]: disks: disks passed Sep 9 05:02:17.081625 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 9 05:02:17.079847 ignition[820]: Ignition finished successfully Sep 9 05:02:17.083088 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 9 05:02:17.084340 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 9 05:02:17.086118 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 05:02:17.087596 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 05:02:17.089168 systemd[1]: Reached target basic.target - Basic System. Sep 9 05:02:17.091799 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 9 05:02:17.121856 systemd-fsck[830]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 9 05:02:17.126340 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 9 05:02:17.128559 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 9 05:02:17.194218 kernel: EXT4-fs (vda9): mounted filesystem 88574756-967d-44b3-be66-46689c8baf27 r/w with ordered data mode. Quota mode: none. Sep 9 05:02:17.194597 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 9 05:02:17.195836 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 9 05:02:17.198736 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 05:02:17.200474 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 9 05:02:17.201386 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 9 05:02:17.201428 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 9 05:02:17.201451 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 05:02:17.215476 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 9 05:02:17.218025 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 9 05:02:17.221227 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (838) Sep 9 05:02:17.222212 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 05:02:17.222227 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 9 05:02:17.224387 kernel: BTRFS info (device vda6): turning on async discard Sep 9 05:02:17.224412 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 05:02:17.226266 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 05:02:17.252577 initrd-setup-root[862]: cut: /sysroot/etc/passwd: No such file or directory Sep 9 05:02:17.255797 initrd-setup-root[869]: cut: /sysroot/etc/group: No such file or directory Sep 9 05:02:17.260103 initrd-setup-root[876]: cut: /sysroot/etc/shadow: No such file or directory Sep 9 05:02:17.264038 initrd-setup-root[883]: cut: /sysroot/etc/gshadow: No such file or directory Sep 9 05:02:17.330893 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. 
Sep 9 05:02:17.332868 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 9 05:02:17.334383 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 9 05:02:17.353234 kernel: BTRFS info (device vda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 05:02:17.361871 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 9 05:02:17.381187 ignition[953]: INFO : Ignition 2.22.0 Sep 9 05:02:17.381187 ignition[953]: INFO : Stage: mount Sep 9 05:02:17.383899 ignition[953]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:02:17.383899 ignition[953]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 05:02:17.383899 ignition[953]: INFO : mount: mount passed Sep 9 05:02:17.383899 ignition[953]: INFO : Ignition finished successfully Sep 9 05:02:17.384667 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 9 05:02:17.386631 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 9 05:02:17.796400 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 9 05:02:17.797931 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 05:02:17.827354 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (963) Sep 9 05:02:17.827388 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6 Sep 9 05:02:17.827399 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 9 05:02:17.830309 kernel: BTRFS info (device vda6): turning on async discard Sep 9 05:02:17.830352 kernel: BTRFS info (device vda6): enabling free space tree Sep 9 05:02:17.831724 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 05:02:17.865887 ignition[981]: INFO : Ignition 2.22.0 Sep 9 05:02:17.865887 ignition[981]: INFO : Stage: files Sep 9 05:02:17.867686 ignition[981]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:02:17.867686 ignition[981]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 05:02:17.867686 ignition[981]: DEBUG : files: compiled without relabeling support, skipping Sep 9 05:02:17.871065 ignition[981]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 9 05:02:17.871065 ignition[981]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 9 05:02:17.871065 ignition[981]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 9 05:02:17.871065 ignition[981]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 9 05:02:17.871065 ignition[981]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 9 05:02:17.870836 unknown[981]: wrote ssh authorized keys file for user: core Sep 9 05:02:17.878420 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Sep 9 05:02:17.878420 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Sep 9 05:02:17.941066 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 9 05:02:18.209972 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Sep 9 05:02:18.209972 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 9 
05:02:18.213860 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 9 05:02:18.213860 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 9 05:02:18.213860 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 9 05:02:18.213860 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 05:02:18.213860 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 05:02:18.213860 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 05:02:18.213860 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 05:02:18.213860 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 05:02:18.213860 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 05:02:18.213860 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 9 05:02:18.230451 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 9 05:02:18.230451 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 9 05:02:18.230451 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1 Sep 9 05:02:18.632506 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 9 05:02:18.656340 systemd-networkd[804]: eth0: Gained IPv6LL Sep 9 05:02:18.964500 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 9 05:02:18.964500 ignition[981]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 9 05:02:18.968131 ignition[981]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 05:02:18.970534 ignition[981]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 05:02:18.970534 ignition[981]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 9 05:02:18.970534 ignition[981]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 9 05:02:18.975624 ignition[981]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 9 05:02:18.975624 ignition[981]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 9 05:02:18.975624 
ignition[981]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 9 05:02:18.975624 ignition[981]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Sep 9 05:02:18.986359 ignition[981]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 9 05:02:18.991261 ignition[981]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 9 05:02:18.992792 ignition[981]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Sep 9 05:02:18.992792 ignition[981]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Sep 9 05:02:18.992792 ignition[981]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Sep 9 05:02:18.992792 ignition[981]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 9 05:02:18.992792 ignition[981]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 9 05:02:18.992792 ignition[981]: INFO : files: files passed Sep 9 05:02:18.992792 ignition[981]: INFO : Ignition finished successfully Sep 9 05:02:18.996249 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 9 05:02:19.004347 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 9 05:02:19.012051 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 9 05:02:19.028794 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 9 05:02:19.028901 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 9 05:02:19.032412 initrd-setup-root-after-ignition[1009]: grep: /sysroot/oem/oem-release: No such file or directory Sep 9 05:02:19.033846 initrd-setup-root-after-ignition[1011]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:02:19.033846 initrd-setup-root-after-ignition[1011]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:02:19.036994 initrd-setup-root-after-ignition[1015]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:02:19.036279 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 05:02:19.038157 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 9 05:02:19.043360 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 9 05:02:19.078485 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 9 05:02:19.078588 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 9 05:02:19.080574 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 9 05:02:19.082207 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 9 05:02:19.083939 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 9 05:02:19.084640 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 9 05:02:19.098388 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 05:02:19.100663 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... 
Sep 9 05:02:19.127276 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 9 05:02:19.128478 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:02:19.130282 systemd[1]: Stopped target timers.target - Timer Units. Sep 9 05:02:19.132040 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 9 05:02:19.132162 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 05:02:19.134374 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 9 05:02:19.136257 systemd[1]: Stopped target basic.target - Basic System. Sep 9 05:02:19.137882 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 9 05:02:19.139454 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 05:02:19.141262 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 9 05:02:19.143188 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 9 05:02:19.145079 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 9 05:02:19.146874 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 05:02:19.148693 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 9 05:02:19.150535 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 9 05:02:19.152187 systemd[1]: Stopped target swap.target - Swaps. Sep 9 05:02:19.153718 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 9 05:02:19.153842 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 9 05:02:19.155954 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 9 05:02:19.157053 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 05:02:19.158920 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 9 05:02:19.162269 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 05:02:19.163358 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 9 05:02:19.163469 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 9 05:02:19.166039 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 9 05:02:19.166140 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 05:02:19.168004 systemd[1]: Stopped target paths.target - Path Units. Sep 9 05:02:19.169526 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 9 05:02:19.169636 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 05:02:19.171449 systemd[1]: Stopped target slices.target - Slice Units. Sep 9 05:02:19.172867 systemd[1]: Stopped target sockets.target - Socket Units. Sep 9 05:02:19.174465 systemd[1]: iscsid.socket: Deactivated successfully. Sep 9 05:02:19.174547 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 05:02:19.176500 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 9 05:02:19.176577 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 05:02:19.178002 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 9 05:02:19.178109 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 05:02:19.179751 systemd[1]: ignition-files.service: Deactivated successfully. 
Sep 9 05:02:19.179849 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 9 05:02:19.181970 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 9 05:02:19.184092 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 9 05:02:19.184998 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 9 05:02:19.185131 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 05:02:19.187066 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 9 05:02:19.187161 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 05:02:19.201136 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 9 05:02:19.205360 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 9 05:02:19.214171 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 9 05:02:19.221262 ignition[1036]: INFO : Ignition 2.22.0 Sep 9 05:02:19.221262 ignition[1036]: INFO : Stage: umount Sep 9 05:02:19.223999 ignition[1036]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:02:19.223999 ignition[1036]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 9 05:02:19.223999 ignition[1036]: INFO : umount: umount passed Sep 9 05:02:19.223999 ignition[1036]: INFO : Ignition finished successfully Sep 9 05:02:19.224088 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 9 05:02:19.224188 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 9 05:02:19.226168 systemd[1]: Stopped target network.target - Network. Sep 9 05:02:19.227906 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 9 05:02:19.227969 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 9 05:02:19.229691 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 9 05:02:19.229748 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 9 05:02:19.231522 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 9 05:02:19.231573 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 9 05:02:19.233108 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 9 05:02:19.233148 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 9 05:02:19.235009 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 9 05:02:19.236320 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 9 05:02:19.244235 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 9 05:02:19.244368 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 9 05:02:19.248366 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 9 05:02:19.248571 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 9 05:02:19.248684 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 9 05:02:19.252442 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 9 05:02:19.253030 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 9 05:02:19.255072 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 9 05:02:19.255108 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 9 05:02:19.258755 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 9 05:02:19.260040 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. 
Sep 9 05:02:19.260095 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 05:02:19.262131 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 9 05:02:19.262173 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 9 05:02:19.265189 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 9 05:02:19.265243 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 9 05:02:19.267077 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 9 05:02:19.267120 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 05:02:19.270187 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 05:02:19.277077 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 9 05:02:19.277151 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 9 05:02:19.286250 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 9 05:02:19.286371 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 9 05:02:19.288297 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 9 05:02:19.288430 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:02:19.290571 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 9 05:02:19.290655 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 9 05:02:19.291951 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 9 05:02:19.291987 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 05:02:19.294165 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 9 05:02:19.294299 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 9 05:02:19.297074 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 9 05:02:19.297130 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 9 05:02:19.300823 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 9 05:02:19.300881 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 05:02:19.304713 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 9 05:02:19.305844 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 9 05:02:19.305905 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 05:02:19.308895 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 9 05:02:19.308941 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 05:02:19.312468 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 9 05:02:19.312512 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 05:02:19.316672 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 9 05:02:19.316723 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 05:02:19.319579 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:02:19.319626 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 9 05:02:19.323923 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 9 05:02:19.323978 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 9 05:02:19.324005 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 9 05:02:19.324039 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 9 05:02:19.324358 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 9 05:02:19.325039 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 9 05:02:19.327359 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 9 05:02:19.329248 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 9 05:02:19.330984 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 9 05:02:19.332569 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 9 05:02:19.332652 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 9 05:02:19.335410 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 9 05:02:19.355832 systemd[1]: Switching root. Sep 9 05:02:19.397299 systemd-journald[245]: Journal stopped Sep 9 05:02:20.175930 systemd-journald[245]: Received SIGTERM from PID 1 (systemd). Sep 9 05:02:20.175981 kernel: SELinux: policy capability network_peer_controls=1 Sep 9 05:02:20.175998 kernel: SELinux: policy capability open_perms=1 Sep 9 05:02:20.176007 kernel: SELinux: policy capability extended_socket_class=1 Sep 9 05:02:20.176016 kernel: SELinux: policy capability always_check_network=0 Sep 9 05:02:20.176024 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 9 05:02:20.176034 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 9 05:02:20.176043 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 9 05:02:20.176055 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 9 05:02:20.176069 kernel: SELinux: policy capability userspace_initial_context=0 Sep 9 05:02:20.176078 kernel: audit: type=1403 audit(1757394139.634:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 9 05:02:20.176088 systemd[1]: Successfully loaded SELinux policy in 43.902ms. Sep 9 05:02:20.176104 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.271ms. Sep 9 05:02:20.176115 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 05:02:20.176125 systemd[1]: Detected virtualization kvm. Sep 9 05:02:20.176135 systemd[1]: Detected architecture arm64. Sep 9 05:02:20.176144 systemd[1]: Detected first boot. Sep 9 05:02:20.176156 systemd[1]: Initializing machine ID from VM UUID. Sep 9 05:02:20.176166 zram_generator::config[1083]: No configuration found. Sep 9 05:02:20.176176 kernel: NET: Registered PF_VSOCK protocol family Sep 9 05:02:20.176186 systemd[1]: Populated /etc with preset unit settings. Sep 9 05:02:20.176214 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 9 05:02:20.176226 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 9 05:02:20.176236 systemd[1]: Stopped initrd-switch-root.service - Switch Root. 
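The systemd 256.8 banner above lists the build options the binary was compiled with, prefixed + for enabled and - for disabled. A small illustrative parse (not from the log itself) that splits that exact string into the two sets:

    # Feature string copied from the systemd 256.8 banner in the log above.
    features = ("+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT "
                "-GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN "
                "+IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK "
                "+PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ "
                "+ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE")

    enabled = sorted(f[1:] for f in features.split() if f.startswith("+"))
    disabled = sorted(f[1:] for f in features.split() if f.startswith("-"))
    print("enabled:", enabled)
    print("disabled:", disabled)   # e.g. APPARMOR, ACL, FIDO2 are compiled out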
Sep 9 05:02:20.176246 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 9 05:02:20.176257 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 9 05:02:20.176269 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 9 05:02:20.176279 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 9 05:02:20.176289 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 9 05:02:20.176299 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 9 05:02:20.176309 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 9 05:02:20.176319 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 9 05:02:20.176329 systemd[1]: Created slice user.slice - User and Session Slice. Sep 9 05:02:20.176339 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 05:02:20.176350 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 05:02:20.176360 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 9 05:02:20.176370 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 9 05:02:20.176380 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 9 05:02:20.176390 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 05:02:20.176399 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Sep 9 05:02:20.176410 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 05:02:20.176419 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 05:02:20.176430 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 9 05:02:20.176441 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 9 05:02:20.176451 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 9 05:02:20.176460 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 9 05:02:20.176470 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:02:20.176480 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 05:02:20.176490 systemd[1]: Reached target slices.target - Slice Units. Sep 9 05:02:20.176500 systemd[1]: Reached target swap.target - Swaps. Sep 9 05:02:20.176509 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 9 05:02:20.176520 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 9 05:02:20.176530 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 9 05:02:20.176540 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 05:02:20.176554 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 05:02:20.176564 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 05:02:20.176577 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 9 05:02:20.176587 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... 
Sep 9 05:02:20.176597 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 9 05:02:20.176606 systemd[1]: Mounting media.mount - External Media Directory... Sep 9 05:02:20.176617 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 9 05:02:20.176627 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 9 05:02:20.176643 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 9 05:02:20.176656 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 9 05:02:20.176667 systemd[1]: Reached target machines.target - Containers. Sep 9 05:02:20.176677 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 9 05:02:20.176687 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:02:20.176697 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 05:02:20.176707 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 9 05:02:20.176718 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 05:02:20.176729 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 05:02:20.176739 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 05:02:20.176751 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 9 05:02:20.176761 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 05:02:20.176771 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 9 05:02:20.176781 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 9 05:02:20.176790 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 9 05:02:20.176801 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 9 05:02:20.176811 systemd[1]: Stopped systemd-fsck-usr.service. Sep 9 05:02:20.176822 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:02:20.176831 kernel: fuse: init (API version 7.41) Sep 9 05:02:20.176840 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 05:02:20.176850 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 05:02:20.176861 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 05:02:20.176871 kernel: ACPI: bus type drm_connector registered Sep 9 05:02:20.176880 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 9 05:02:20.176891 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 9 05:02:20.176901 kernel: loop: module loaded Sep 9 05:02:20.176910 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 05:02:20.176920 systemd[1]: verity-setup.service: Deactivated successfully. Sep 9 05:02:20.176930 systemd[1]: Stopped verity-setup.service. Sep 9 05:02:20.176942 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. 
Sep 9 05:02:20.176952 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 9 05:02:20.176962 systemd[1]: Mounted media.mount - External Media Directory. Sep 9 05:02:20.176995 systemd-journald[1151]: Collecting audit messages is disabled. Sep 9 05:02:20.177018 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 9 05:02:20.177029 systemd-journald[1151]: Journal started Sep 9 05:02:20.177048 systemd-journald[1151]: Runtime Journal (/run/log/journal/d1c4b0b26a9b4c368013162b31827925) is 6M, max 48.5M, 42.4M free. Sep 9 05:02:19.987296 systemd[1]: Queued start job for default target multi-user.target. Sep 9 05:02:20.012165 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 9 05:02:20.012542 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 9 05:02:20.179076 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 05:02:20.179905 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 9 05:02:20.181250 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 9 05:02:20.184232 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 05:02:20.185694 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 9 05:02:20.187024 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 9 05:02:20.187220 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 9 05:02:20.188581 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 05:02:20.188859 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 05:02:20.190375 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 05:02:20.190541 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 05:02:20.191789 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 05:02:20.191955 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 05:02:20.194556 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 9 05:02:20.194731 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 9 05:02:20.196072 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 05:02:20.196289 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 05:02:20.197572 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 05:02:20.199043 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 05:02:20.200557 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 9 05:02:20.202270 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 9 05:02:20.215007 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 05:02:20.217271 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 9 05:02:20.219367 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 9 05:02:20.220260 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 9 05:02:20.220297 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 05:02:20.221902 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. 
Sep 9 05:02:20.231318 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 9 05:02:20.232265 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:02:20.233569 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 9 05:02:20.235467 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 9 05:02:20.236453 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 05:02:20.239332 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 9 05:02:20.240601 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 05:02:20.241507 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 05:02:20.244293 systemd-journald[1151]: Time spent on flushing to /var/log/journal/d1c4b0b26a9b4c368013162b31827925 is 17.947ms for 890 entries. Sep 9 05:02:20.244293 systemd-journald[1151]: System Journal (/var/log/journal/d1c4b0b26a9b4c368013162b31827925) is 8M, max 195.6M, 187.6M free. Sep 9 05:02:20.268271 systemd-journald[1151]: Received client request to flush runtime journal. Sep 9 05:02:20.268306 kernel: loop0: detected capacity change from 0 to 100632 Sep 9 05:02:20.244389 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 9 05:02:20.247528 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 05:02:20.252545 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 05:02:20.253984 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 9 05:02:20.255554 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 9 05:02:20.270115 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 9 05:02:20.272627 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 9 05:02:20.276327 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 9 05:02:20.278567 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 9 05:02:20.281255 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 9 05:02:20.284212 systemd-tmpfiles[1200]: ACLs are not supported, ignoring. Sep 9 05:02:20.284226 systemd-tmpfiles[1200]: ACLs are not supported, ignoring. Sep 9 05:02:20.284725 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 05:02:20.288384 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 05:02:20.291421 kernel: loop1: detected capacity change from 0 to 203944 Sep 9 05:02:20.292113 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 9 05:02:20.312234 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 9 05:02:20.319217 kernel: loop2: detected capacity change from 0 to 119368 Sep 9 05:02:20.323050 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 9 05:02:20.325823 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
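systemd-journald reports above that flushing 890 entries to the persistent journal took 17.947 ms. A quick worked check (illustrative arithmetic only) of the per-entry average implied by those two numbers:

    # Numbers taken from the journald flush message above.
    flush_ms, entries = 17.947, 890
    per_entry_us = flush_ms * 1000 / entries
    print(f"{per_entry_us:.1f} us per entry")   # ~20.2 us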
Sep 9 05:02:20.346303 systemd-tmpfiles[1222]: ACLs are not supported, ignoring. Sep 9 05:02:20.346316 systemd-tmpfiles[1222]: ACLs are not supported, ignoring. Sep 9 05:02:20.348264 kernel: loop3: detected capacity change from 0 to 100632 Sep 9 05:02:20.351857 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 05:02:20.356233 kernel: loop4: detected capacity change from 0 to 203944 Sep 9 05:02:20.362244 kernel: loop5: detected capacity change from 0 to 119368 Sep 9 05:02:20.366388 (sd-merge)[1225]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Sep 9 05:02:20.366768 (sd-merge)[1225]: Merged extensions into '/usr'. Sep 9 05:02:20.370660 systemd[1]: Reload requested from client PID 1199 ('systemd-sysext') (unit systemd-sysext.service)... Sep 9 05:02:20.370785 systemd[1]: Reloading... Sep 9 05:02:20.432225 zram_generator::config[1260]: No configuration found. Sep 9 05:02:20.493920 ldconfig[1194]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 9 05:02:20.562018 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 9 05:02:20.562579 systemd[1]: Reloading finished in 191 ms. Sep 9 05:02:20.581735 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 9 05:02:20.583517 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 9 05:02:20.595551 systemd[1]: Starting ensure-sysext.service... Sep 9 05:02:20.597282 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 05:02:20.606512 systemd[1]: Reload requested from client PID 1286 ('systemctl') (unit ensure-sysext.service)... Sep 9 05:02:20.606529 systemd[1]: Reloading... Sep 9 05:02:20.611158 systemd-tmpfiles[1287]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 9 05:02:20.611205 systemd-tmpfiles[1287]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 9 05:02:20.611439 systemd-tmpfiles[1287]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 9 05:02:20.611632 systemd-tmpfiles[1287]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 9 05:02:20.612280 systemd-tmpfiles[1287]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 9 05:02:20.612492 systemd-tmpfiles[1287]: ACLs are not supported, ignoring. Sep 9 05:02:20.612540 systemd-tmpfiles[1287]: ACLs are not supported, ignoring. Sep 9 05:02:20.615180 systemd-tmpfiles[1287]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 05:02:20.615204 systemd-tmpfiles[1287]: Skipping /boot Sep 9 05:02:20.620921 systemd-tmpfiles[1287]: Detected autofs mount point /boot during canonicalization of boot. Sep 9 05:02:20.620938 systemd-tmpfiles[1287]: Skipping /boot Sep 9 05:02:20.660238 zram_generator::config[1314]: No configuration found. Sep 9 05:02:20.790262 systemd[1]: Reloading finished in 183 ms. Sep 9 05:02:20.811200 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 9 05:02:20.816771 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 05:02:20.830187 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 05:02:20.832454 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... 
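systemd-tmpfiles reports above that it detected an autofs mount point at /boot and skipped it (the boot partition was set up earlier as an automount). A hedged sketch, not systemd's actual implementation, of one way to observe the same thing from /proc/self/mounts on a running system:

    def mount_fstype(target: str, mounts_path: str = "/proc/self/mounts"):
        """Return the filesystem type of the first mount entry for `target`, or None.

        With a systemd automount in place, the first entry for the target is
        typically the 'autofs' trigger mount; the real filesystem appears as a
        later entry once the automount has been triggered.
        """
        with open(mounts_path) as f:
            for line in f:
                fields = line.split()
                if len(fields) >= 3 and fields[1] == target:
                    return fields[2]
        return None

    if __name__ == "__main__":
        print(mount_fstype("/boot"))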
Sep 9 05:02:20.848129 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 9 05:02:20.851177 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 05:02:20.853694 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 05:02:20.857358 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 9 05:02:20.863568 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:02:20.864668 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 05:02:20.866797 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 05:02:20.869145 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 05:02:20.870370 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:02:20.871241 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:02:20.873223 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 9 05:02:20.877459 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:02:20.878431 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:02:20.878523 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:02:20.880966 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 9 05:02:20.884555 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 9 05:02:20.888531 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 05:02:20.888716 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 05:02:20.890361 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 05:02:20.890502 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 05:02:20.892926 systemd-udevd[1355]: Using default interface naming scheme 'v255'. Sep 9 05:02:20.893933 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 9 05:02:20.897650 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 05:02:20.897824 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 05:02:20.899671 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 9 05:02:20.908558 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:02:20.909852 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 05:02:20.913493 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 05:02:20.918142 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 05:02:20.924263 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Sep 9 05:02:20.925949 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:02:20.926124 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:02:20.926450 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 9 05:02:20.928659 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:02:20.933122 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 05:02:20.933346 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 05:02:20.937848 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 05:02:20.939270 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 05:02:20.940793 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 05:02:20.940943 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 05:02:20.951523 systemd[1]: Finished ensure-sysext.service. Sep 9 05:02:20.952545 augenrules[1416]: No rules Sep 9 05:02:20.954144 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 9 05:02:20.960541 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 05:02:20.985875 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 05:02:20.988833 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 9 05:02:20.990625 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 05:02:20.991257 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 05:02:21.017328 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Sep 9 05:02:21.023833 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 9 05:02:21.026679 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 9 05:02:21.030075 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 05:02:21.031118 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 05:02:21.031184 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 05:02:21.037407 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 9 05:02:21.049381 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 9 05:02:21.079976 systemd-resolved[1354]: Positive Trust Anchors: Sep 9 05:02:21.079995 systemd-resolved[1354]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 05:02:21.080027 systemd-resolved[1354]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 05:02:21.087556 systemd-resolved[1354]: Defaulting to hostname 'linux'. Sep 9 05:02:21.088114 systemd-networkd[1442]: lo: Link UP Sep 9 05:02:21.088118 systemd-networkd[1442]: lo: Gained carrier Sep 9 05:02:21.088873 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 05:02:21.089277 systemd-networkd[1442]: Enumeration completed Sep 9 05:02:21.089699 systemd-networkd[1442]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:02:21.089709 systemd-networkd[1442]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 05:02:21.089886 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 05:02:21.090466 systemd-networkd[1442]: eth0: Link UP Sep 9 05:02:21.090587 systemd-networkd[1442]: eth0: Gained carrier Sep 9 05:02:21.090605 systemd-networkd[1442]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:02:21.090886 systemd[1]: Reached target network.target - Network. Sep 9 05:02:21.092103 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 05:02:21.094319 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 9 05:02:21.097538 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 9 05:02:21.098696 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 9 05:02:21.101345 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 05:02:21.102385 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 9 05:02:21.103339 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 9 05:02:21.105469 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 9 05:02:21.106695 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 9 05:02:21.106728 systemd[1]: Reached target paths.target - Path Units. Sep 9 05:02:21.108389 systemd[1]: Reached target time-set.target - System Time Set. Sep 9 05:02:21.109303 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 9 05:02:21.110168 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 9 05:02:21.111363 systemd[1]: Reached target timers.target - Timer Units. Sep 9 05:02:21.113827 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 9 05:02:21.115949 systemd[1]: Starting docker.socket - Docker Socket for the API... 
Sep 9 05:02:21.118852 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 9 05:02:21.121540 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 9 05:02:21.122339 systemd-networkd[1442]: eth0: DHCPv4 address 10.0.0.90/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 9 05:02:21.122879 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 9 05:02:21.123509 systemd-timesyncd[1443]: Network configuration changed, trying to establish connection. Sep 9 05:02:21.124774 systemd-timesyncd[1443]: Contacted time server 10.0.0.1:123 (10.0.0.1). Sep 9 05:02:21.124831 systemd-timesyncd[1443]: Initial clock synchronization to Tue 2025-09-09 05:02:20.901448 UTC. Sep 9 05:02:21.126757 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 9 05:02:21.128059 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 9 05:02:21.129773 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 9 05:02:21.130968 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 05:02:21.132010 systemd[1]: Reached target basic.target - Basic System. Sep 9 05:02:21.132955 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 9 05:02:21.132977 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 9 05:02:21.134877 systemd[1]: Starting containerd.service - containerd container runtime... Sep 9 05:02:21.139379 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 9 05:02:21.143304 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 9 05:02:21.147444 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 9 05:02:21.150504 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 9 05:02:21.153305 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 9 05:02:21.154413 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 9 05:02:21.159206 jq[1470]: false Sep 9 05:02:21.157132 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 9 05:02:21.160282 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 9 05:02:21.162360 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 9 05:02:21.169347 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 9 05:02:21.171152 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 9 05:02:21.171586 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 9 05:02:21.172185 systemd[1]: Starting update-engine.service - Update Engine... Sep 9 05:02:21.173883 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 9 05:02:21.176037 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 9 05:02:21.179045 extend-filesystems[1472]: Found /dev/vda6 Sep 9 05:02:21.178807 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. 
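systemd-networkd reports above a DHCPv4 lease of 10.0.0.90/16 with gateway 10.0.0.1 on eth0. A small standard-library illustration (not part of the log) of what that prefix covers:

    import ipaddress

    # Lease values taken from the systemd-networkd message above.
    iface = ipaddress.ip_interface("10.0.0.90/16")
    print(iface.network)                                       # 10.0.0.0/16
    print(iface.network.num_addresses)                         # 65536
    print(ipaddress.ip_address("10.0.0.1") in iface.network)   # True: gateway is on-link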
Sep 9 05:02:21.180222 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 9 05:02:21.180388 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 9 05:02:21.185137 jq[1485]: true Sep 9 05:02:21.186375 extend-filesystems[1472]: Found /dev/vda9 Sep 9 05:02:21.189586 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 9 05:02:21.192333 extend-filesystems[1472]: Checking size of /dev/vda9 Sep 9 05:02:21.194354 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 9 05:02:21.197093 update_engine[1483]: I20250909 05:02:21.196663 1483 main.cc:92] Flatcar Update Engine starting Sep 9 05:02:21.197653 systemd[1]: motdgen.service: Deactivated successfully. Sep 9 05:02:21.198390 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 9 05:02:21.204853 extend-filesystems[1472]: Resized partition /dev/vda9 Sep 9 05:02:21.209969 extend-filesystems[1508]: resize2fs 1.47.3 (8-Jul-2025) Sep 9 05:02:21.212464 tar[1492]: linux-arm64/helm Sep 9 05:02:21.217215 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 9 05:02:21.221111 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:02:21.224316 (ntainerd)[1498]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 9 05:02:21.233243 jq[1496]: true Sep 9 05:02:21.255357 dbus-daemon[1468]: [system] SELinux support is enabled Sep 9 05:02:21.273403 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 9 05:02:21.255607 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 9 05:02:21.273534 update_engine[1483]: I20250909 05:02:21.259318 1483 update_check_scheduler.cc:74] Next update check in 3m17s Sep 9 05:02:21.273566 extend-filesystems[1508]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 9 05:02:21.273566 extend-filesystems[1508]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 9 05:02:21.273566 extend-filesystems[1508]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 9 05:02:21.266615 systemd[1]: Started update-engine.service - Update Engine. Sep 9 05:02:21.289447 extend-filesystems[1472]: Resized filesystem in /dev/vda9 Sep 9 05:02:21.268481 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 9 05:02:21.268502 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 9 05:02:21.270451 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 05:02:21.270469 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 05:02:21.274439 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 9 05:02:21.281967 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 05:02:21.282163 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 9 05:02:21.314601 systemd-logind[1481]: Watching system buttons on /dev/input/event0 (Power Button) Sep 9 05:02:21.315417 systemd-logind[1481]: New seat seat0. 
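The resize messages above grow /dev/vda9 from 553472 to 1864699 blocks of 4 KiB. A quick worked conversion (illustrative only) of those block counts into sizes:

    # Block counts and 4 KiB block size taken from the EXT4/resize2fs messages above.
    BLOCK_SIZE = 4096  # bytes

    def blocks_to_gib(blocks: int, block_size: int = BLOCK_SIZE) -> float:
        """Return the size in GiB for a given number of filesystem blocks."""
        return blocks * block_size / 2**30

    before, after = 553472, 1864699
    print(f"before: {blocks_to_gib(before):.2f} GiB")   # ~2.11 GiB
    print(f"after:  {blocks_to_gib(after):.2f} GiB")    # ~7.11 GiB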
Sep 9 05:02:21.316427 bash[1535]: Updated "/home/core/.ssh/authorized_keys" Sep 9 05:02:21.317539 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 05:02:21.319467 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 05:02:21.323087 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 9 05:02:21.338067 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:02:21.349414 locksmithd[1523]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 05:02:21.421177 containerd[1498]: time="2025-09-09T05:02:21Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 9 05:02:21.421735 containerd[1498]: time="2025-09-09T05:02:21.421697520Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 9 05:02:21.434896 containerd[1498]: time="2025-09-09T05:02:21.434835560Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.32µs" Sep 9 05:02:21.434896 containerd[1498]: time="2025-09-09T05:02:21.434884480Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 9 05:02:21.435022 containerd[1498]: time="2025-09-09T05:02:21.434907440Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 9 05:02:21.435088 containerd[1498]: time="2025-09-09T05:02:21.435065120Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 9 05:02:21.435112 containerd[1498]: time="2025-09-09T05:02:21.435086560Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 9 05:02:21.435130 containerd[1498]: time="2025-09-09T05:02:21.435109800Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 05:02:21.435179 containerd[1498]: time="2025-09-09T05:02:21.435159600Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 05:02:21.435179 containerd[1498]: time="2025-09-09T05:02:21.435175320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 05:02:21.435443 containerd[1498]: time="2025-09-09T05:02:21.435418120Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 05:02:21.435443 containerd[1498]: time="2025-09-09T05:02:21.435439560Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 05:02:21.435488 containerd[1498]: time="2025-09-09T05:02:21.435450680Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 05:02:21.435488 containerd[1498]: time="2025-09-09T05:02:21.435458600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 9 05:02:21.435552 
containerd[1498]: time="2025-09-09T05:02:21.435533840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 9 05:02:21.435775 containerd[1498]: time="2025-09-09T05:02:21.435750520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 05:02:21.435801 containerd[1498]: time="2025-09-09T05:02:21.435785280Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 05:02:21.435801 containerd[1498]: time="2025-09-09T05:02:21.435797480Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 9 05:02:21.435863 containerd[1498]: time="2025-09-09T05:02:21.435834920Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 9 05:02:21.436248 containerd[1498]: time="2025-09-09T05:02:21.436070440Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 9 05:02:21.436248 containerd[1498]: time="2025-09-09T05:02:21.436167240Z" level=info msg="metadata content store policy set" policy=shared Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.441727920Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.441786800Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.441802760Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.441817280Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.441829640Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.441840600Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.441854960Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.441873560Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.441885440Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.441896520Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.441906080Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.441917520Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.442029320Z" 
level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 9 05:02:21.442880 containerd[1498]: time="2025-09-09T05:02:21.442049600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 9 05:02:21.443150 containerd[1498]: time="2025-09-09T05:02:21.442063280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 9 05:02:21.443150 containerd[1498]: time="2025-09-09T05:02:21.442079520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 05:02:21.443150 containerd[1498]: time="2025-09-09T05:02:21.442090520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 05:02:21.443150 containerd[1498]: time="2025-09-09T05:02:21.442101440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 05:02:21.443150 containerd[1498]: time="2025-09-09T05:02:21.442112680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 05:02:21.443150 containerd[1498]: time="2025-09-09T05:02:21.442122200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 05:02:21.443150 containerd[1498]: time="2025-09-09T05:02:21.442133520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 05:02:21.443150 containerd[1498]: time="2025-09-09T05:02:21.442144400Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 05:02:21.443150 containerd[1498]: time="2025-09-09T05:02:21.442154920Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 05:02:21.443562 containerd[1498]: time="2025-09-09T05:02:21.443535600Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 05:02:21.443663 containerd[1498]: time="2025-09-09T05:02:21.443612600Z" level=info msg="Start snapshots syncer" Sep 9 05:02:21.443737 containerd[1498]: time="2025-09-09T05:02:21.443722480Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 05:02:21.444350 containerd[1498]: time="2025-09-09T05:02:21.444294440Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 05:02:21.444538 containerd[1498]: time="2025-09-09T05:02:21.444517280Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 05:02:21.444726 containerd[1498]: time="2025-09-09T05:02:21.444699920Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 9 05:02:21.444922 containerd[1498]: time="2025-09-09T05:02:21.444900800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 9 05:02:21.444992 containerd[1498]: time="2025-09-09T05:02:21.444979000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 9 05:02:21.445039 containerd[1498]: time="2025-09-09T05:02:21.445028200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 9 05:02:21.445087 containerd[1498]: time="2025-09-09T05:02:21.445075080Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 9 05:02:21.445139 containerd[1498]: time="2025-09-09T05:02:21.445127600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 9 05:02:21.445272 containerd[1498]: time="2025-09-09T05:02:21.445256200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 9 05:02:21.445325 containerd[1498]: time="2025-09-09T05:02:21.445313760Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 9 05:02:21.445410 containerd[1498]: time="2025-09-09T05:02:21.445395120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 9 05:02:21.445480 containerd[1498]: 
time="2025-09-09T05:02:21.445467240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 9 05:02:21.445529 containerd[1498]: time="2025-09-09T05:02:21.445518160Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 9 05:02:21.445599 containerd[1498]: time="2025-09-09T05:02:21.445585560Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 05:02:21.445666 containerd[1498]: time="2025-09-09T05:02:21.445651160Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 05:02:21.445724 containerd[1498]: time="2025-09-09T05:02:21.445710880Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 05:02:21.445777 containerd[1498]: time="2025-09-09T05:02:21.445764680Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 05:02:21.445819 containerd[1498]: time="2025-09-09T05:02:21.445809080Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 05:02:21.445864 containerd[1498]: time="2025-09-09T05:02:21.445853160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 05:02:21.445911 containerd[1498]: time="2025-09-09T05:02:21.445900520Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 05:02:21.446026 containerd[1498]: time="2025-09-09T05:02:21.446015760Z" level=info msg="runtime interface created" Sep 9 05:02:21.446074 containerd[1498]: time="2025-09-09T05:02:21.446063680Z" level=info msg="created NRI interface" Sep 9 05:02:21.446122 containerd[1498]: time="2025-09-09T05:02:21.446110400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 05:02:21.446170 containerd[1498]: time="2025-09-09T05:02:21.446159880Z" level=info msg="Connect containerd service" Sep 9 05:02:21.446348 containerd[1498]: time="2025-09-09T05:02:21.446329720Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 05:02:21.447103 containerd[1498]: time="2025-09-09T05:02:21.447074840Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 05:02:21.511790 containerd[1498]: time="2025-09-09T05:02:21.511723240Z" level=info msg="Start subscribing containerd event" Sep 9 05:02:21.511958 containerd[1498]: time="2025-09-09T05:02:21.511943720Z" level=info msg="Start recovering state" Sep 9 05:02:21.512245 containerd[1498]: time="2025-09-09T05:02:21.512228560Z" level=info msg="Start event monitor" Sep 9 05:02:21.512432 containerd[1498]: time="2025-09-09T05:02:21.512419520Z" level=info msg="Start cni network conf syncer for default" Sep 9 05:02:21.512495 containerd[1498]: time="2025-09-09T05:02:21.512485240Z" level=info msg="Start streaming server" Sep 9 05:02:21.512542 containerd[1498]: time="2025-09-09T05:02:21.512531200Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 9 05:02:21.512861 containerd[1498]: time="2025-09-09T05:02:21.512845080Z" level=info 
msg="runtime interface starting up..." Sep 9 05:02:21.512922 containerd[1498]: time="2025-09-09T05:02:21.512911520Z" level=info msg="starting plugins..." Sep 9 05:02:21.512975 containerd[1498]: time="2025-09-09T05:02:21.512964560Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 05:02:21.513259 containerd[1498]: time="2025-09-09T05:02:21.513239840Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 05:02:21.513475 containerd[1498]: time="2025-09-09T05:02:21.513449480Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 9 05:02:21.513677 containerd[1498]: time="2025-09-09T05:02:21.513653000Z" level=info msg="containerd successfully booted in 0.092840s" Sep 9 05:02:21.513746 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 05:02:21.529406 tar[1492]: linux-arm64/LICENSE Sep 9 05:02:21.529474 tar[1492]: linux-arm64/README.md Sep 9 05:02:21.548240 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 05:02:22.688461 systemd-networkd[1442]: eth0: Gained IPv6LL Sep 9 05:02:22.690844 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 05:02:22.694757 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 05:02:22.697294 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 9 05:02:22.699814 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:02:22.703385 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 05:02:22.731134 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 9 05:02:22.733276 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 9 05:02:22.735478 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 9 05:02:22.738964 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 05:02:22.946009 sshd_keygen[1494]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 05:02:22.971132 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 9 05:02:22.975443 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 05:02:22.998227 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 05:02:22.999255 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 05:02:23.002022 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 05:02:23.025461 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 05:02:23.028520 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 05:02:23.030727 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Sep 9 05:02:23.032141 systemd[1]: Reached target getty.target - Login Prompts. Sep 9 05:02:23.266975 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:02:23.268554 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 9 05:02:23.269634 systemd[1]: Startup finished in 2.008s (kernel) + 5.036s (initrd) + 3.679s (userspace) = 10.724s. 
Sep 9 05:02:23.270829 (kubelet)[1608]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:02:23.632256 kubelet[1608]: E0909 05:02:23.632146 1608 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:02:23.634495 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:02:23.634619 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:02:23.634917 systemd[1]: kubelet.service: Consumed 771ms CPU time, 257M memory peak. Sep 9 05:02:27.798550 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 9 05:02:27.799616 systemd[1]: Started sshd@0-10.0.0.90:22-10.0.0.1:34232.service - OpenSSH per-connection server daemon (10.0.0.1:34232). Sep 9 05:02:27.878146 sshd[1621]: Accepted publickey for core from 10.0.0.1 port 34232 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:02:27.882769 sshd-session[1621]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:02:27.892988 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 05:02:27.895278 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 05:02:27.905728 systemd-logind[1481]: New session 1 of user core. Sep 9 05:02:27.914262 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 05:02:27.916964 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 05:02:27.936483 (systemd)[1626]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 05:02:27.938881 systemd-logind[1481]: New session c1 of user core. Sep 9 05:02:28.067171 systemd[1626]: Queued start job for default target default.target. Sep 9 05:02:28.074228 systemd[1626]: Created slice app.slice - User Application Slice. Sep 9 05:02:28.074402 systemd[1626]: Reached target paths.target - Paths. Sep 9 05:02:28.074571 systemd[1626]: Reached target timers.target - Timers. Sep 9 05:02:28.075982 systemd[1626]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 05:02:28.085774 systemd[1626]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 05:02:28.085840 systemd[1626]: Reached target sockets.target - Sockets. Sep 9 05:02:28.085941 systemd[1626]: Reached target basic.target - Basic System. Sep 9 05:02:28.086010 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 05:02:28.086260 systemd[1626]: Reached target default.target - Main User Target. Sep 9 05:02:28.086299 systemd[1626]: Startup finished in 140ms. Sep 9 05:02:28.087412 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 05:02:28.150562 systemd[1]: Started sshd@1-10.0.0.90:22-10.0.0.1:34242.service - OpenSSH per-connection server daemon (10.0.0.1:34242). Sep 9 05:02:28.202437 sshd[1637]: Accepted publickey for core from 10.0.0.1 port 34242 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:02:28.203821 sshd-session[1637]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:02:28.208972 systemd-logind[1481]: New session 2 of user core. 
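The kubelet exits here because /var/lib/kubelet/config.yaml does not exist yet; in a kubeadm-style bring-up that file is only written by `kubeadm init` or `kubeadm join`, so the unit crash-loops (systemd schedules the restarts seen further down) until that happens. For orientation only, a minimal KubeletConfiguration of the kind that ends up at that path looks roughly like the sketch below; every value is illustrative rather than read from this host:

    # /var/lib/kubelet/config.yaml -- minimal sketch; normally generated, not hand-written
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd                      # matches the systemd cgroup driver the CRI runtime reports later in this log
    staticPodPath: /etc/kubernetes/manifests   # where control-plane static pod manifests are read from
    clusterDomain: cluster.local
    clusterDNS:
      - 10.96.0.10                             # assumed cluster DNS service IP, illustrative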
Sep 9 05:02:28.220408 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 9 05:02:28.272254 sshd[1640]: Connection closed by 10.0.0.1 port 34242 Sep 9 05:02:28.272569 sshd-session[1637]: pam_unix(sshd:session): session closed for user core Sep 9 05:02:28.286317 systemd[1]: sshd@1-10.0.0.90:22-10.0.0.1:34242.service: Deactivated successfully. Sep 9 05:02:28.288543 systemd[1]: session-2.scope: Deactivated successfully. Sep 9 05:02:28.289248 systemd-logind[1481]: Session 2 logged out. Waiting for processes to exit. Sep 9 05:02:28.291616 systemd[1]: Started sshd@2-10.0.0.90:22-10.0.0.1:34252.service - OpenSSH per-connection server daemon (10.0.0.1:34252). Sep 9 05:02:28.292742 systemd-logind[1481]: Removed session 2. Sep 9 05:02:28.338014 sshd[1646]: Accepted publickey for core from 10.0.0.1 port 34252 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:02:28.339130 sshd-session[1646]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:02:28.343287 systemd-logind[1481]: New session 3 of user core. Sep 9 05:02:28.353404 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 9 05:02:28.401256 sshd[1649]: Connection closed by 10.0.0.1 port 34252 Sep 9 05:02:28.401522 sshd-session[1646]: pam_unix(sshd:session): session closed for user core Sep 9 05:02:28.415372 systemd[1]: sshd@2-10.0.0.90:22-10.0.0.1:34252.service: Deactivated successfully. Sep 9 05:02:28.417102 systemd[1]: session-3.scope: Deactivated successfully. Sep 9 05:02:28.417950 systemd-logind[1481]: Session 3 logged out. Waiting for processes to exit. Sep 9 05:02:28.420717 systemd[1]: Started sshd@3-10.0.0.90:22-10.0.0.1:34256.service - OpenSSH per-connection server daemon (10.0.0.1:34256). Sep 9 05:02:28.421476 systemd-logind[1481]: Removed session 3. Sep 9 05:02:28.471781 sshd[1655]: Accepted publickey for core from 10.0.0.1 port 34256 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:02:28.473021 sshd-session[1655]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:02:28.476864 systemd-logind[1481]: New session 4 of user core. Sep 9 05:02:28.491372 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 9 05:02:28.543261 sshd[1658]: Connection closed by 10.0.0.1 port 34256 Sep 9 05:02:28.543510 sshd-session[1655]: pam_unix(sshd:session): session closed for user core Sep 9 05:02:28.560300 systemd[1]: sshd@3-10.0.0.90:22-10.0.0.1:34256.service: Deactivated successfully. Sep 9 05:02:28.561987 systemd[1]: session-4.scope: Deactivated successfully. Sep 9 05:02:28.563070 systemd-logind[1481]: Session 4 logged out. Waiting for processes to exit. Sep 9 05:02:28.565619 systemd[1]: Started sshd@4-10.0.0.90:22-10.0.0.1:34270.service - OpenSSH per-connection server daemon (10.0.0.1:34270). Sep 9 05:02:28.566049 systemd-logind[1481]: Removed session 4. Sep 9 05:02:28.630381 sshd[1664]: Accepted publickey for core from 10.0.0.1 port 34270 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:02:28.631708 sshd-session[1664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:02:28.636258 systemd-logind[1481]: New session 5 of user core. Sep 9 05:02:28.647367 systemd[1]: Started session-5.scope - Session 5 of User core. 
Sep 9 05:02:28.704419 sudo[1668]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 05:02:28.704685 sudo[1668]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:02:28.717114 sudo[1668]: pam_unix(sudo:session): session closed for user root Sep 9 05:02:28.719226 sshd[1667]: Connection closed by 10.0.0.1 port 34270 Sep 9 05:02:28.719220 sshd-session[1664]: pam_unix(sshd:session): session closed for user core Sep 9 05:02:28.732278 systemd[1]: sshd@4-10.0.0.90:22-10.0.0.1:34270.service: Deactivated successfully. Sep 9 05:02:28.734058 systemd[1]: session-5.scope: Deactivated successfully. Sep 9 05:02:28.735960 systemd-logind[1481]: Session 5 logged out. Waiting for processes to exit. Sep 9 05:02:28.738012 systemd[1]: Started sshd@5-10.0.0.90:22-10.0.0.1:34284.service - OpenSSH per-connection server daemon (10.0.0.1:34284). Sep 9 05:02:28.738962 systemd-logind[1481]: Removed session 5. Sep 9 05:02:28.802550 sshd[1674]: Accepted publickey for core from 10.0.0.1 port 34284 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:02:28.803811 sshd-session[1674]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:02:28.807584 systemd-logind[1481]: New session 6 of user core. Sep 9 05:02:28.817377 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 9 05:02:28.868145 sudo[1679]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 9 05:02:28.868426 sudo[1679]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:02:28.941550 sudo[1679]: pam_unix(sudo:session): session closed for user root Sep 9 05:02:28.946347 sudo[1678]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 9 05:02:28.946602 sudo[1678]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:02:28.954813 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 05:02:28.990749 augenrules[1701]: No rules Sep 9 05:02:28.992027 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 05:02:28.992289 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 05:02:28.993204 sudo[1678]: pam_unix(sudo:session): session closed for user root Sep 9 05:02:28.994632 sshd[1677]: Connection closed by 10.0.0.1 port 34284 Sep 9 05:02:28.995004 sshd-session[1674]: pam_unix(sshd:session): session closed for user core Sep 9 05:02:29.002025 systemd[1]: sshd@5-10.0.0.90:22-10.0.0.1:34284.service: Deactivated successfully. Sep 9 05:02:29.004439 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 05:02:29.005667 systemd-logind[1481]: Session 6 logged out. Waiting for processes to exit. Sep 9 05:02:29.007175 systemd[1]: Started sshd@6-10.0.0.90:22-10.0.0.1:34294.service - OpenSSH per-connection server daemon (10.0.0.1:34294). Sep 9 05:02:29.008343 systemd-logind[1481]: Removed session 6. Sep 9 05:02:29.061724 sshd[1710]: Accepted publickey for core from 10.0.0.1 port 34294 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:02:29.063125 sshd-session[1710]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:02:29.067263 systemd-logind[1481]: New session 7 of user core. Sep 9 05:02:29.074355 systemd[1]: Started session-7.scope - Session 7 of User core. 
Sep 9 05:02:29.125482 sudo[1714]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 9 05:02:29.125754 sudo[1714]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:02:29.420912 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 9 05:02:29.439548 (dockerd)[1735]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 9 05:02:29.637171 dockerd[1735]: time="2025-09-09T05:02:29.637101844Z" level=info msg="Starting up" Sep 9 05:02:29.639132 dockerd[1735]: time="2025-09-09T05:02:29.639049067Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 9 05:02:29.650075 dockerd[1735]: time="2025-09-09T05:02:29.650021090Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 9 05:02:29.962514 dockerd[1735]: time="2025-09-09T05:02:29.962443281Z" level=info msg="Loading containers: start." Sep 9 05:02:29.978475 kernel: Initializing XFRM netlink socket Sep 9 05:02:30.198414 systemd-networkd[1442]: docker0: Link UP Sep 9 05:02:30.202538 dockerd[1735]: time="2025-09-09T05:02:30.202497610Z" level=info msg="Loading containers: done." Sep 9 05:02:30.217230 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1926506297-merged.mount: Deactivated successfully. Sep 9 05:02:30.220225 dockerd[1735]: time="2025-09-09T05:02:30.220148669Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 9 05:02:30.220347 dockerd[1735]: time="2025-09-09T05:02:30.220317316Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 9 05:02:30.220457 dockerd[1735]: time="2025-09-09T05:02:30.220430580Z" level=info msg="Initializing buildkit" Sep 9 05:02:30.243899 dockerd[1735]: time="2025-09-09T05:02:30.243821465Z" level=info msg="Completed buildkit initialization" Sep 9 05:02:30.248902 dockerd[1735]: time="2025-09-09T05:02:30.248835043Z" level=info msg="Daemon has completed initialization" Sep 9 05:02:30.249143 dockerd[1735]: time="2025-09-09T05:02:30.248945889Z" level=info msg="API listen on /run/docker.sock" Sep 9 05:02:30.249095 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 9 05:02:30.810204 containerd[1498]: time="2025-09-09T05:02:30.810153377Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\"" Sep 9 05:02:31.521998 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3923845706.mount: Deactivated successfully. 
Sep 9 05:02:32.865681 containerd[1498]: time="2025-09-09T05:02:32.865617459Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:32.866907 containerd[1498]: time="2025-09-09T05:02:32.866863801Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=25652443" Sep 9 05:02:32.867750 containerd[1498]: time="2025-09-09T05:02:32.867729042Z" level=info msg="ImageCreate event name:\"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:32.870068 containerd[1498]: time="2025-09-09T05:02:32.870034709Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:32.871364 containerd[1498]: time="2025-09-09T05:02:32.871339657Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"25649241\" in 2.061084683s" Sep 9 05:02:32.871430 containerd[1498]: time="2025-09-09T05:02:32.871374479Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\"" Sep 9 05:02:32.872596 containerd[1498]: time="2025-09-09T05:02:32.872574921Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\"" Sep 9 05:02:33.885003 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 9 05:02:33.886460 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:02:34.035765 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:02:34.040005 (kubelet)[2017]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:02:34.083955 kubelet[2017]: E0909 05:02:34.083907 2017 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:02:34.087088 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:02:34.087268 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:02:34.089287 systemd[1]: kubelet.service: Consumed 152ms CPU time, 107.9M memory peak. 
Sep 9 05:02:34.305080 containerd[1498]: time="2025-09-09T05:02:34.304964810Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:34.306586 containerd[1498]: time="2025-09-09T05:02:34.306546072Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=22460311" Sep 9 05:02:34.307516 containerd[1498]: time="2025-09-09T05:02:34.307467941Z" level=info msg="ImageCreate event name:\"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:34.310125 containerd[1498]: time="2025-09-09T05:02:34.310063031Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:34.311724 containerd[1498]: time="2025-09-09T05:02:34.311375014Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"23997423\" in 1.438768713s" Sep 9 05:02:34.311724 containerd[1498]: time="2025-09-09T05:02:34.311408187Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\"" Sep 9 05:02:34.312130 containerd[1498]: time="2025-09-09T05:02:34.312107633Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\"" Sep 9 05:02:35.532363 containerd[1498]: time="2025-09-09T05:02:35.532313266Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:35.532981 containerd[1498]: time="2025-09-09T05:02:35.532930933Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=17125905" Sep 9 05:02:35.534119 containerd[1498]: time="2025-09-09T05:02:35.534073924Z" level=info msg="ImageCreate event name:\"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:35.537092 containerd[1498]: time="2025-09-09T05:02:35.537053914Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:35.538650 containerd[1498]: time="2025-09-09T05:02:35.538619894Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"18663035\" in 1.226481589s" Sep 9 05:02:35.538689 containerd[1498]: time="2025-09-09T05:02:35.538655677Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\"" Sep 9 05:02:35.539102 
containerd[1498]: time="2025-09-09T05:02:35.539071900Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\"" Sep 9 05:02:36.877727 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3579327340.mount: Deactivated successfully. Sep 9 05:02:37.086444 containerd[1498]: time="2025-09-09T05:02:37.086375100Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:37.086953 containerd[1498]: time="2025-09-09T05:02:37.086917928Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=26916097" Sep 9 05:02:37.087716 containerd[1498]: time="2025-09-09T05:02:37.087652352Z" level=info msg="ImageCreate event name:\"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:37.089227 containerd[1498]: time="2025-09-09T05:02:37.089184321Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:37.089916 containerd[1498]: time="2025-09-09T05:02:37.089885471Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"26915114\" in 1.550782915s" Sep 9 05:02:37.089916 containerd[1498]: time="2025-09-09T05:02:37.089916633Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\"" Sep 9 05:02:37.090411 containerd[1498]: time="2025-09-09T05:02:37.090307635Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 9 05:02:37.562214 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4031299917.mount: Deactivated successfully. 
Sep 9 05:02:38.344217 containerd[1498]: time="2025-09-09T05:02:38.343737426Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:38.344562 containerd[1498]: time="2025-09-09T05:02:38.344321176Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951624" Sep 9 05:02:38.345387 containerd[1498]: time="2025-09-09T05:02:38.345359384Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:38.348004 containerd[1498]: time="2025-09-09T05:02:38.347947588Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:38.349811 containerd[1498]: time="2025-09-09T05:02:38.349778614Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.259441089s" Sep 9 05:02:38.349811 containerd[1498]: time="2025-09-09T05:02:38.349809751Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Sep 9 05:02:38.350235 containerd[1498]: time="2025-09-09T05:02:38.350204845Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 9 05:02:38.762031 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount986689486.mount: Deactivated successfully. 
Sep 9 05:02:38.766684 containerd[1498]: time="2025-09-09T05:02:38.766647658Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 05:02:38.767119 containerd[1498]: time="2025-09-09T05:02:38.767091710Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705" Sep 9 05:02:38.767897 containerd[1498]: time="2025-09-09T05:02:38.767865273Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 05:02:38.769838 containerd[1498]: time="2025-09-09T05:02:38.769810522Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 05:02:38.770451 containerd[1498]: time="2025-09-09T05:02:38.770418233Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 420.172722ms" Sep 9 05:02:38.770487 containerd[1498]: time="2025-09-09T05:02:38.770454792Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Sep 9 05:02:38.770922 containerd[1498]: time="2025-09-09T05:02:38.770902193Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 9 05:02:39.247467 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount458083116.mount: Deactivated successfully. 
Sep 9 05:02:41.051030 containerd[1498]: time="2025-09-09T05:02:41.050876230Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:41.053089 containerd[1498]: time="2025-09-09T05:02:41.053044074Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66537163" Sep 9 05:02:41.054054 containerd[1498]: time="2025-09-09T05:02:41.054020794Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:41.057096 containerd[1498]: time="2025-09-09T05:02:41.057058713Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:02:41.059906 containerd[1498]: time="2025-09-09T05:02:41.059851415Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.288922183s" Sep 9 05:02:41.059906 containerd[1498]: time="2025-09-09T05:02:41.059889929Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\"" Sep 9 05:02:44.337770 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 9 05:02:44.339507 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:02:44.555408 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:02:44.569536 (kubelet)[2179]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:02:44.611207 kubelet[2179]: E0909 05:02:44.611066 2179 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:02:44.613544 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:02:44.613685 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:02:44.614181 systemd[1]: kubelet.service: Consumed 135ms CPU time, 108M memory peak. Sep 9 05:02:47.341756 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:02:47.341896 systemd[1]: kubelet.service: Consumed 135ms CPU time, 108M memory peak. Sep 9 05:02:47.345632 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:02:47.367467 systemd[1]: Reload requested from client PID 2195 ('systemctl') (unit session-7.scope)... Sep 9 05:02:47.367481 systemd[1]: Reloading... Sep 9 05:02:47.428337 zram_generator::config[2237]: No configuration found. Sep 9 05:02:47.595274 systemd[1]: Reloading finished in 227 ms. Sep 9 05:02:47.649006 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:02:47.652057 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
Sep 9 05:02:47.652742 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 05:02:47.652975 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:02:47.653013 systemd[1]: kubelet.service: Consumed 93ms CPU time, 95.1M memory peak. Sep 9 05:02:47.654407 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:02:47.778388 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:02:47.783325 (kubelet)[2284]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 05:02:47.815127 kubelet[2284]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:02:47.815127 kubelet[2284]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 9 05:02:47.815127 kubelet[2284]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:02:47.815487 kubelet[2284]: I0909 05:02:47.815164 2284 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 05:02:49.116225 kubelet[2284]: I0909 05:02:49.115637 2284 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 9 05:02:49.116225 kubelet[2284]: I0909 05:02:49.115679 2284 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 05:02:49.116225 kubelet[2284]: I0909 05:02:49.115897 2284 server.go:934] "Client rotation is on, will bootstrap in background" Sep 9 05:02:49.135249 kubelet[2284]: E0909 05:02:49.135183 2284 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.90:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.90:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:02:49.136092 kubelet[2284]: I0909 05:02:49.136066 2284 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 05:02:49.143405 kubelet[2284]: I0909 05:02:49.143369 2284 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 05:02:49.148922 kubelet[2284]: I0909 05:02:49.147826 2284 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 05:02:49.148922 kubelet[2284]: I0909 05:02:49.148115 2284 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 9 05:02:49.148922 kubelet[2284]: I0909 05:02:49.148236 2284 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 05:02:49.148922 kubelet[2284]: I0909 05:02:49.148260 2284 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 05:02:49.149157 kubelet[2284]: I0909 05:02:49.148528 2284 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 05:02:49.149157 kubelet[2284]: I0909 05:02:49.148536 2284 container_manager_linux.go:300] "Creating device plugin manager" Sep 9 05:02:49.149157 kubelet[2284]: I0909 05:02:49.148758 2284 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:02:49.150988 kubelet[2284]: I0909 05:02:49.150971 2284 kubelet.go:408] "Attempting to sync node with API server" Sep 9 05:02:49.151064 kubelet[2284]: I0909 05:02:49.151054 2284 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 05:02:49.151145 kubelet[2284]: I0909 05:02:49.151135 2284 kubelet.go:314] "Adding apiserver pod source" Sep 9 05:02:49.151278 kubelet[2284]: I0909 05:02:49.151268 2284 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 05:02:49.154628 kubelet[2284]: I0909 05:02:49.154607 2284 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 05:02:49.155404 kubelet[2284]: I0909 05:02:49.155378 2284 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 05:02:49.155574 kubelet[2284]: W0909 05:02:49.155562 2284 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Sep 9 05:02:49.156517 kubelet[2284]: I0909 05:02:49.156488 2284 server.go:1274] "Started kubelet" Sep 9 05:02:49.157663 kubelet[2284]: I0909 05:02:49.157339 2284 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 05:02:49.158345 kubelet[2284]: W0909 05:02:49.158287 2284 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.90:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.90:6443: connect: connection refused Sep 9 05:02:49.158383 kubelet[2284]: E0909 05:02:49.158347 2284 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.90:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.90:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:02:49.158751 kubelet[2284]: I0909 05:02:49.158723 2284 server.go:449] "Adding debug handlers to kubelet server" Sep 9 05:02:49.158987 kubelet[2284]: I0909 05:02:49.158948 2284 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 05:02:49.159189 kubelet[2284]: I0909 05:02:49.159176 2284 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 05:02:49.159892 kubelet[2284]: W0909 05:02:49.159813 2284 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.90:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.90:6443: connect: connection refused Sep 9 05:02:49.159892 kubelet[2284]: E0909 05:02:49.159856 2284 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.90:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.90:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:02:49.161697 kubelet[2284]: I0909 05:02:49.160312 2284 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 05:02:49.161697 kubelet[2284]: I0909 05:02:49.160323 2284 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 05:02:49.161697 kubelet[2284]: I0909 05:02:49.160391 2284 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 9 05:02:49.161697 kubelet[2284]: I0909 05:02:49.160898 2284 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 9 05:02:49.161697 kubelet[2284]: I0909 05:02:49.160940 2284 reconciler.go:26] "Reconciler: start to sync state" Sep 9 05:02:49.161697 kubelet[2284]: W0909 05:02:49.161431 2284 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.90:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.90:6443: connect: connection refused Sep 9 05:02:49.161697 kubelet[2284]: E0909 05:02:49.161468 2284 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.90:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.90:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:02:49.161697 kubelet[2284]: 
E0909 05:02:49.161521 2284 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 05:02:49.162052 kubelet[2284]: E0909 05:02:49.162027 2284 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.90:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.90:6443: connect: connection refused" interval="200ms" Sep 9 05:02:49.162808 kubelet[2284]: E0909 05:02:49.162783 2284 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 05:02:49.162978 kubelet[2284]: I0909 05:02:49.162958 2284 factory.go:221] Registration of the systemd container factory successfully Sep 9 05:02:49.163060 kubelet[2284]: I0909 05:02:49.163046 2284 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 05:02:49.164748 kubelet[2284]: E0909 05:02:49.163482 2284 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.90:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.90:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.186384af8ee9ff76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-09 05:02:49.156468598 +0000 UTC m=+1.370131302,LastTimestamp:2025-09-09 05:02:49.156468598 +0000 UTC m=+1.370131302,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 9 05:02:49.165167 kubelet[2284]: I0909 05:02:49.164854 2284 factory.go:221] Registration of the containerd container factory successfully Sep 9 05:02:49.174372 kubelet[2284]: I0909 05:02:49.174335 2284 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 9 05:02:49.174372 kubelet[2284]: I0909 05:02:49.174354 2284 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 9 05:02:49.174372 kubelet[2284]: I0909 05:02:49.174375 2284 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:02:49.175704 kubelet[2284]: I0909 05:02:49.175657 2284 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 05:02:49.176747 kubelet[2284]: I0909 05:02:49.176721 2284 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 9 05:02:49.176747 kubelet[2284]: I0909 05:02:49.176750 2284 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 9 05:02:49.176816 kubelet[2284]: I0909 05:02:49.176768 2284 kubelet.go:2321] "Starting kubelet main sync loop" Sep 9 05:02:49.176860 kubelet[2284]: E0909 05:02:49.176812 2284 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 05:02:49.261658 kubelet[2284]: E0909 05:02:49.261603 2284 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 05:02:49.277929 kubelet[2284]: E0909 05:02:49.277872 2284 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 9 05:02:49.361799 kubelet[2284]: E0909 05:02:49.361732 2284 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 05:02:49.363322 kubelet[2284]: E0909 05:02:49.363276 2284 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.90:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.90:6443: connect: connection refused" interval="400ms" Sep 9 05:02:49.400610 kubelet[2284]: I0909 05:02:49.400529 2284 policy_none.go:49] "None policy: Start" Sep 9 05:02:49.401138 kubelet[2284]: W0909 05:02:49.400941 2284 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.90:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.90:6443: connect: connection refused Sep 9 05:02:49.401138 kubelet[2284]: E0909 05:02:49.401002 2284 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.90:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.90:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:02:49.401735 kubelet[2284]: I0909 05:02:49.401714 2284 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 9 05:02:49.401794 kubelet[2284]: I0909 05:02:49.401742 2284 state_mem.go:35] "Initializing new in-memory state store" Sep 9 05:02:49.407329 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 9 05:02:49.419483 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 9 05:02:49.424495 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Sep 9 05:02:49.446605 kubelet[2284]: I0909 05:02:49.446065 2284 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 05:02:49.446605 kubelet[2284]: I0909 05:02:49.446560 2284 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 05:02:49.446922 kubelet[2284]: I0909 05:02:49.446872 2284 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 05:02:49.447431 kubelet[2284]: I0909 05:02:49.447392 2284 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 05:02:49.449160 kubelet[2284]: E0909 05:02:49.448991 2284 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 9 05:02:49.488887 systemd[1]: Created slice kubepods-burstable-podc33a320f4131fb8010d54cfd97eb9024.slice - libcontainer container kubepods-burstable-podc33a320f4131fb8010d54cfd97eb9024.slice. Sep 9 05:02:49.512181 systemd[1]: Created slice kubepods-burstable-podfec3f691a145cb26ff55e4af388500b7.slice - libcontainer container kubepods-burstable-podfec3f691a145cb26ff55e4af388500b7.slice. Sep 9 05:02:49.516007 systemd[1]: Created slice kubepods-burstable-pod5dc878868de11c6196259ae42039f4ff.slice - libcontainer container kubepods-burstable-pod5dc878868de11c6196259ae42039f4ff.slice. Sep 9 05:02:49.548953 kubelet[2284]: I0909 05:02:49.548833 2284 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 9 05:02:49.549641 kubelet[2284]: E0909 05:02:49.549563 2284 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.90:6443/api/v1/nodes\": dial tcp 10.0.0.90:6443: connect: connection refused" node="localhost" Sep 9 05:02:49.562972 kubelet[2284]: I0909 05:02:49.562950 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:02:49.562972 kubelet[2284]: I0909 05:02:49.562979 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dc878868de11c6196259ae42039f4ff-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"5dc878868de11c6196259ae42039f4ff\") " pod="kube-system/kube-scheduler-localhost" Sep 9 05:02:49.563055 kubelet[2284]: I0909 05:02:49.562997 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c33a320f4131fb8010d54cfd97eb9024-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"c33a320f4131fb8010d54cfd97eb9024\") " pod="kube-system/kube-apiserver-localhost" Sep 9 05:02:49.563055 kubelet[2284]: I0909 05:02:49.563033 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c33a320f4131fb8010d54cfd97eb9024-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"c33a320f4131fb8010d54cfd97eb9024\") " pod="kube-system/kube-apiserver-localhost" Sep 9 05:02:49.563055 kubelet[2284]: I0909 05:02:49.563051 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:02:49.563055 kubelet[2284]: I0909 05:02:49.563083 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:02:49.563055 kubelet[2284]: I0909 05:02:49.563101 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c33a320f4131fb8010d54cfd97eb9024-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"c33a320f4131fb8010d54cfd97eb9024\") " pod="kube-system/kube-apiserver-localhost" Sep 9 05:02:49.563331 kubelet[2284]: I0909 05:02:49.563116 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:02:49.563331 kubelet[2284]: I0909 05:02:49.563132 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:02:49.751069 kubelet[2284]: I0909 05:02:49.750998 2284 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 9 05:02:49.751380 kubelet[2284]: E0909 05:02:49.751345 2284 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.90:6443/api/v1/nodes\": dial tcp 10.0.0.90:6443: connect: connection refused" node="localhost" Sep 9 05:02:49.763857 kubelet[2284]: E0909 05:02:49.763807 2284 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.90:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.90:6443: connect: connection refused" interval="800ms" Sep 9 05:02:49.809880 containerd[1498]: time="2025-09-09T05:02:49.809804643Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:c33a320f4131fb8010d54cfd97eb9024,Namespace:kube-system,Attempt:0,}" Sep 9 05:02:49.815618 containerd[1498]: time="2025-09-09T05:02:49.815374615Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fec3f691a145cb26ff55e4af388500b7,Namespace:kube-system,Attempt:0,}" Sep 9 05:02:49.818748 containerd[1498]: time="2025-09-09T05:02:49.818557239Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:5dc878868de11c6196259ae42039f4ff,Namespace:kube-system,Attempt:0,}" Sep 9 05:02:49.839094 containerd[1498]: time="2025-09-09T05:02:49.839043609Z" level=info msg="connecting to shim 50beb027740f45dfb656e7bc9258b4780f395f254c12c8769f26f599a88bea59" address="unix:///run/containerd/s/10e6231359c0909963157ace55589e3ca73da04ebe76a38c5eee6cc46813a61c" namespace=k8s.io protocol=ttrpc 
version=3 Sep 9 05:02:49.864267 containerd[1498]: time="2025-09-09T05:02:49.864227893Z" level=info msg="connecting to shim 24b2db822c7a343059276588f08a0d1f45653db915afd28ea166801d7bbd1108" address="unix:///run/containerd/s/6f7cb97219fad6bf32ca648174c4d6f6d5c9bb4f8d656bc0b68568c34f542960" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:02:49.877523 containerd[1498]: time="2025-09-09T05:02:49.877473399Z" level=info msg="connecting to shim aa204a3485475ab7360e337a06bc3d95fdbe9443b6d174bb3dc3ff418332eab9" address="unix:///run/containerd/s/3312cc0046356ab297b76068d427119863537906f4cb38fcbbba662cd04bc4ba" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:02:49.887352 systemd[1]: Started cri-containerd-50beb027740f45dfb656e7bc9258b4780f395f254c12c8769f26f599a88bea59.scope - libcontainer container 50beb027740f45dfb656e7bc9258b4780f395f254c12c8769f26f599a88bea59. Sep 9 05:02:49.891085 systemd[1]: Started cri-containerd-24b2db822c7a343059276588f08a0d1f45653db915afd28ea166801d7bbd1108.scope - libcontainer container 24b2db822c7a343059276588f08a0d1f45653db915afd28ea166801d7bbd1108. Sep 9 05:02:49.900946 systemd[1]: Started cri-containerd-aa204a3485475ab7360e337a06bc3d95fdbe9443b6d174bb3dc3ff418332eab9.scope - libcontainer container aa204a3485475ab7360e337a06bc3d95fdbe9443b6d174bb3dc3ff418332eab9. Sep 9 05:02:49.932398 containerd[1498]: time="2025-09-09T05:02:49.932178395Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:c33a320f4131fb8010d54cfd97eb9024,Namespace:kube-system,Attempt:0,} returns sandbox id \"50beb027740f45dfb656e7bc9258b4780f395f254c12c8769f26f599a88bea59\"" Sep 9 05:02:49.936757 containerd[1498]: time="2025-09-09T05:02:49.936725304Z" level=info msg="CreateContainer within sandbox \"50beb027740f45dfb656e7bc9258b4780f395f254c12c8769f26f599a88bea59\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 05:02:49.939361 containerd[1498]: time="2025-09-09T05:02:49.939272450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fec3f691a145cb26ff55e4af388500b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"aa204a3485475ab7360e337a06bc3d95fdbe9443b6d174bb3dc3ff418332eab9\"" Sep 9 05:02:49.942801 containerd[1498]: time="2025-09-09T05:02:49.942676626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:5dc878868de11c6196259ae42039f4ff,Namespace:kube-system,Attempt:0,} returns sandbox id \"24b2db822c7a343059276588f08a0d1f45653db915afd28ea166801d7bbd1108\"" Sep 9 05:02:49.942801 containerd[1498]: time="2025-09-09T05:02:49.942722831Z" level=info msg="CreateContainer within sandbox \"aa204a3485475ab7360e337a06bc3d95fdbe9443b6d174bb3dc3ff418332eab9\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 05:02:49.944889 containerd[1498]: time="2025-09-09T05:02:49.944860049Z" level=info msg="CreateContainer within sandbox \"24b2db822c7a343059276588f08a0d1f45653db915afd28ea166801d7bbd1108\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 05:02:49.946113 containerd[1498]: time="2025-09-09T05:02:49.945855333Z" level=info msg="Container c755ec3858f3bcb973964f31d41595839ebec582be8143c962c77c21f445c1fc: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:02:49.953795 containerd[1498]: time="2025-09-09T05:02:49.953760773Z" level=info msg="Container c1b664b2500c353b4928ee055353f289f3dd8ca5dc95ec446d2ad31f466fcce1: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:02:49.956171 containerd[1498]: 
time="2025-09-09T05:02:49.955862378Z" level=info msg="Container c519b22dcfd5d84ebda759c428905a9f2484803150731182f8feec6b84998765: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:02:49.956171 containerd[1498]: time="2025-09-09T05:02:49.955981007Z" level=info msg="CreateContainer within sandbox \"50beb027740f45dfb656e7bc9258b4780f395f254c12c8769f26f599a88bea59\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"c755ec3858f3bcb973964f31d41595839ebec582be8143c962c77c21f445c1fc\"" Sep 9 05:02:49.956775 containerd[1498]: time="2025-09-09T05:02:49.956748505Z" level=info msg="StartContainer for \"c755ec3858f3bcb973964f31d41595839ebec582be8143c962c77c21f445c1fc\"" Sep 9 05:02:49.958396 containerd[1498]: time="2025-09-09T05:02:49.958340976Z" level=info msg="connecting to shim c755ec3858f3bcb973964f31d41595839ebec582be8143c962c77c21f445c1fc" address="unix:///run/containerd/s/10e6231359c0909963157ace55589e3ca73da04ebe76a38c5eee6cc46813a61c" protocol=ttrpc version=3 Sep 9 05:02:49.959954 containerd[1498]: time="2025-09-09T05:02:49.959924214Z" level=info msg="CreateContainer within sandbox \"aa204a3485475ab7360e337a06bc3d95fdbe9443b6d174bb3dc3ff418332eab9\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"c1b664b2500c353b4928ee055353f289f3dd8ca5dc95ec446d2ad31f466fcce1\"" Sep 9 05:02:49.960466 containerd[1498]: time="2025-09-09T05:02:49.960442781Z" level=info msg="StartContainer for \"c1b664b2500c353b4928ee055353f289f3dd8ca5dc95ec446d2ad31f466fcce1\"" Sep 9 05:02:49.962215 containerd[1498]: time="2025-09-09T05:02:49.961881289Z" level=info msg="connecting to shim c1b664b2500c353b4928ee055353f289f3dd8ca5dc95ec446d2ad31f466fcce1" address="unix:///run/containerd/s/3312cc0046356ab297b76068d427119863537906f4cb38fcbbba662cd04bc4ba" protocol=ttrpc version=3 Sep 9 05:02:49.962915 containerd[1498]: time="2025-09-09T05:02:49.962887845Z" level=info msg="CreateContainer within sandbox \"24b2db822c7a343059276588f08a0d1f45653db915afd28ea166801d7bbd1108\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c519b22dcfd5d84ebda759c428905a9f2484803150731182f8feec6b84998765\"" Sep 9 05:02:49.963285 containerd[1498]: time="2025-09-09T05:02:49.963258204Z" level=info msg="StartContainer for \"c519b22dcfd5d84ebda759c428905a9f2484803150731182f8feec6b84998765\"" Sep 9 05:02:49.964539 containerd[1498]: time="2025-09-09T05:02:49.964414646Z" level=info msg="connecting to shim c519b22dcfd5d84ebda759c428905a9f2484803150731182f8feec6b84998765" address="unix:///run/containerd/s/6f7cb97219fad6bf32ca648174c4d6f6d5c9bb4f8d656bc0b68568c34f542960" protocol=ttrpc version=3 Sep 9 05:02:49.979328 systemd[1]: Started cri-containerd-c755ec3858f3bcb973964f31d41595839ebec582be8143c962c77c21f445c1fc.scope - libcontainer container c755ec3858f3bcb973964f31d41595839ebec582be8143c962c77c21f445c1fc. Sep 9 05:02:49.983082 systemd[1]: Started cri-containerd-c1b664b2500c353b4928ee055353f289f3dd8ca5dc95ec446d2ad31f466fcce1.scope - libcontainer container c1b664b2500c353b4928ee055353f289f3dd8ca5dc95ec446d2ad31f466fcce1. Sep 9 05:02:49.984263 systemd[1]: Started cri-containerd-c519b22dcfd5d84ebda759c428905a9f2484803150731182f8feec6b84998765.scope - libcontainer container c519b22dcfd5d84ebda759c428905a9f2484803150731182f8feec6b84998765. 
Sep 9 05:02:50.032917 containerd[1498]: time="2025-09-09T05:02:50.031080041Z" level=info msg="StartContainer for \"c755ec3858f3bcb973964f31d41595839ebec582be8143c962c77c21f445c1fc\" returns successfully" Sep 9 05:02:50.038997 containerd[1498]: time="2025-09-09T05:02:50.038394224Z" level=info msg="StartContainer for \"c1b664b2500c353b4928ee055353f289f3dd8ca5dc95ec446d2ad31f466fcce1\" returns successfully" Sep 9 05:02:50.042082 containerd[1498]: time="2025-09-09T05:02:50.041987397Z" level=info msg="StartContainer for \"c519b22dcfd5d84ebda759c428905a9f2484803150731182f8feec6b84998765\" returns successfully" Sep 9 05:02:50.153710 kubelet[2284]: I0909 05:02:50.153680 2284 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 9 05:02:52.588781 kubelet[2284]: E0909 05:02:52.588735 2284 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 9 05:02:52.642887 kubelet[2284]: I0909 05:02:52.642804 2284 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Sep 9 05:02:52.718767 kubelet[2284]: E0909 05:02:52.718519 2284 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 9 05:02:53.156186 kubelet[2284]: I0909 05:02:53.156152 2284 apiserver.go:52] "Watching apiserver" Sep 9 05:02:53.161885 kubelet[2284]: I0909 05:02:53.161864 2284 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 9 05:02:54.730527 systemd[1]: Reload requested from client PID 2559 ('systemctl') (unit session-7.scope)... Sep 9 05:02:54.730544 systemd[1]: Reloading... Sep 9 05:02:54.796235 zram_generator::config[2602]: No configuration found. Sep 9 05:02:54.954664 systemd[1]: Reloading finished in 223 ms. Sep 9 05:02:54.993021 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:02:55.014304 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 05:02:55.017271 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:02:55.017333 systemd[1]: kubelet.service: Consumed 1.732s CPU time, 130.1M memory peak. Sep 9 05:02:55.019173 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:02:55.150643 (kubelet)[2644]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 05:02:55.155322 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:02:55.198039 kubelet[2644]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:02:55.198039 kubelet[2644]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 9 05:02:55.198039 kubelet[2644]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 9 05:02:55.198039 kubelet[2644]: I0909 05:02:55.197645 2644 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 05:02:55.204213 kubelet[2644]: I0909 05:02:55.203703 2644 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 9 05:02:55.204213 kubelet[2644]: I0909 05:02:55.203728 2644 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 05:02:55.204213 kubelet[2644]: I0909 05:02:55.204125 2644 server.go:934] "Client rotation is on, will bootstrap in background" Sep 9 05:02:55.207320 kubelet[2644]: I0909 05:02:55.207294 2644 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 9 05:02:55.211390 kubelet[2644]: I0909 05:02:55.211300 2644 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 05:02:55.215163 kubelet[2644]: I0909 05:02:55.215143 2644 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 05:02:55.217441 kubelet[2644]: I0909 05:02:55.217419 2644 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 9 05:02:55.217548 kubelet[2644]: I0909 05:02:55.217535 2644 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 9 05:02:55.217657 kubelet[2644]: I0909 05:02:55.217638 2644 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 05:02:55.217820 kubelet[2644]: I0909 05:02:55.217657 2644 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 05:02:55.217891 kubelet[2644]: I0909 05:02:55.217827 2644 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 05:02:55.217891 kubelet[2644]: I0909 05:02:55.217836 2644 container_manager_linux.go:300] "Creating device plugin manager" Sep 9 
05:02:55.217891 kubelet[2644]: I0909 05:02:55.217867 2644 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:02:55.218015 kubelet[2644]: I0909 05:02:55.218001 2644 kubelet.go:408] "Attempting to sync node with API server" Sep 9 05:02:55.218472 kubelet[2644]: I0909 05:02:55.218024 2644 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 05:02:55.218472 kubelet[2644]: I0909 05:02:55.218042 2644 kubelet.go:314] "Adding apiserver pod source" Sep 9 05:02:55.218472 kubelet[2644]: I0909 05:02:55.218055 2644 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 05:02:55.220470 kubelet[2644]: I0909 05:02:55.220454 2644 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 05:02:55.227206 kubelet[2644]: I0909 05:02:55.225955 2644 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 05:02:55.227206 kubelet[2644]: I0909 05:02:55.226587 2644 server.go:1274] "Started kubelet" Sep 9 05:02:55.228394 kubelet[2644]: I0909 05:02:55.228361 2644 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 05:02:55.229203 kubelet[2644]: I0909 05:02:55.228763 2644 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 05:02:55.229203 kubelet[2644]: I0909 05:02:55.228959 2644 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 05:02:55.229813 kubelet[2644]: I0909 05:02:55.229666 2644 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 05:02:55.231455 kubelet[2644]: I0909 05:02:55.231434 2644 server.go:449] "Adding debug handlers to kubelet server" Sep 9 05:02:55.231729 kubelet[2644]: I0909 05:02:55.231672 2644 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 05:02:55.232730 kubelet[2644]: E0909 05:02:55.232628 2644 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 05:02:55.232730 kubelet[2644]: E0909 05:02:55.232677 2644 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 05:02:55.232730 kubelet[2644]: I0909 05:02:55.232696 2644 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 9 05:02:55.232858 kubelet[2644]: I0909 05:02:55.232853 2644 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 9 05:02:55.233215 kubelet[2644]: I0909 05:02:55.232944 2644 reconciler.go:26] "Reconciler: start to sync state" Sep 9 05:02:55.236917 kubelet[2644]: I0909 05:02:55.236877 2644 factory.go:221] Registration of the systemd container factory successfully Sep 9 05:02:55.236985 kubelet[2644]: I0909 05:02:55.236969 2644 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 05:02:55.241512 kubelet[2644]: I0909 05:02:55.241481 2644 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 05:02:55.242552 kubelet[2644]: I0909 05:02:55.242535 2644 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 9 05:02:55.242659 kubelet[2644]: I0909 05:02:55.242648 2644 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 9 05:02:55.242753 kubelet[2644]: I0909 05:02:55.242736 2644 kubelet.go:2321] "Starting kubelet main sync loop" Sep 9 05:02:55.243358 kubelet[2644]: E0909 05:02:55.243287 2644 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 05:02:55.243422 kubelet[2644]: I0909 05:02:55.242879 2644 factory.go:221] Registration of the containerd container factory successfully Sep 9 05:02:55.281220 kubelet[2644]: I0909 05:02:55.280833 2644 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 9 05:02:55.281220 kubelet[2644]: I0909 05:02:55.280845 2644 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 9 05:02:55.281220 kubelet[2644]: I0909 05:02:55.280864 2644 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:02:55.281220 kubelet[2644]: I0909 05:02:55.280997 2644 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 9 05:02:55.281220 kubelet[2644]: I0909 05:02:55.281007 2644 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 9 05:02:55.281220 kubelet[2644]: I0909 05:02:55.281024 2644 policy_none.go:49] "None policy: Start" Sep 9 05:02:55.281559 kubelet[2644]: I0909 05:02:55.281481 2644 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 9 05:02:55.281559 kubelet[2644]: I0909 05:02:55.281499 2644 state_mem.go:35] "Initializing new in-memory state store" Sep 9 05:02:55.281639 kubelet[2644]: I0909 05:02:55.281616 2644 state_mem.go:75] "Updated machine memory state" Sep 9 05:02:55.287006 kubelet[2644]: I0909 05:02:55.286914 2644 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 05:02:55.287124 kubelet[2644]: I0909 05:02:55.287056 2644 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 05:02:55.287124 kubelet[2644]: I0909 05:02:55.287097 2644 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 05:02:55.288450 kubelet[2644]: I0909 05:02:55.288077 2644 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 05:02:55.389126 kubelet[2644]: I0909 05:02:55.389095 2644 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 9 05:02:55.395387 kubelet[2644]: I0909 05:02:55.395347 2644 kubelet_node_status.go:111] "Node was previously registered" node="localhost" Sep 9 05:02:55.395485 kubelet[2644]: I0909 05:02:55.395412 2644 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Sep 9 05:02:55.434017 kubelet[2644]: I0909 05:02:55.433966 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c33a320f4131fb8010d54cfd97eb9024-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"c33a320f4131fb8010d54cfd97eb9024\") " pod="kube-system/kube-apiserver-localhost" Sep 9 05:02:55.434017 kubelet[2644]: I0909 05:02:55.433994 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c33a320f4131fb8010d54cfd97eb9024-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"c33a320f4131fb8010d54cfd97eb9024\") " pod="kube-system/kube-apiserver-localhost" Sep 9 05:02:55.434017 kubelet[2644]: I0909 05:02:55.434014 2644 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c33a320f4131fb8010d54cfd97eb9024-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"c33a320f4131fb8010d54cfd97eb9024\") " pod="kube-system/kube-apiserver-localhost" Sep 9 05:02:55.434261 kubelet[2644]: I0909 05:02:55.434030 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:02:55.434261 kubelet[2644]: I0909 05:02:55.434046 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:02:55.434261 kubelet[2644]: I0909 05:02:55.434059 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:02:55.434261 kubelet[2644]: I0909 05:02:55.434135 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:02:55.434261 kubelet[2644]: I0909 05:02:55.434150 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 05:02:55.434384 kubelet[2644]: I0909 05:02:55.434164 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dc878868de11c6196259ae42039f4ff-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"5dc878868de11c6196259ae42039f4ff\") " pod="kube-system/kube-scheduler-localhost" Sep 9 05:02:56.219462 kubelet[2644]: I0909 05:02:56.219346 2644 apiserver.go:52] "Watching apiserver" Sep 9 05:02:56.233804 kubelet[2644]: I0909 05:02:56.233744 2644 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 9 05:02:56.265186 kubelet[2644]: I0909 05:02:56.265116 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.265087498 podStartE2EDuration="1.265087498s" podCreationTimestamp="2025-09-09 05:02:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:02:56.265039618 +0000 UTC m=+1.104255735" watchObservedRunningTime="2025-09-09 05:02:56.265087498 
+0000 UTC m=+1.104303615" Sep 9 05:02:56.274171 kubelet[2644]: I0909 05:02:56.273828 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.273813404 podStartE2EDuration="1.273813404s" podCreationTimestamp="2025-09-09 05:02:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:02:56.273484439 +0000 UTC m=+1.112700556" watchObservedRunningTime="2025-09-09 05:02:56.273813404 +0000 UTC m=+1.113029521" Sep 9 05:02:56.294536 kubelet[2644]: I0909 05:02:56.294481 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.294453109 podStartE2EDuration="1.294453109s" podCreationTimestamp="2025-09-09 05:02:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:02:56.286087729 +0000 UTC m=+1.125303846" watchObservedRunningTime="2025-09-09 05:02:56.294453109 +0000 UTC m=+1.133669226" Sep 9 05:03:00.106343 kubelet[2644]: I0909 05:03:00.106297 2644 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 05:03:00.106739 containerd[1498]: time="2025-09-09T05:03:00.106713996Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 9 05:03:00.106988 kubelet[2644]: I0909 05:03:00.106923 2644 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 05:03:00.850436 systemd[1]: Created slice kubepods-besteffort-pode2284d41_6247_4b51_8a02_fece064bd33e.slice - libcontainer container kubepods-besteffort-pode2284d41_6247_4b51_8a02_fece064bd33e.slice. 
Sep 9 05:03:00.870258 kubelet[2644]: I0909 05:03:00.870223 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/e2284d41-6247-4b51-8a02-fece064bd33e-kube-proxy\") pod \"kube-proxy-6ndfm\" (UID: \"e2284d41-6247-4b51-8a02-fece064bd33e\") " pod="kube-system/kube-proxy-6ndfm" Sep 9 05:03:00.870514 kubelet[2644]: I0909 05:03:00.870417 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e2284d41-6247-4b51-8a02-fece064bd33e-xtables-lock\") pod \"kube-proxy-6ndfm\" (UID: \"e2284d41-6247-4b51-8a02-fece064bd33e\") " pod="kube-system/kube-proxy-6ndfm" Sep 9 05:03:00.870514 kubelet[2644]: I0909 05:03:00.870445 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2284d41-6247-4b51-8a02-fece064bd33e-lib-modules\") pod \"kube-proxy-6ndfm\" (UID: \"e2284d41-6247-4b51-8a02-fece064bd33e\") " pod="kube-system/kube-proxy-6ndfm" Sep 9 05:03:00.870514 kubelet[2644]: I0909 05:03:00.870461 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr8jv\" (UniqueName: \"kubernetes.io/projected/e2284d41-6247-4b51-8a02-fece064bd33e-kube-api-access-xr8jv\") pod \"kube-proxy-6ndfm\" (UID: \"e2284d41-6247-4b51-8a02-fece064bd33e\") " pod="kube-system/kube-proxy-6ndfm" Sep 9 05:03:01.170474 containerd[1498]: time="2025-09-09T05:03:01.170180298Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6ndfm,Uid:e2284d41-6247-4b51-8a02-fece064bd33e,Namespace:kube-system,Attempt:0,}" Sep 9 05:03:01.205862 containerd[1498]: time="2025-09-09T05:03:01.205708428Z" level=info msg="connecting to shim e3c7e5829ba9492db92bb3a1e205167a845fb8b8bcfa2325cc2cb89e1181943c" address="unix:///run/containerd/s/1bf974614e3e1a232130134ff0d12b2f25d5d161efbec25dd9a3a949a23ae668" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:03:01.242390 systemd[1]: Started cri-containerd-e3c7e5829ba9492db92bb3a1e205167a845fb8b8bcfa2325cc2cb89e1181943c.scope - libcontainer container e3c7e5829ba9492db92bb3a1e205167a845fb8b8bcfa2325cc2cb89e1181943c. Sep 9 05:03:01.250104 systemd[1]: Created slice kubepods-besteffort-pod11568123_fd70_450f_81aa_46be5c8f3989.slice - libcontainer container kubepods-besteffort-pod11568123_fd70_450f_81aa_46be5c8f3989.slice. 
Sep 9 05:03:01.268065 containerd[1498]: time="2025-09-09T05:03:01.268028257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6ndfm,Uid:e2284d41-6247-4b51-8a02-fece064bd33e,Namespace:kube-system,Attempt:0,} returns sandbox id \"e3c7e5829ba9492db92bb3a1e205167a845fb8b8bcfa2325cc2cb89e1181943c\"" Sep 9 05:03:01.270922 containerd[1498]: time="2025-09-09T05:03:01.270884493Z" level=info msg="CreateContainer within sandbox \"e3c7e5829ba9492db92bb3a1e205167a845fb8b8bcfa2325cc2cb89e1181943c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 9 05:03:01.272457 kubelet[2644]: I0909 05:03:01.272431 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/11568123-fd70-450f-81aa-46be5c8f3989-var-lib-calico\") pod \"tigera-operator-58fc44c59b-8ls9b\" (UID: \"11568123-fd70-450f-81aa-46be5c8f3989\") " pod="tigera-operator/tigera-operator-58fc44c59b-8ls9b" Sep 9 05:03:01.272835 kubelet[2644]: I0909 05:03:01.272754 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn79v\" (UniqueName: \"kubernetes.io/projected/11568123-fd70-450f-81aa-46be5c8f3989-kube-api-access-dn79v\") pod \"tigera-operator-58fc44c59b-8ls9b\" (UID: \"11568123-fd70-450f-81aa-46be5c8f3989\") " pod="tigera-operator/tigera-operator-58fc44c59b-8ls9b" Sep 9 05:03:01.284214 containerd[1498]: time="2025-09-09T05:03:01.283681895Z" level=info msg="Container 92b4813d55211c27e4f8a7e5897ab61c386418792fe8bd1a01b050068f3289eb: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:01.290663 containerd[1498]: time="2025-09-09T05:03:01.290622543Z" level=info msg="CreateContainer within sandbox \"e3c7e5829ba9492db92bb3a1e205167a845fb8b8bcfa2325cc2cb89e1181943c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"92b4813d55211c27e4f8a7e5897ab61c386418792fe8bd1a01b050068f3289eb\"" Sep 9 05:03:01.291358 containerd[1498]: time="2025-09-09T05:03:01.291323192Z" level=info msg="StartContainer for \"92b4813d55211c27e4f8a7e5897ab61c386418792fe8bd1a01b050068f3289eb\"" Sep 9 05:03:01.292997 containerd[1498]: time="2025-09-09T05:03:01.292965853Z" level=info msg="connecting to shim 92b4813d55211c27e4f8a7e5897ab61c386418792fe8bd1a01b050068f3289eb" address="unix:///run/containerd/s/1bf974614e3e1a232130134ff0d12b2f25d5d161efbec25dd9a3a949a23ae668" protocol=ttrpc version=3 Sep 9 05:03:01.315411 systemd[1]: Started cri-containerd-92b4813d55211c27e4f8a7e5897ab61c386418792fe8bd1a01b050068f3289eb.scope - libcontainer container 92b4813d55211c27e4f8a7e5897ab61c386418792fe8bd1a01b050068f3289eb. 
Sep 9 05:03:01.348692 containerd[1498]: time="2025-09-09T05:03:01.348643398Z" level=info msg="StartContainer for \"92b4813d55211c27e4f8a7e5897ab61c386418792fe8bd1a01b050068f3289eb\" returns successfully" Sep 9 05:03:01.553355 containerd[1498]: time="2025-09-09T05:03:01.553313669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-8ls9b,Uid:11568123-fd70-450f-81aa-46be5c8f3989,Namespace:tigera-operator,Attempt:0,}" Sep 9 05:03:01.567706 containerd[1498]: time="2025-09-09T05:03:01.567664851Z" level=info msg="connecting to shim 98c01fd11a52f9d9b1aa192bb98495bd7cad326fa36dc01f9942ffa2bc0aa1e7" address="unix:///run/containerd/s/1863152e8826936c9ff2a1e2c2e9137cce4883a584cc3bf6722ea7fd4de44a17" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:03:01.592364 systemd[1]: Started cri-containerd-98c01fd11a52f9d9b1aa192bb98495bd7cad326fa36dc01f9942ffa2bc0aa1e7.scope - libcontainer container 98c01fd11a52f9d9b1aa192bb98495bd7cad326fa36dc01f9942ffa2bc0aa1e7. Sep 9 05:03:01.621496 containerd[1498]: time="2025-09-09T05:03:01.621459092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-8ls9b,Uid:11568123-fd70-450f-81aa-46be5c8f3989,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"98c01fd11a52f9d9b1aa192bb98495bd7cad326fa36dc01f9942ffa2bc0aa1e7\"" Sep 9 05:03:01.627239 containerd[1498]: time="2025-09-09T05:03:01.626357994Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 05:03:02.288572 kubelet[2644]: I0909 05:03:02.288519 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-6ndfm" podStartSLOduration=2.288501509 podStartE2EDuration="2.288501509s" podCreationTimestamp="2025-09-09 05:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:03:02.288499709 +0000 UTC m=+7.127715826" watchObservedRunningTime="2025-09-09 05:03:02.288501509 +0000 UTC m=+7.127717626" Sep 9 05:03:02.946584 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount848746715.mount: Deactivated successfully. 
Sep 9 05:03:03.368970 containerd[1498]: time="2025-09-09T05:03:03.368913648Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:03.369677 containerd[1498]: time="2025-09-09T05:03:03.369636256Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 9 05:03:03.370409 containerd[1498]: time="2025-09-09T05:03:03.370366745Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:03.372343 containerd[1498]: time="2025-09-09T05:03:03.372299647Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:03.373300 containerd[1498]: time="2025-09-09T05:03:03.373270498Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 1.746871023s" Sep 9 05:03:03.373344 containerd[1498]: time="2025-09-09T05:03:03.373303338Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 9 05:03:03.378034 containerd[1498]: time="2025-09-09T05:03:03.377996272Z" level=info msg="CreateContainer within sandbox \"98c01fd11a52f9d9b1aa192bb98495bd7cad326fa36dc01f9942ffa2bc0aa1e7\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 9 05:03:03.389001 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3895790729.mount: Deactivated successfully. Sep 9 05:03:03.389887 containerd[1498]: time="2025-09-09T05:03:03.389838206Z" level=info msg="Container 2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:03.395099 containerd[1498]: time="2025-09-09T05:03:03.395058666Z" level=info msg="CreateContainer within sandbox \"98c01fd11a52f9d9b1aa192bb98495bd7cad326fa36dc01f9942ffa2bc0aa1e7\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5\"" Sep 9 05:03:03.395553 containerd[1498]: time="2025-09-09T05:03:03.395527471Z" level=info msg="StartContainer for \"2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5\"" Sep 9 05:03:03.396694 containerd[1498]: time="2025-09-09T05:03:03.396661324Z" level=info msg="connecting to shim 2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5" address="unix:///run/containerd/s/1863152e8826936c9ff2a1e2c2e9137cce4883a584cc3bf6722ea7fd4de44a17" protocol=ttrpc version=3 Sep 9 05:03:03.423368 systemd[1]: Started cri-containerd-2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5.scope - libcontainer container 2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5. 
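The pull record above gives both the bytes read from the registry (22152365) and the elapsed time (1.746871023s), which together fix the effective pull rate, and it names the same image three ways: repo tag, repo digest, and image ID. A small Go sketch of the arithmetic; the MB/s figure is simply bytes divided by the logged wall time, not a claim about registry bandwidth:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Figures copied from the "stop pulling image" and "Pulled image" records above.
	bytesRead := 22152365.0
	elapsed := 1746871023 * time.Nanosecond // 1.746871023s

	rate := bytesRead / elapsed.Seconds()
	fmt.Printf("pulled %.0f bytes in %s (%.2f MB/s)\n", bytesRead, elapsed, rate/1e6)

	// The same image appears under three identifiers in the record:
	//   repo tag:    quay.io/tigera/operator:v1.38.6           (mutable, human-facing)
	//   repo digest: quay.io/tigera/operator@sha256:00a7a9b6...(content-addressed manifest)
	//   image id:    sha256:dd2e197838b0...                    (digest of the image config stored locally)
}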
Sep 9 05:03:03.451366 containerd[1498]: time="2025-09-09T05:03:03.451327586Z" level=info msg="StartContainer for \"2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5\" returns successfully" Sep 9 05:03:04.303682 kubelet[2644]: I0909 05:03:04.303569 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-8ls9b" podStartSLOduration=1.5531944069999999 podStartE2EDuration="3.30354551s" podCreationTimestamp="2025-09-09 05:03:01 +0000 UTC" firstStartedPulling="2025-09-09 05:03:01.625908349 +0000 UTC m=+6.465124466" lastFinishedPulling="2025-09-09 05:03:03.376259452 +0000 UTC m=+8.215475569" observedRunningTime="2025-09-09 05:03:04.303429389 +0000 UTC m=+9.142645506" watchObservedRunningTime="2025-09-09 05:03:04.30354551 +0000 UTC m=+9.142761627" Sep 9 05:03:05.470241 systemd[1]: cri-containerd-2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5.scope: Deactivated successfully. Sep 9 05:03:05.509340 containerd[1498]: time="2025-09-09T05:03:05.509289815Z" level=info msg="received exit event container_id:\"2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5\" id:\"2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5\" pid:2967 exit_status:1 exited_at:{seconds:1757394185 nanos:496108000}" Sep 9 05:03:05.509827 containerd[1498]: time="2025-09-09T05:03:05.509460497Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5\" id:\"2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5\" pid:2967 exit_status:1 exited_at:{seconds:1757394185 nanos:496108000}" Sep 9 05:03:05.604800 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5-rootfs.mount: Deactivated successfully. 
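The TaskExit event above reports the tigera-operator container's termination as raw Unix seconds and nanoseconds (exited_at:{seconds:1757394185 nanos:496108000}); converting them lands on the same 05:03:05.496 instant the surrounding journal timestamps show. A one-line Go conversion of those two fields:

package main

import (
	"fmt"
	"time"
)

func main() {
	// exit_status:1 with exited_at seconds/nanos taken from the TaskExit record above.
	exitedAt := time.Unix(1757394185, 496108000).UTC()
	fmt.Println(exitedAt) // 2025-09-09 05:03:05.496108 +0000 UTC
}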
Sep 9 05:03:06.294325 kubelet[2644]: I0909 05:03:06.294292 2644 scope.go:117] "RemoveContainer" containerID="2cbc603f605719e7dd25f80e5a4181e40e09b2a0081739b3372d36257eec80f5" Sep 9 05:03:06.298219 containerd[1498]: time="2025-09-09T05:03:06.298118593Z" level=info msg="CreateContainer within sandbox \"98c01fd11a52f9d9b1aa192bb98495bd7cad326fa36dc01f9942ffa2bc0aa1e7\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 9 05:03:06.307341 containerd[1498]: time="2025-09-09T05:03:06.307291843Z" level=info msg="Container fe878ef624183dd9cb67933c78587fefd1ef6521d493c2a9697f7bba556b4e49: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:06.314407 containerd[1498]: time="2025-09-09T05:03:06.314290551Z" level=info msg="CreateContainer within sandbox \"98c01fd11a52f9d9b1aa192bb98495bd7cad326fa36dc01f9942ffa2bc0aa1e7\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"fe878ef624183dd9cb67933c78587fefd1ef6521d493c2a9697f7bba556b4e49\"" Sep 9 05:03:06.315438 containerd[1498]: time="2025-09-09T05:03:06.315403762Z" level=info msg="StartContainer for \"fe878ef624183dd9cb67933c78587fefd1ef6521d493c2a9697f7bba556b4e49\"" Sep 9 05:03:06.316207 containerd[1498]: time="2025-09-09T05:03:06.316173289Z" level=info msg="connecting to shim fe878ef624183dd9cb67933c78587fefd1ef6521d493c2a9697f7bba556b4e49" address="unix:///run/containerd/s/1863152e8826936c9ff2a1e2c2e9137cce4883a584cc3bf6722ea7fd4de44a17" protocol=ttrpc version=3 Sep 9 05:03:06.349359 systemd[1]: Started cri-containerd-fe878ef624183dd9cb67933c78587fefd1ef6521d493c2a9697f7bba556b4e49.scope - libcontainer container fe878ef624183dd9cb67933c78587fefd1ef6521d493c2a9697f7bba556b4e49. Sep 9 05:03:06.382038 containerd[1498]: time="2025-09-09T05:03:06.381917290Z" level=info msg="StartContainer for \"fe878ef624183dd9cb67933c78587fefd1ef6521d493c2a9697f7bba556b4e49\" returns successfully" Sep 9 05:03:06.546510 update_engine[1483]: I20250909 05:03:06.545244 1483 update_attempter.cc:509] Updating boot flags... Sep 9 05:03:08.697423 sudo[1714]: pam_unix(sudo:session): session closed for user root Sep 9 05:03:08.699782 sshd[1713]: Connection closed by 10.0.0.1 port 34294 Sep 9 05:03:08.700406 sshd-session[1710]: pam_unix(sshd:session): session closed for user core Sep 9 05:03:08.704060 systemd[1]: sshd@6-10.0.0.90:22-10.0.0.1:34294.service: Deactivated successfully. Sep 9 05:03:08.705932 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 05:03:08.706102 systemd[1]: session-7.scope: Consumed 8.199s CPU time, 218.5M memory peak. Sep 9 05:03:08.707014 systemd-logind[1481]: Session 7 logged out. Waiting for processes to exit. Sep 9 05:03:08.708273 systemd-logind[1481]: Removed session 7. Sep 9 05:03:14.911107 systemd[1]: Created slice kubepods-besteffort-pod18d5f1dd_1c35_4c3c_a80d_533cda659d97.slice - libcontainer container kubepods-besteffort-pod18d5f1dd_1c35_4c3c_a80d_533cda659d97.slice. 
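The RemoveContainer line followed by a new tigera-operator container at Attempt:1 is the kubelet restarting the crashed container; the first restart lands almost immediately (the crash is at 05:03:05.496 and the new container is created at 05:03:06.3), while repeated failures back off. The schedule below is a sketch of the commonly documented kubelet defaults (10s base, doubling, 5-minute cap); only the first restart is actually observed in this log, so treat the later delays as assumptions:

package main

import (
	"fmt"
	"time"
)

// Assumed kubelet crash-loop backoff defaults; not taken from this log.
const (
	backoffBase = 10 * time.Second
	backoffCap  = 5 * time.Minute
)

// restartDelay returns the assumed wait before retry n of a repeatedly failing container.
func restartDelay(failures int) time.Duration {
	if failures <= 1 {
		return 0 // first retry goes through before any backoff entry exists, as seen above
	}
	d := backoffBase
	for i := 2; i < failures; i++ {
		d *= 2
		if d >= backoffCap {
			return backoffCap
		}
	}
	return d
}

func main() {
	for n := 1; n <= 7; n++ {
		fmt.Printf("after failure %d: wait %s\n", n, restartDelay(n))
	}
}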
Sep 9 05:03:14.963790 kubelet[2644]: I0909 05:03:14.963682 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwpj4\" (UniqueName: \"kubernetes.io/projected/18d5f1dd-1c35-4c3c-a80d-533cda659d97-kube-api-access-jwpj4\") pod \"calico-typha-58c9667bfb-bsq44\" (UID: \"18d5f1dd-1c35-4c3c-a80d-533cda659d97\") " pod="calico-system/calico-typha-58c9667bfb-bsq44" Sep 9 05:03:14.963790 kubelet[2644]: I0909 05:03:14.963751 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18d5f1dd-1c35-4c3c-a80d-533cda659d97-tigera-ca-bundle\") pod \"calico-typha-58c9667bfb-bsq44\" (UID: \"18d5f1dd-1c35-4c3c-a80d-533cda659d97\") " pod="calico-system/calico-typha-58c9667bfb-bsq44" Sep 9 05:03:14.963790 kubelet[2644]: I0909 05:03:14.963771 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/18d5f1dd-1c35-4c3c-a80d-533cda659d97-typha-certs\") pod \"calico-typha-58c9667bfb-bsq44\" (UID: \"18d5f1dd-1c35-4c3c-a80d-533cda659d97\") " pod="calico-system/calico-typha-58c9667bfb-bsq44" Sep 9 05:03:15.216176 containerd[1498]: time="2025-09-09T05:03:15.215700209Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-58c9667bfb-bsq44,Uid:18d5f1dd-1c35-4c3c-a80d-533cda659d97,Namespace:calico-system,Attempt:0,}" Sep 9 05:03:15.248934 containerd[1498]: time="2025-09-09T05:03:15.248840620Z" level=info msg="connecting to shim 16955feacb7b9b5b7b46bb74864f5d65d864efd9faa77ed16dc0c3b9d729c3c8" address="unix:///run/containerd/s/a4cfc1f6f34fe9caa1e570bb1846291ebe8a360523a89da2901bea22de54d786" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:03:15.290484 systemd[1]: Started cri-containerd-16955feacb7b9b5b7b46bb74864f5d65d864efd9faa77ed16dc0c3b9d729c3c8.scope - libcontainer container 16955feacb7b9b5b7b46bb74864f5d65d864efd9faa77ed16dc0c3b9d729c3c8. Sep 9 05:03:15.308467 systemd[1]: Created slice kubepods-besteffort-pod38b8dabe_feb6_4f97_acbb_2af75ae07894.slice - libcontainer container kubepods-besteffort-pod38b8dabe_feb6_4f97_acbb_2af75ae07894.slice. 
Sep 9 05:03:15.368691 kubelet[2644]: I0909 05:03:15.367754 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/38b8dabe-feb6-4f97-acbb-2af75ae07894-cni-bin-dir\") pod \"calico-node-nw9hr\" (UID: \"38b8dabe-feb6-4f97-acbb-2af75ae07894\") " pod="calico-system/calico-node-nw9hr" Sep 9 05:03:15.374936 kubelet[2644]: I0909 05:03:15.374660 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/38b8dabe-feb6-4f97-acbb-2af75ae07894-cni-log-dir\") pod \"calico-node-nw9hr\" (UID: \"38b8dabe-feb6-4f97-acbb-2af75ae07894\") " pod="calico-system/calico-node-nw9hr" Sep 9 05:03:15.374936 kubelet[2644]: I0909 05:03:15.374706 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/38b8dabe-feb6-4f97-acbb-2af75ae07894-node-certs\") pod \"calico-node-nw9hr\" (UID: \"38b8dabe-feb6-4f97-acbb-2af75ae07894\") " pod="calico-system/calico-node-nw9hr" Sep 9 05:03:15.374936 kubelet[2644]: I0909 05:03:15.374723 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38b8dabe-feb6-4f97-acbb-2af75ae07894-tigera-ca-bundle\") pod \"calico-node-nw9hr\" (UID: \"38b8dabe-feb6-4f97-acbb-2af75ae07894\") " pod="calico-system/calico-node-nw9hr" Sep 9 05:03:15.374936 kubelet[2644]: I0909 05:03:15.374739 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt64h\" (UniqueName: \"kubernetes.io/projected/38b8dabe-feb6-4f97-acbb-2af75ae07894-kube-api-access-jt64h\") pod \"calico-node-nw9hr\" (UID: \"38b8dabe-feb6-4f97-acbb-2af75ae07894\") " pod="calico-system/calico-node-nw9hr" Sep 9 05:03:15.374936 kubelet[2644]: I0909 05:03:15.374764 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/38b8dabe-feb6-4f97-acbb-2af75ae07894-flexvol-driver-host\") pod \"calico-node-nw9hr\" (UID: \"38b8dabe-feb6-4f97-acbb-2af75ae07894\") " pod="calico-system/calico-node-nw9hr" Sep 9 05:03:15.375326 kubelet[2644]: I0909 05:03:15.374780 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/38b8dabe-feb6-4f97-acbb-2af75ae07894-cni-net-dir\") pod \"calico-node-nw9hr\" (UID: \"38b8dabe-feb6-4f97-acbb-2af75ae07894\") " pod="calico-system/calico-node-nw9hr" Sep 9 05:03:15.375326 kubelet[2644]: I0909 05:03:15.374799 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/38b8dabe-feb6-4f97-acbb-2af75ae07894-var-run-calico\") pod \"calico-node-nw9hr\" (UID: \"38b8dabe-feb6-4f97-acbb-2af75ae07894\") " pod="calico-system/calico-node-nw9hr" Sep 9 05:03:15.375326 kubelet[2644]: I0909 05:03:15.374823 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/38b8dabe-feb6-4f97-acbb-2af75ae07894-var-lib-calico\") pod \"calico-node-nw9hr\" (UID: \"38b8dabe-feb6-4f97-acbb-2af75ae07894\") " pod="calico-system/calico-node-nw9hr" Sep 9 05:03:15.375326 kubelet[2644]: I0909 05:03:15.374840 2644 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/38b8dabe-feb6-4f97-acbb-2af75ae07894-xtables-lock\") pod \"calico-node-nw9hr\" (UID: \"38b8dabe-feb6-4f97-acbb-2af75ae07894\") " pod="calico-system/calico-node-nw9hr" Sep 9 05:03:15.375326 kubelet[2644]: I0909 05:03:15.374873 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/38b8dabe-feb6-4f97-acbb-2af75ae07894-policysync\") pod \"calico-node-nw9hr\" (UID: \"38b8dabe-feb6-4f97-acbb-2af75ae07894\") " pod="calico-system/calico-node-nw9hr" Sep 9 05:03:15.375439 kubelet[2644]: I0909 05:03:15.374897 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/38b8dabe-feb6-4f97-acbb-2af75ae07894-lib-modules\") pod \"calico-node-nw9hr\" (UID: \"38b8dabe-feb6-4f97-acbb-2af75ae07894\") " pod="calico-system/calico-node-nw9hr" Sep 9 05:03:15.411970 containerd[1498]: time="2025-09-09T05:03:15.411920299Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-58c9667bfb-bsq44,Uid:18d5f1dd-1c35-4c3c-a80d-533cda659d97,Namespace:calico-system,Attempt:0,} returns sandbox id \"16955feacb7b9b5b7b46bb74864f5d65d864efd9faa77ed16dc0c3b9d729c3c8\"" Sep 9 05:03:15.420804 containerd[1498]: time="2025-09-09T05:03:15.420765875Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 05:03:15.477285 kubelet[2644]: E0909 05:03:15.477017 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.477285 kubelet[2644]: W0909 05:03:15.477045 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.477285 kubelet[2644]: E0909 05:03:15.477077 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.482423 kubelet[2644]: E0909 05:03:15.482328 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.482423 kubelet[2644]: W0909 05:03:15.482355 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.482423 kubelet[2644]: E0909 05:03:15.482379 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.486624 kubelet[2644]: E0909 05:03:15.486589 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.486624 kubelet[2644]: W0909 05:03:15.486611 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.486624 kubelet[2644]: E0909 05:03:15.486627 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.578983 kubelet[2644]: E0909 05:03:15.578921 2644 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-z6qqc" podUID="a7ba1d3b-babe-4d10-8083-57470bbd8f30" Sep 9 05:03:15.619041 containerd[1498]: time="2025-09-09T05:03:15.618918456Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-nw9hr,Uid:38b8dabe-feb6-4f97-acbb-2af75ae07894,Namespace:calico-system,Attempt:0,}" Sep 9 05:03:15.641767 containerd[1498]: time="2025-09-09T05:03:15.641720882Z" level=info msg="connecting to shim 68f61b868d7e36753b9ca92035ef9f8d7a2342e7ebad6fea03b2e866fb1e377c" address="unix:///run/containerd/s/7a71b9ee8541ff8121720d83121f25ffe07e860fd42a1127011479b3a2d9e31f" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:03:15.657976 kubelet[2644]: E0909 05:03:15.657811 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.657976 kubelet[2644]: W0909 05:03:15.657847 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.657976 kubelet[2644]: E0909 05:03:15.657878 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.658155 kubelet[2644]: E0909 05:03:15.658088 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.658155 kubelet[2644]: W0909 05:03:15.658109 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.658155 kubelet[2644]: E0909 05:03:15.658120 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.658305 kubelet[2644]: E0909 05:03:15.658287 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.658305 kubelet[2644]: W0909 05:03:15.658298 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.658305 kubelet[2644]: E0909 05:03:15.658307 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.658485 kubelet[2644]: E0909 05:03:15.658470 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.658485 kubelet[2644]: W0909 05:03:15.658481 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.658539 kubelet[2644]: E0909 05:03:15.658491 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.658776 kubelet[2644]: E0909 05:03:15.658748 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.658776 kubelet[2644]: W0909 05:03:15.658762 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.658776 kubelet[2644]: E0909 05:03:15.658774 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.659166 kubelet[2644]: E0909 05:03:15.659111 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.659166 kubelet[2644]: W0909 05:03:15.659126 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.659166 kubelet[2644]: E0909 05:03:15.659137 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.659942 kubelet[2644]: E0909 05:03:15.659419 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.659942 kubelet[2644]: W0909 05:03:15.659429 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.659942 kubelet[2644]: E0909 05:03:15.659449 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.659942 kubelet[2644]: E0909 05:03:15.659588 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.659942 kubelet[2644]: W0909 05:03:15.659604 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.659942 kubelet[2644]: E0909 05:03:15.659612 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.659942 kubelet[2644]: E0909 05:03:15.659755 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.659942 kubelet[2644]: W0909 05:03:15.659763 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.659942 kubelet[2644]: E0909 05:03:15.659780 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.660135 kubelet[2644]: E0909 05:03:15.659953 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.660135 kubelet[2644]: W0909 05:03:15.659962 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.660135 kubelet[2644]: E0909 05:03:15.659971 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.660705 kubelet[2644]: E0909 05:03:15.660283 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.660705 kubelet[2644]: W0909 05:03:15.660293 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.660705 kubelet[2644]: E0909 05:03:15.660304 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.660705 kubelet[2644]: E0909 05:03:15.660649 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.660705 kubelet[2644]: W0909 05:03:15.660659 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.660705 kubelet[2644]: E0909 05:03:15.660669 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.661590 kubelet[2644]: E0909 05:03:15.660838 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.661590 kubelet[2644]: W0909 05:03:15.660850 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.661590 kubelet[2644]: E0909 05:03:15.660859 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.661590 kubelet[2644]: E0909 05:03:15.661062 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.661590 kubelet[2644]: W0909 05:03:15.661072 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.661590 kubelet[2644]: E0909 05:03:15.661081 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.661590 kubelet[2644]: E0909 05:03:15.661245 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.661590 kubelet[2644]: W0909 05:03:15.661254 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.661590 kubelet[2644]: E0909 05:03:15.661262 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.661590 kubelet[2644]: E0909 05:03:15.661409 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.661833 kubelet[2644]: W0909 05:03:15.661425 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.661833 kubelet[2644]: E0909 05:03:15.661433 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.661833 kubelet[2644]: E0909 05:03:15.661638 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.661833 kubelet[2644]: W0909 05:03:15.661653 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.661833 kubelet[2644]: E0909 05:03:15.661662 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.661833 kubelet[2644]: E0909 05:03:15.661804 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.661833 kubelet[2644]: W0909 05:03:15.661812 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.661833 kubelet[2644]: E0909 05:03:15.661821 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.662461 kubelet[2644]: E0909 05:03:15.662362 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.662461 kubelet[2644]: W0909 05:03:15.662447 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.662461 kubelet[2644]: E0909 05:03:15.662462 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.662679 kubelet[2644]: E0909 05:03:15.662659 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.662679 kubelet[2644]: W0909 05:03:15.662677 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.662735 kubelet[2644]: E0909 05:03:15.662687 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.677116 kubelet[2644]: E0909 05:03:15.677091 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.677116 kubelet[2644]: W0909 05:03:15.677111 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.677253 kubelet[2644]: E0909 05:03:15.677125 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.677253 kubelet[2644]: I0909 05:03:15.677153 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7ba1d3b-babe-4d10-8083-57470bbd8f30-kubelet-dir\") pod \"csi-node-driver-z6qqc\" (UID: \"a7ba1d3b-babe-4d10-8083-57470bbd8f30\") " pod="calico-system/csi-node-driver-z6qqc" Sep 9 05:03:15.677407 kubelet[2644]: E0909 05:03:15.677315 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.677407 kubelet[2644]: W0909 05:03:15.677326 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.677407 kubelet[2644]: E0909 05:03:15.677346 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.677407 kubelet[2644]: I0909 05:03:15.677363 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a7ba1d3b-babe-4d10-8083-57470bbd8f30-socket-dir\") pod \"csi-node-driver-z6qqc\" (UID: \"a7ba1d3b-babe-4d10-8083-57470bbd8f30\") " pod="calico-system/csi-node-driver-z6qqc" Sep 9 05:03:15.677760 kubelet[2644]: E0909 05:03:15.677542 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.677760 kubelet[2644]: W0909 05:03:15.677554 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.677760 kubelet[2644]: E0909 05:03:15.677575 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.677760 kubelet[2644]: I0909 05:03:15.677592 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbl8c\" (UniqueName: \"kubernetes.io/projected/a7ba1d3b-babe-4d10-8083-57470bbd8f30-kube-api-access-qbl8c\") pod \"csi-node-driver-z6qqc\" (UID: \"a7ba1d3b-babe-4d10-8083-57470bbd8f30\") " pod="calico-system/csi-node-driver-z6qqc" Sep 9 05:03:15.677760 kubelet[2644]: E0909 05:03:15.677762 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.677931 kubelet[2644]: W0909 05:03:15.677772 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.677931 kubelet[2644]: E0909 05:03:15.677786 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.677931 kubelet[2644]: I0909 05:03:15.677804 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a7ba1d3b-babe-4d10-8083-57470bbd8f30-registration-dir\") pod \"csi-node-driver-z6qqc\" (UID: \"a7ba1d3b-babe-4d10-8083-57470bbd8f30\") " pod="calico-system/csi-node-driver-z6qqc" Sep 9 05:03:15.677997 kubelet[2644]: E0909 05:03:15.677964 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.677997 kubelet[2644]: W0909 05:03:15.677977 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.677997 kubelet[2644]: E0909 05:03:15.677994 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.678064 kubelet[2644]: I0909 05:03:15.678009 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/a7ba1d3b-babe-4d10-8083-57470bbd8f30-varrun\") pod \"csi-node-driver-z6qqc\" (UID: \"a7ba1d3b-babe-4d10-8083-57470bbd8f30\") " pod="calico-system/csi-node-driver-z6qqc" Sep 9 05:03:15.678650 kubelet[2644]: E0909 05:03:15.678258 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.678650 kubelet[2644]: W0909 05:03:15.678276 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.678650 kubelet[2644]: E0909 05:03:15.678295 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.678650 kubelet[2644]: E0909 05:03:15.678445 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.678650 kubelet[2644]: W0909 05:03:15.678453 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.678650 kubelet[2644]: E0909 05:03:15.678493 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.678650 kubelet[2644]: E0909 05:03:15.678568 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.678650 kubelet[2644]: W0909 05:03:15.678575 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.678650 kubelet[2644]: E0909 05:03:15.678606 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.678432 systemd[1]: Started cri-containerd-68f61b868d7e36753b9ca92035ef9f8d7a2342e7ebad6fea03b2e866fb1e377c.scope - libcontainer container 68f61b868d7e36753b9ca92035ef9f8d7a2342e7ebad6fea03b2e866fb1e377c. Sep 9 05:03:15.679025 kubelet[2644]: E0909 05:03:15.678696 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.679025 kubelet[2644]: W0909 05:03:15.678702 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.679025 kubelet[2644]: E0909 05:03:15.678749 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.679025 kubelet[2644]: E0909 05:03:15.678850 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.679025 kubelet[2644]: W0909 05:03:15.678857 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.679025 kubelet[2644]: E0909 05:03:15.678960 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.679025 kubelet[2644]: E0909 05:03:15.678992 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.679025 kubelet[2644]: W0909 05:03:15.678999 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.679025 kubelet[2644]: E0909 05:03:15.679020 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.679581 kubelet[2644]: E0909 05:03:15.679154 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.679581 kubelet[2644]: W0909 05:03:15.679175 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.679581 kubelet[2644]: E0909 05:03:15.679186 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.679581 kubelet[2644]: E0909 05:03:15.679374 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.679581 kubelet[2644]: W0909 05:03:15.679382 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.679581 kubelet[2644]: E0909 05:03:15.679391 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.679581 kubelet[2644]: E0909 05:03:15.679546 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.679581 kubelet[2644]: W0909 05:03:15.679554 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.679581 kubelet[2644]: E0909 05:03:15.679562 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.680020 kubelet[2644]: E0909 05:03:15.680002 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.680020 kubelet[2644]: W0909 05:03:15.680017 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.680020 kubelet[2644]: E0909 05:03:15.680029 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.744033 containerd[1498]: time="2025-09-09T05:03:15.743877292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-nw9hr,Uid:38b8dabe-feb6-4f97-acbb-2af75ae07894,Namespace:calico-system,Attempt:0,} returns sandbox id \"68f61b868d7e36753b9ca92035ef9f8d7a2342e7ebad6fea03b2e866fb1e377c\"" Sep 9 05:03:15.778725 kubelet[2644]: E0909 05:03:15.778697 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.778725 kubelet[2644]: W0909 05:03:15.778719 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.778725 kubelet[2644]: E0909 05:03:15.778738 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.778935 kubelet[2644]: E0909 05:03:15.778925 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.778935 kubelet[2644]: W0909 05:03:15.778935 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.778979 kubelet[2644]: E0909 05:03:15.778944 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.779156 kubelet[2644]: E0909 05:03:15.779143 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.779156 kubelet[2644]: W0909 05:03:15.779157 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.779259 kubelet[2644]: E0909 05:03:15.779178 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.779375 kubelet[2644]: E0909 05:03:15.779357 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.779375 kubelet[2644]: W0909 05:03:15.779371 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.779481 kubelet[2644]: E0909 05:03:15.779385 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.779557 kubelet[2644]: E0909 05:03:15.779543 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.779557 kubelet[2644]: W0909 05:03:15.779555 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.779628 kubelet[2644]: E0909 05:03:15.779569 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.779724 kubelet[2644]: E0909 05:03:15.779713 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.779724 kubelet[2644]: W0909 05:03:15.779724 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.779789 kubelet[2644]: E0909 05:03:15.779737 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.779914 kubelet[2644]: E0909 05:03:15.779903 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.779914 kubelet[2644]: W0909 05:03:15.779914 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.779968 kubelet[2644]: E0909 05:03:15.779928 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.780083 kubelet[2644]: E0909 05:03:15.780072 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.780083 kubelet[2644]: W0909 05:03:15.780082 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.780158 kubelet[2644]: E0909 05:03:15.780094 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.780259 kubelet[2644]: E0909 05:03:15.780247 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.780259 kubelet[2644]: W0909 05:03:15.780259 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.780361 kubelet[2644]: E0909 05:03:15.780274 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.780437 kubelet[2644]: E0909 05:03:15.780401 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.780437 kubelet[2644]: W0909 05:03:15.780415 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.780593 kubelet[2644]: E0909 05:03:15.780470 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.780593 kubelet[2644]: E0909 05:03:15.780557 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.780593 kubelet[2644]: W0909 05:03:15.780566 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.780705 kubelet[2644]: E0909 05:03:15.780677 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.780705 kubelet[2644]: W0909 05:03:15.780693 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.780873 kubelet[2644]: E0909 05:03:15.780678 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.780873 kubelet[2644]: E0909 05:03:15.780764 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.780873 kubelet[2644]: E0909 05:03:15.780827 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.780873 kubelet[2644]: W0909 05:03:15.780847 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.780873 kubelet[2644]: E0909 05:03:15.780873 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.781028 kubelet[2644]: E0909 05:03:15.781017 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.781028 kubelet[2644]: W0909 05:03:15.781028 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.781109 kubelet[2644]: E0909 05:03:15.781047 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.781249 kubelet[2644]: E0909 05:03:15.781234 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.781294 kubelet[2644]: W0909 05:03:15.781249 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.781294 kubelet[2644]: E0909 05:03:15.781263 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.781597 kubelet[2644]: E0909 05:03:15.781580 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.781676 kubelet[2644]: W0909 05:03:15.781664 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.781815 kubelet[2644]: E0909 05:03:15.781741 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.782014 kubelet[2644]: E0909 05:03:15.781999 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.782087 kubelet[2644]: W0909 05:03:15.782075 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.782151 kubelet[2644]: E0909 05:03:15.782141 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.782387 kubelet[2644]: E0909 05:03:15.782368 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.782387 kubelet[2644]: W0909 05:03:15.782388 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.782510 kubelet[2644]: E0909 05:03:15.782406 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.782563 kubelet[2644]: E0909 05:03:15.782549 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.782563 kubelet[2644]: W0909 05:03:15.782560 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.782609 kubelet[2644]: E0909 05:03:15.782574 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.782738 kubelet[2644]: E0909 05:03:15.782725 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.782738 kubelet[2644]: W0909 05:03:15.782737 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.782847 kubelet[2644]: E0909 05:03:15.782818 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.782927 kubelet[2644]: E0909 05:03:15.782910 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.782927 kubelet[2644]: W0909 05:03:15.782922 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.782992 kubelet[2644]: E0909 05:03:15.782972 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.783091 kubelet[2644]: E0909 05:03:15.783078 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.783091 kubelet[2644]: W0909 05:03:15.783089 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.783166 kubelet[2644]: E0909 05:03:15.783103 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.783272 kubelet[2644]: E0909 05:03:15.783260 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.783307 kubelet[2644]: W0909 05:03:15.783272 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.783307 kubelet[2644]: E0909 05:03:15.783285 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:15.783473 kubelet[2644]: E0909 05:03:15.783461 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.783473 kubelet[2644]: W0909 05:03:15.783472 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.783539 kubelet[2644]: E0909 05:03:15.783485 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.783658 kubelet[2644]: E0909 05:03:15.783646 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.783688 kubelet[2644]: W0909 05:03:15.783659 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.783688 kubelet[2644]: E0909 05:03:15.783669 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:15.794078 kubelet[2644]: E0909 05:03:15.794051 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:15.794603 kubelet[2644]: W0909 05:03:15.794371 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:15.794603 kubelet[2644]: E0909 05:03:15.794402 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:16.443806 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3047087415.mount: Deactivated successfully. 
Sep 9 05:03:17.243817 kubelet[2644]: E0909 05:03:17.243767 2644 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-z6qqc" podUID="a7ba1d3b-babe-4d10-8083-57470bbd8f30" Sep 9 05:03:17.643084 containerd[1498]: time="2025-09-09T05:03:17.642606260Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:17.643905 containerd[1498]: time="2025-09-09T05:03:17.643848947Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 9 05:03:17.646718 containerd[1498]: time="2025-09-09T05:03:17.646671284Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:17.649255 containerd[1498]: time="2025-09-09T05:03:17.649001137Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:17.650105 containerd[1498]: time="2025-09-09T05:03:17.650041584Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.229236389s" Sep 9 05:03:17.650105 containerd[1498]: time="2025-09-09T05:03:17.650073344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 9 05:03:17.656745 containerd[1498]: time="2025-09-09T05:03:17.656687222Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 05:03:17.680784 containerd[1498]: time="2025-09-09T05:03:17.680747963Z" level=info msg="CreateContainer within sandbox \"16955feacb7b9b5b7b46bb74864f5d65d864efd9faa77ed16dc0c3b9d729c3c8\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 05:03:17.688392 containerd[1498]: time="2025-09-09T05:03:17.688349528Z" level=info msg="Container bf7de92c98bed0b6a897fcfc74f460c50d5a28f501a71b4cdfeb58d4c8f3e04b: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:17.691076 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3923088328.mount: Deactivated successfully. 
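The typha image pull recorded above reports both a byte count ("bytes read=33105775") and a wall-clock duration ("in 2.229236389s"), which is enough for a rough throughput estimate. The sketch below only reworks figures already present in the log and assumes the reported byte count covers the whole transfer; it is a back-of-the-envelope check, not anything containerd computes itself.

package main

import "fmt"

func main() {
	// Numbers taken verbatim from the containerd log lines above.
	const (
		bytesRead   = 33105775    // "active requests=0, bytes read=33105775"
		pullSeconds = 2.229236389 // "in 2.229236389s"
	)
	rate := float64(bytesRead) / pullSeconds
	fmt.Printf("effective pull rate: %.1f MB/s (%.1f MiB/s)\n",
		rate/1e6, rate/(1024*1024))
	// Prints roughly 14.9 MB/s (14.2 MiB/s) for the ~33 MB typha image.
}

The 2.23 s figure is also consistent with the firstStartedPulling/lastFinishedPulling timestamps recorded for the same pod a little further on (05:03:15.418 to 05:03:17.656, about 2.24 s).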
Sep 9 05:03:17.697670 containerd[1498]: time="2025-09-09T05:03:17.697609462Z" level=info msg="CreateContainer within sandbox \"16955feacb7b9b5b7b46bb74864f5d65d864efd9faa77ed16dc0c3b9d729c3c8\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"bf7de92c98bed0b6a897fcfc74f460c50d5a28f501a71b4cdfeb58d4c8f3e04b\"" Sep 9 05:03:17.699387 containerd[1498]: time="2025-09-09T05:03:17.699352112Z" level=info msg="StartContainer for \"bf7de92c98bed0b6a897fcfc74f460c50d5a28f501a71b4cdfeb58d4c8f3e04b\"" Sep 9 05:03:17.700519 containerd[1498]: time="2025-09-09T05:03:17.700475078Z" level=info msg="connecting to shim bf7de92c98bed0b6a897fcfc74f460c50d5a28f501a71b4cdfeb58d4c8f3e04b" address="unix:///run/containerd/s/a4cfc1f6f34fe9caa1e570bb1846291ebe8a360523a89da2901bea22de54d786" protocol=ttrpc version=3 Sep 9 05:03:17.725361 systemd[1]: Started cri-containerd-bf7de92c98bed0b6a897fcfc74f460c50d5a28f501a71b4cdfeb58d4c8f3e04b.scope - libcontainer container bf7de92c98bed0b6a897fcfc74f460c50d5a28f501a71b4cdfeb58d4c8f3e04b. Sep 9 05:03:17.764568 containerd[1498]: time="2025-09-09T05:03:17.764531733Z" level=info msg="StartContainer for \"bf7de92c98bed0b6a897fcfc74f460c50d5a28f501a71b4cdfeb58d4c8f3e04b\" returns successfully" Sep 9 05:03:18.335696 kubelet[2644]: I0909 05:03:18.335607 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-58c9667bfb-bsq44" podStartSLOduration=2.097657673 podStartE2EDuration="4.335590394s" podCreationTimestamp="2025-09-09 05:03:14 +0000 UTC" firstStartedPulling="2025-09-09 05:03:15.41845998 +0000 UTC m=+20.257676097" lastFinishedPulling="2025-09-09 05:03:17.656392701 +0000 UTC m=+22.495608818" observedRunningTime="2025-09-09 05:03:18.33500899 +0000 UTC m=+23.174225067" watchObservedRunningTime="2025-09-09 05:03:18.335590394 +0000 UTC m=+23.174806511" Sep 9 05:03:18.384688 kubelet[2644]: E0909 05:03:18.384574 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.384688 kubelet[2644]: W0909 05:03:18.384596 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.384688 kubelet[2644]: E0909 05:03:18.384616 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.385041 kubelet[2644]: E0909 05:03:18.384931 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.385041 kubelet[2644]: W0909 05:03:18.384945 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.385041 kubelet[2644]: E0909 05:03:18.384955 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:18.385329 kubelet[2644]: E0909 05:03:18.385223 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.385329 kubelet[2644]: W0909 05:03:18.385235 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.385329 kubelet[2644]: E0909 05:03:18.385248 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.385496 kubelet[2644]: E0909 05:03:18.385484 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.385550 kubelet[2644]: W0909 05:03:18.385540 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.385603 kubelet[2644]: E0909 05:03:18.385593 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.385914 kubelet[2644]: E0909 05:03:18.385807 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.385914 kubelet[2644]: W0909 05:03:18.385819 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.385914 kubelet[2644]: E0909 05:03:18.385828 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.386081 kubelet[2644]: E0909 05:03:18.386068 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.386135 kubelet[2644]: W0909 05:03:18.386125 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.386217 kubelet[2644]: E0909 05:03:18.386181 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.386513 kubelet[2644]: E0909 05:03:18.386413 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.386513 kubelet[2644]: W0909 05:03:18.386425 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.386513 kubelet[2644]: E0909 05:03:18.386435 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:18.386685 kubelet[2644]: E0909 05:03:18.386673 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.386844 kubelet[2644]: W0909 05:03:18.386738 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.386844 kubelet[2644]: E0909 05:03:18.386756 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.386997 kubelet[2644]: E0909 05:03:18.386984 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.387141 kubelet[2644]: W0909 05:03:18.387048 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.387141 kubelet[2644]: E0909 05:03:18.387063 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.387297 kubelet[2644]: E0909 05:03:18.387285 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.387438 kubelet[2644]: W0909 05:03:18.387349 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.387438 kubelet[2644]: E0909 05:03:18.387363 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.387562 kubelet[2644]: E0909 05:03:18.387551 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.387707 kubelet[2644]: W0909 05:03:18.387610 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.387707 kubelet[2644]: E0909 05:03:18.387624 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.387835 kubelet[2644]: E0909 05:03:18.387824 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.387901 kubelet[2644]: W0909 05:03:18.387890 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.388040 kubelet[2644]: E0909 05:03:18.387948 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:18.388147 kubelet[2644]: E0909 05:03:18.388136 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.388298 kubelet[2644]: W0909 05:03:18.388208 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.388298 kubelet[2644]: E0909 05:03:18.388224 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.388428 kubelet[2644]: E0909 05:03:18.388416 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.388570 kubelet[2644]: W0909 05:03:18.388477 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.388570 kubelet[2644]: E0909 05:03:18.388491 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.388693 kubelet[2644]: E0909 05:03:18.388682 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.388816 kubelet[2644]: W0909 05:03:18.388753 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.388816 kubelet[2644]: E0909 05:03:18.388767 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.404514 kubelet[2644]: E0909 05:03:18.404492 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.404514 kubelet[2644]: W0909 05:03:18.404510 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.404627 kubelet[2644]: E0909 05:03:18.404524 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.404722 kubelet[2644]: E0909 05:03:18.404711 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.404758 kubelet[2644]: W0909 05:03:18.404722 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.404758 kubelet[2644]: E0909 05:03:18.404735 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:18.404910 kubelet[2644]: E0909 05:03:18.404897 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.404910 kubelet[2644]: W0909 05:03:18.404909 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.404968 kubelet[2644]: E0909 05:03:18.404925 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.405157 kubelet[2644]: E0909 05:03:18.405126 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.405157 kubelet[2644]: W0909 05:03:18.405139 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.405157 kubelet[2644]: E0909 05:03:18.405152 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.405318 kubelet[2644]: E0909 05:03:18.405306 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.405318 kubelet[2644]: W0909 05:03:18.405317 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.405374 kubelet[2644]: E0909 05:03:18.405330 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.405471 kubelet[2644]: E0909 05:03:18.405452 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.405471 kubelet[2644]: W0909 05:03:18.405465 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.405471 kubelet[2644]: E0909 05:03:18.405476 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.405738 kubelet[2644]: E0909 05:03:18.405724 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.405795 kubelet[2644]: W0909 05:03:18.405783 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.405870 kubelet[2644]: E0909 05:03:18.405850 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:18.406010 kubelet[2644]: E0909 05:03:18.405992 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.406010 kubelet[2644]: W0909 05:03:18.406003 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.406010 kubelet[2644]: E0909 05:03:18.406016 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.406299 kubelet[2644]: E0909 05:03:18.406285 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.406356 kubelet[2644]: W0909 05:03:18.406345 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.406419 kubelet[2644]: E0909 05:03:18.406408 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.406622 kubelet[2644]: E0909 05:03:18.406595 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.406622 kubelet[2644]: W0909 05:03:18.406606 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.406622 kubelet[2644]: E0909 05:03:18.406619 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.406743 kubelet[2644]: E0909 05:03:18.406733 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.406743 kubelet[2644]: W0909 05:03:18.406742 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.406793 kubelet[2644]: E0909 05:03:18.406754 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.406929 kubelet[2644]: E0909 05:03:18.406915 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.406929 kubelet[2644]: W0909 05:03:18.406927 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.406995 kubelet[2644]: E0909 05:03:18.406938 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:18.407293 kubelet[2644]: E0909 05:03:18.407171 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.407293 kubelet[2644]: W0909 05:03:18.407187 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.407293 kubelet[2644]: E0909 05:03:18.407221 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.407454 kubelet[2644]: E0909 05:03:18.407442 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.407511 kubelet[2644]: W0909 05:03:18.407501 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.407573 kubelet[2644]: E0909 05:03:18.407562 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.407776 kubelet[2644]: E0909 05:03:18.407764 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.407837 kubelet[2644]: W0909 05:03:18.407826 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.407975 kubelet[2644]: E0909 05:03:18.407939 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.408067 kubelet[2644]: E0909 05:03:18.408055 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.408128 kubelet[2644]: W0909 05:03:18.408116 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.408220 kubelet[2644]: E0909 05:03:18.408190 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.408428 kubelet[2644]: E0909 05:03:18.408414 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.408493 kubelet[2644]: W0909 05:03:18.408482 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.408558 kubelet[2644]: E0909 05:03:18.408548 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:03:18.408790 kubelet[2644]: E0909 05:03:18.408750 2644 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:03:18.408790 kubelet[2644]: W0909 05:03:18.408770 2644 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:03:18.408790 kubelet[2644]: E0909 05:03:18.408783 2644 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:03:18.819363 containerd[1498]: time="2025-09-09T05:03:18.819290468Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:18.820157 containerd[1498]: time="2025-09-09T05:03:18.820123793Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 9 05:03:18.822283 containerd[1498]: time="2025-09-09T05:03:18.821825962Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:18.824423 containerd[1498]: time="2025-09-09T05:03:18.824391537Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:18.825286 containerd[1498]: time="2025-09-09T05:03:18.825039100Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.168313197s" Sep 9 05:03:18.825286 containerd[1498]: time="2025-09-09T05:03:18.825075821Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 9 05:03:18.828003 containerd[1498]: time="2025-09-09T05:03:18.827962117Z" level=info msg="CreateContainer within sandbox \"68f61b868d7e36753b9ca92035ef9f8d7a2342e7ebad6fea03b2e866fb1e377c\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 05:03:18.843475 containerd[1498]: time="2025-09-09T05:03:18.843171722Z" level=info msg="Container d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:18.850545 containerd[1498]: time="2025-09-09T05:03:18.850506083Z" level=info msg="CreateContainer within sandbox \"68f61b868d7e36753b9ca92035ef9f8d7a2342e7ebad6fea03b2e866fb1e377c\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87\"" Sep 9 05:03:18.851400 containerd[1498]: time="2025-09-09T05:03:18.851258248Z" level=info msg="StartContainer for \"d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87\"" Sep 9 05:03:18.852940 containerd[1498]: time="2025-09-09T05:03:18.852909817Z" level=info msg="connecting to shim 
d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87" address="unix:///run/containerd/s/7a71b9ee8541ff8121720d83121f25ffe07e860fd42a1127011479b3a2d9e31f" protocol=ttrpc version=3 Sep 9 05:03:18.872354 systemd[1]: Started cri-containerd-d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87.scope - libcontainer container d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87. Sep 9 05:03:18.911378 containerd[1498]: time="2025-09-09T05:03:18.911329065Z" level=info msg="StartContainer for \"d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87\" returns successfully" Sep 9 05:03:18.923959 systemd[1]: cri-containerd-d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87.scope: Deactivated successfully. Sep 9 05:03:18.926002 containerd[1498]: time="2025-09-09T05:03:18.925964507Z" level=info msg="received exit event container_id:\"d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87\" id:\"d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87\" pid:3386 exited_at:{seconds:1757394198 nanos:925726226}" Sep 9 05:03:18.926086 containerd[1498]: time="2025-09-09T05:03:18.926045907Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87\" id:\"d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87\" pid:3386 exited_at:{seconds:1757394198 nanos:925726226}" Sep 9 05:03:18.945530 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d7f8879595d80b379113f1110fe18a52d21c21d58781f8c413edbea3e5adee87-rootfs.mount: Deactivated successfully. Sep 9 05:03:19.244437 kubelet[2644]: E0909 05:03:19.244343 2644 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-z6qqc" podUID="a7ba1d3b-babe-4d10-8083-57470bbd8f30" Sep 9 05:03:19.328860 kubelet[2644]: I0909 05:03:19.328830 2644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:03:19.330396 containerd[1498]: time="2025-09-09T05:03:19.330363944Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 05:03:21.244244 kubelet[2644]: E0909 05:03:21.244157 2644 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-z6qqc" podUID="a7ba1d3b-babe-4d10-8083-57470bbd8f30" Sep 9 05:03:21.902995 containerd[1498]: time="2025-09-09T05:03:21.902924164Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:21.904433 containerd[1498]: time="2025-09-09T05:03:21.904219971Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 9 05:03:21.905235 containerd[1498]: time="2025-09-09T05:03:21.905209176Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:21.907316 containerd[1498]: time="2025-09-09T05:03:21.907280026Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:21.908262 containerd[1498]: time="2025-09-09T05:03:21.907979710Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.577575285s" Sep 9 05:03:21.908262 containerd[1498]: time="2025-09-09T05:03:21.908008670Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 9 05:03:21.910225 containerd[1498]: time="2025-09-09T05:03:21.910158000Z" level=info msg="CreateContainer within sandbox \"68f61b868d7e36753b9ca92035ef9f8d7a2342e7ebad6fea03b2e866fb1e377c\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 05:03:21.919262 containerd[1498]: time="2025-09-09T05:03:21.918151760Z" level=info msg="Container 7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:21.926099 containerd[1498]: time="2025-09-09T05:03:21.926052880Z" level=info msg="CreateContainer within sandbox \"68f61b868d7e36753b9ca92035ef9f8d7a2342e7ebad6fea03b2e866fb1e377c\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a\"" Sep 9 05:03:21.926665 containerd[1498]: time="2025-09-09T05:03:21.926576082Z" level=info msg="StartContainer for \"7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a\"" Sep 9 05:03:21.929623 containerd[1498]: time="2025-09-09T05:03:21.929571857Z" level=info msg="connecting to shim 7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a" address="unix:///run/containerd/s/7a71b9ee8541ff8121720d83121f25ffe07e860fd42a1127011479b3a2d9e31f" protocol=ttrpc version=3 Sep 9 05:03:21.954375 systemd[1]: Started cri-containerd-7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a.scope - libcontainer container 7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a. Sep 9 05:03:21.992738 containerd[1498]: time="2025-09-09T05:03:21.992669572Z" level=info msg="StartContainer for \"7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a\" returns successfully" Sep 9 05:03:22.668354 systemd[1]: cri-containerd-7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a.scope: Deactivated successfully. Sep 9 05:03:22.669592 containerd[1498]: time="2025-09-09T05:03:22.668938225Z" level=info msg="received exit event container_id:\"7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a\" id:\"7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a\" pid:3450 exited_at:{seconds:1757394202 nanos:668713464}" Sep 9 05:03:22.669592 containerd[1498]: time="2025-09-09T05:03:22.669017666Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a\" id:\"7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a\" pid:3450 exited_at:{seconds:1757394202 nanos:668713464}" Sep 9 05:03:22.668841 systemd[1]: cri-containerd-7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a.scope: Consumed 451ms CPU time, 177.2M memory peak, 2.3M read from disk, 165.8M written to disk. 
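Annotation: the FlexVolume error burst earlier in this log comes from kubelet's plugin prober. It walks /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, runs each driver it finds with the single argument init, and expects a JSON status object on stdout. The nodeagent~uds/uds binary does not exist yet (installing it is the job of the flexvol-driver container created above from the pod2daemon-flexvol image), so every probe returns empty output and the JSON decode fails with "unexpected end of JSON input". The sketch below is a minimal reconstruction of that call path, not kubelet's actual driver-call.go; the driverStatus type and callDriver helper are illustrative names.

// flexprobe.go - minimal sketch of the FlexVolume "init" probe seen in the
// log above: run the driver, then decode a JSON status object from stdout.
// Empty stdout (driver binary not installed yet) makes json.Unmarshal fail
// with "unexpected end of JSON input", the error kubelet keeps logging until
// Calico's pod2daemon-flexvol container drops the uds binary into place.
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus mirrors the shape of a FlexVolume "init" reply, e.g.
// {"status":"Success","capabilities":{"attach":false}}. The field set is
// illustrative, not kubelet's internal type.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

// callDriver runs "<driver> init", reports a failed call, and then still
// tries to decode whatever output came back, as the prober does.
func callDriver(driver string) (*driverStatus, error) {
	out, err := exec.Command(driver, "init").Output()
	if err != nil {
		fmt.Printf("driver call failed: %v, output: %q\n", err, string(out))
	}
	var st driverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		return nil, fmt.Errorf("failed to unmarshal output for command: init, output: %q, error: %w", string(out), err)
	}
	return &st, nil
}

func main() {
	// Path taken from the log; on a node without the Calico flexvol driver
	// installed this reproduces the "unexpected end of JSON input" failure.
	if _, err := callDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"); err != nil {
		fmt.Println(err)
	}
}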
Sep 9 05:03:22.688358 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7e3d850f1af1d019117c7f1e83448d6f2fc503a855a0b932da1e65b1754cec5a-rootfs.mount: Deactivated successfully. Sep 9 05:03:22.704803 kubelet[2644]: I0909 05:03:22.704632 2644 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 9 05:03:22.811441 systemd[1]: Created slice kubepods-besteffort-pod950a549d_a68c_4846_bd7a_40dfe0e229ea.slice - libcontainer container kubepods-besteffort-pod950a549d_a68c_4846_bd7a_40dfe0e229ea.slice. Sep 9 05:03:22.821038 systemd[1]: Created slice kubepods-burstable-pod47c42195_8a89_46d2_89a4_d9b91ec95439.slice - libcontainer container kubepods-burstable-pod47c42195_8a89_46d2_89a4_d9b91ec95439.slice. Sep 9 05:03:22.831633 systemd[1]: Created slice kubepods-besteffort-pod0fadd76c_7e28_42cc_9fa8_a89467a3987a.slice - libcontainer container kubepods-besteffort-pod0fadd76c_7e28_42cc_9fa8_a89467a3987a.slice. Sep 9 05:03:22.841682 kubelet[2644]: I0909 05:03:22.841577 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4-whisker-backend-key-pair\") pod \"whisker-659b96fb96-4mphx\" (UID: \"a1bf6ae3-ed8a-46d3-85b0-a85c680283d4\") " pod="calico-system/whisker-659b96fb96-4mphx" Sep 9 05:03:22.841923 kubelet[2644]: I0909 05:03:22.841739 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdsxd\" (UniqueName: \"kubernetes.io/projected/0fadd76c-7e28-42cc-9fa8-a89467a3987a-kube-api-access-vdsxd\") pod \"calico-apiserver-5878b694cc-xhqwp\" (UID: \"0fadd76c-7e28-42cc-9fa8-a89467a3987a\") " pod="calico-apiserver/calico-apiserver-5878b694cc-xhqwp" Sep 9 05:03:22.841923 kubelet[2644]: I0909 05:03:22.841870 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfqzv\" (UniqueName: \"kubernetes.io/projected/950a549d-a68c-4846-bd7a-40dfe0e229ea-kube-api-access-kfqzv\") pod \"goldmane-7988f88666-mwh5j\" (UID: \"950a549d-a68c-4846-bd7a-40dfe0e229ea\") " pod="calico-system/goldmane-7988f88666-mwh5j" Sep 9 05:03:22.842034 kubelet[2644]: I0909 05:03:22.841900 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8jkz\" (UniqueName: \"kubernetes.io/projected/47c42195-8a89-46d2-89a4-d9b91ec95439-kube-api-access-k8jkz\") pod \"coredns-7c65d6cfc9-wgbtl\" (UID: \"47c42195-8a89-46d2-89a4-d9b91ec95439\") " pod="kube-system/coredns-7c65d6cfc9-wgbtl" Sep 9 05:03:22.842071 kubelet[2644]: I0909 05:03:22.842048 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4-whisker-ca-bundle\") pod \"whisker-659b96fb96-4mphx\" (UID: \"a1bf6ae3-ed8a-46d3-85b0-a85c680283d4\") " pod="calico-system/whisker-659b96fb96-4mphx" Sep 9 05:03:22.842206 kubelet[2644]: I0909 05:03:22.842177 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a77ad90-2494-4662-8595-2d49ea8185e1-config-volume\") pod \"coredns-7c65d6cfc9-8972p\" (UID: \"9a77ad90-2494-4662-8595-2d49ea8185e1\") " pod="kube-system/coredns-7c65d6cfc9-8972p" Sep 9 05:03:22.846958 kubelet[2644]: I0909 05:03:22.846908 2644 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0fadd76c-7e28-42cc-9fa8-a89467a3987a-calico-apiserver-certs\") pod \"calico-apiserver-5878b694cc-xhqwp\" (UID: \"0fadd76c-7e28-42cc-9fa8-a89467a3987a\") " pod="calico-apiserver/calico-apiserver-5878b694cc-xhqwp" Sep 9 05:03:22.846958 kubelet[2644]: I0909 05:03:22.846954 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47c42195-8a89-46d2-89a4-d9b91ec95439-config-volume\") pod \"coredns-7c65d6cfc9-wgbtl\" (UID: \"47c42195-8a89-46d2-89a4-d9b91ec95439\") " pod="kube-system/coredns-7c65d6cfc9-wgbtl" Sep 9 05:03:22.847189 kubelet[2644]: I0909 05:03:22.846981 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/950a549d-a68c-4846-bd7a-40dfe0e229ea-goldmane-ca-bundle\") pod \"goldmane-7988f88666-mwh5j\" (UID: \"950a549d-a68c-4846-bd7a-40dfe0e229ea\") " pod="calico-system/goldmane-7988f88666-mwh5j" Sep 9 05:03:22.847236 kubelet[2644]: I0909 05:03:22.847226 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b0e4dd1b-c7c3-4572-9518-fc34d9799979-calico-apiserver-certs\") pod \"calico-apiserver-5878b694cc-lzmv7\" (UID: \"b0e4dd1b-c7c3-4572-9518-fc34d9799979\") " pod="calico-apiserver/calico-apiserver-5878b694cc-lzmv7" Sep 9 05:03:22.847263 kubelet[2644]: I0909 05:03:22.847253 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/950a549d-a68c-4846-bd7a-40dfe0e229ea-config\") pod \"goldmane-7988f88666-mwh5j\" (UID: \"950a549d-a68c-4846-bd7a-40dfe0e229ea\") " pod="calico-system/goldmane-7988f88666-mwh5j" Sep 9 05:03:22.847292 kubelet[2644]: I0909 05:03:22.847271 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2dmd\" (UniqueName: \"kubernetes.io/projected/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4-kube-api-access-r2dmd\") pod \"whisker-659b96fb96-4mphx\" (UID: \"a1bf6ae3-ed8a-46d3-85b0-a85c680283d4\") " pod="calico-system/whisker-659b96fb96-4mphx" Sep 9 05:03:22.847316 kubelet[2644]: I0909 05:03:22.847287 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49qzk\" (UniqueName: \"kubernetes.io/projected/9a77ad90-2494-4662-8595-2d49ea8185e1-kube-api-access-49qzk\") pod \"coredns-7c65d6cfc9-8972p\" (UID: \"9a77ad90-2494-4662-8595-2d49ea8185e1\") " pod="kube-system/coredns-7c65d6cfc9-8972p" Sep 9 05:03:22.847340 kubelet[2644]: I0909 05:03:22.847313 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdwt2\" (UniqueName: \"kubernetes.io/projected/b0e4dd1b-c7c3-4572-9518-fc34d9799979-kube-api-access-gdwt2\") pod \"calico-apiserver-5878b694cc-lzmv7\" (UID: \"b0e4dd1b-c7c3-4572-9518-fc34d9799979\") " pod="calico-apiserver/calico-apiserver-5878b694cc-lzmv7" Sep 9 05:03:22.847372 kubelet[2644]: I0909 05:03:22.847340 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn58p\" (UniqueName: \"kubernetes.io/projected/0eb5ed0c-378a-4cfa-9362-7644979ac62c-kube-api-access-gn58p\") pod 
\"calico-kube-controllers-7d46bd58b4-x87dl\" (UID: \"0eb5ed0c-378a-4cfa-9362-7644979ac62c\") " pod="calico-system/calico-kube-controllers-7d46bd58b4-x87dl" Sep 9 05:03:22.847372 kubelet[2644]: I0909 05:03:22.847366 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/950a549d-a68c-4846-bd7a-40dfe0e229ea-goldmane-key-pair\") pod \"goldmane-7988f88666-mwh5j\" (UID: \"950a549d-a68c-4846-bd7a-40dfe0e229ea\") " pod="calico-system/goldmane-7988f88666-mwh5j" Sep 9 05:03:22.847419 kubelet[2644]: I0909 05:03:22.847380 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eb5ed0c-378a-4cfa-9362-7644979ac62c-tigera-ca-bundle\") pod \"calico-kube-controllers-7d46bd58b4-x87dl\" (UID: \"0eb5ed0c-378a-4cfa-9362-7644979ac62c\") " pod="calico-system/calico-kube-controllers-7d46bd58b4-x87dl" Sep 9 05:03:22.847688 systemd[1]: Created slice kubepods-burstable-pod9a77ad90_2494_4662_8595_2d49ea8185e1.slice - libcontainer container kubepods-burstable-pod9a77ad90_2494_4662_8595_2d49ea8185e1.slice. Sep 9 05:03:22.853452 systemd[1]: Created slice kubepods-besteffort-pod0eb5ed0c_378a_4cfa_9362_7644979ac62c.slice - libcontainer container kubepods-besteffort-pod0eb5ed0c_378a_4cfa_9362_7644979ac62c.slice. Sep 9 05:03:22.860422 systemd[1]: Created slice kubepods-besteffort-podb0e4dd1b_c7c3_4572_9518_fc34d9799979.slice - libcontainer container kubepods-besteffort-podb0e4dd1b_c7c3_4572_9518_fc34d9799979.slice. Sep 9 05:03:22.865392 systemd[1]: Created slice kubepods-besteffort-poda1bf6ae3_ed8a_46d3_85b0_a85c680283d4.slice - libcontainer container kubepods-besteffort-poda1bf6ae3_ed8a_46d3_85b0_a85c680283d4.slice. 
Sep 9 05:03:23.118467 containerd[1498]: time="2025-09-09T05:03:23.118421687Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-mwh5j,Uid:950a549d-a68c-4846-bd7a-40dfe0e229ea,Namespace:calico-system,Attempt:0,}" Sep 9 05:03:23.129979 containerd[1498]: time="2025-09-09T05:03:23.129527898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-wgbtl,Uid:47c42195-8a89-46d2-89a4-d9b91ec95439,Namespace:kube-system,Attempt:0,}" Sep 9 05:03:23.141381 containerd[1498]: time="2025-09-09T05:03:23.141338313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5878b694cc-xhqwp,Uid:0fadd76c-7e28-42cc-9fa8-a89467a3987a,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:03:23.152914 containerd[1498]: time="2025-09-09T05:03:23.152851927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8972p,Uid:9a77ad90-2494-4662-8595-2d49ea8185e1,Namespace:kube-system,Attempt:0,}" Sep 9 05:03:23.156928 containerd[1498]: time="2025-09-09T05:03:23.156832985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7d46bd58b4-x87dl,Uid:0eb5ed0c-378a-4cfa-9362-7644979ac62c,Namespace:calico-system,Attempt:0,}" Sep 9 05:03:23.168487 containerd[1498]: time="2025-09-09T05:03:23.168429639Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5878b694cc-lzmv7,Uid:b0e4dd1b-c7c3-4572-9518-fc34d9799979,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:03:23.169505 containerd[1498]: time="2025-09-09T05:03:23.169449444Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-659b96fb96-4mphx,Uid:a1bf6ae3-ed8a-46d3-85b0-a85c680283d4,Namespace:calico-system,Attempt:0,}" Sep 9 05:03:23.251031 systemd[1]: Created slice kubepods-besteffort-poda7ba1d3b_babe_4d10_8083_57470bbd8f30.slice - libcontainer container kubepods-besteffort-poda7ba1d3b_babe_4d10_8083_57470bbd8f30.slice. 
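Annotation: the sandbox failures that follow all share one root cause. The Calico CNI plugin needs /var/lib/calico/nodename, a file the calico/node container writes once it is running, and at this point calico-node has not even been pulled yet (its PullImage entry appears further down). Until the file exists, every CNI ADD and DEL returns the "stat /var/lib/calico/nodename: no such file or directory" error that containerd and kubelet keep relaying. The sketch below illustrates that guard; it is a hedged reconstruction, not Calico's source, and readNodename is an assumed helper name.

// nodenamecheck.go - illustration of the missing-nodename condition behind
// the RunPodSandbox errors below.
package main

import (
	"errors"
	"fmt"
	"os"
	"strings"
)

const nodenameFile = "/var/lib/calico/nodename"

// readNodename returns the node name Calico should use, wrapping a missing
// file with the same hint that appears in the log.
func readNodename() (string, error) {
	data, err := os.ReadFile(nodenameFile)
	if err != nil {
		if errors.Is(err, os.ErrNotExist) {
			return "", fmt.Errorf("%w: check that the calico/node container is running and has mounted /var/lib/calico/", err)
		}
		return "", err
	}
	return strings.TrimSpace(string(data)), nil
}

func main() {
	name, err := readNodename()
	if err != nil {
		fmt.Println("cni setup would fail:", err)
		return
	}
	fmt.Println("calico nodename:", name)
}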
Sep 9 05:03:23.254387 containerd[1498]: time="2025-09-09T05:03:23.254277837Z" level=error msg="Failed to destroy network for sandbox \"53f78226f23cac054a003053b08195bdf89bac2b7d088488ef75e88f0e3c822c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.255987 containerd[1498]: time="2025-09-09T05:03:23.255947205Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-z6qqc,Uid:a7ba1d3b-babe-4d10-8083-57470bbd8f30,Namespace:calico-system,Attempt:0,}" Sep 9 05:03:23.256353 containerd[1498]: time="2025-09-09T05:03:23.256311686Z" level=error msg="Failed to destroy network for sandbox \"77cb7ce12166b31d285a4b1e80af4377cebf408f37e30c175e7fd65a0b4576ed\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.257056 containerd[1498]: time="2025-09-09T05:03:23.257010530Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-mwh5j,Uid:950a549d-a68c-4846-bd7a-40dfe0e229ea,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"53f78226f23cac054a003053b08195bdf89bac2b7d088488ef75e88f0e3c822c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.258995 kubelet[2644]: E0909 05:03:23.258927 2644 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"53f78226f23cac054a003053b08195bdf89bac2b7d088488ef75e88f0e3c822c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.259188 kubelet[2644]: E0909 05:03:23.259039 2644 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"53f78226f23cac054a003053b08195bdf89bac2b7d088488ef75e88f0e3c822c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-mwh5j" Sep 9 05:03:23.259188 kubelet[2644]: E0909 05:03:23.259062 2644 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"53f78226f23cac054a003053b08195bdf89bac2b7d088488ef75e88f0e3c822c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-mwh5j" Sep 9 05:03:23.259188 kubelet[2644]: E0909 05:03:23.259128 2644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-mwh5j_calico-system(950a549d-a68c-4846-bd7a-40dfe0e229ea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-mwh5j_calico-system(950a549d-a68c-4846-bd7a-40dfe0e229ea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"53f78226f23cac054a003053b08195bdf89bac2b7d088488ef75e88f0e3c822c\\\": 
plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-mwh5j" podUID="950a549d-a68c-4846-bd7a-40dfe0e229ea" Sep 9 05:03:23.261007 containerd[1498]: time="2025-09-09T05:03:23.260891108Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5878b694cc-xhqwp,Uid:0fadd76c-7e28-42cc-9fa8-a89467a3987a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"77cb7ce12166b31d285a4b1e80af4377cebf408f37e30c175e7fd65a0b4576ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.261181 kubelet[2644]: E0909 05:03:23.261119 2644 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"77cb7ce12166b31d285a4b1e80af4377cebf408f37e30c175e7fd65a0b4576ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.261281 kubelet[2644]: E0909 05:03:23.261179 2644 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"77cb7ce12166b31d285a4b1e80af4377cebf408f37e30c175e7fd65a0b4576ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5878b694cc-xhqwp" Sep 9 05:03:23.261281 kubelet[2644]: E0909 05:03:23.261213 2644 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"77cb7ce12166b31d285a4b1e80af4377cebf408f37e30c175e7fd65a0b4576ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5878b694cc-xhqwp" Sep 9 05:03:23.261281 kubelet[2644]: E0909 05:03:23.261261 2644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5878b694cc-xhqwp_calico-apiserver(0fadd76c-7e28-42cc-9fa8-a89467a3987a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5878b694cc-xhqwp_calico-apiserver(0fadd76c-7e28-42cc-9fa8-a89467a3987a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"77cb7ce12166b31d285a4b1e80af4377cebf408f37e30c175e7fd65a0b4576ed\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5878b694cc-xhqwp" podUID="0fadd76c-7e28-42cc-9fa8-a89467a3987a" Sep 9 05:03:23.269473 containerd[1498]: time="2025-09-09T05:03:23.269404907Z" level=error msg="Failed to destroy network for sandbox \"50a5474d7d53b342a4d8371f96a5a3f3b59894313eaf250c3d822dd271a7ac32\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 
05:03:23.272019 containerd[1498]: time="2025-09-09T05:03:23.271959159Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-wgbtl,Uid:47c42195-8a89-46d2-89a4-d9b91ec95439,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"50a5474d7d53b342a4d8371f96a5a3f3b59894313eaf250c3d822dd271a7ac32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.272261 kubelet[2644]: E0909 05:03:23.272206 2644 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50a5474d7d53b342a4d8371f96a5a3f3b59894313eaf250c3d822dd271a7ac32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.272389 kubelet[2644]: E0909 05:03:23.272261 2644 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50a5474d7d53b342a4d8371f96a5a3f3b59894313eaf250c3d822dd271a7ac32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-wgbtl" Sep 9 05:03:23.272389 kubelet[2644]: E0909 05:03:23.272285 2644 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50a5474d7d53b342a4d8371f96a5a3f3b59894313eaf250c3d822dd271a7ac32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-wgbtl" Sep 9 05:03:23.272389 kubelet[2644]: E0909 05:03:23.272331 2644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-wgbtl_kube-system(47c42195-8a89-46d2-89a4-d9b91ec95439)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-wgbtl_kube-system(47c42195-8a89-46d2-89a4-d9b91ec95439)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"50a5474d7d53b342a4d8371f96a5a3f3b59894313eaf250c3d822dd271a7ac32\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-wgbtl" podUID="47c42195-8a89-46d2-89a4-d9b91ec95439" Sep 9 05:03:23.287129 containerd[1498]: time="2025-09-09T05:03:23.287083589Z" level=error msg="Failed to destroy network for sandbox \"3ca08c915923b1931796c85080e7ccff43c92253fcf63d26bd96d3016b8922a6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.290561 containerd[1498]: time="2025-09-09T05:03:23.290423285Z" level=error msg="Failed to destroy network for sandbox \"fc700a4760a9e6405c0009438a25cd2ed63b517db1f5346996a1fd7bcabbd480\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 9 05:03:23.291117 containerd[1498]: time="2025-09-09T05:03:23.291079408Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8972p,Uid:9a77ad90-2494-4662-8595-2d49ea8185e1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ca08c915923b1931796c85080e7ccff43c92253fcf63d26bd96d3016b8922a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.291685 kubelet[2644]: E0909 05:03:23.291644 2644 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ca08c915923b1931796c85080e7ccff43c92253fcf63d26bd96d3016b8922a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.291767 kubelet[2644]: E0909 05:03:23.291708 2644 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ca08c915923b1931796c85080e7ccff43c92253fcf63d26bd96d3016b8922a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8972p" Sep 9 05:03:23.291767 kubelet[2644]: E0909 05:03:23.291727 2644 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ca08c915923b1931796c85080e7ccff43c92253fcf63d26bd96d3016b8922a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8972p" Sep 9 05:03:23.291850 kubelet[2644]: E0909 05:03:23.291768 2644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-8972p_kube-system(9a77ad90-2494-4662-8595-2d49ea8185e1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-8972p_kube-system(9a77ad90-2494-4662-8595-2d49ea8185e1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3ca08c915923b1931796c85080e7ccff43c92253fcf63d26bd96d3016b8922a6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-8972p" podUID="9a77ad90-2494-4662-8595-2d49ea8185e1" Sep 9 05:03:23.291959 containerd[1498]: time="2025-09-09T05:03:23.291919131Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5878b694cc-lzmv7,Uid:b0e4dd1b-c7c3-4572-9518-fc34d9799979,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc700a4760a9e6405c0009438a25cd2ed63b517db1f5346996a1fd7bcabbd480\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.292430 kubelet[2644]: E0909 05:03:23.292400 2644 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown 
desc = failed to setup network for sandbox \"fc700a4760a9e6405c0009438a25cd2ed63b517db1f5346996a1fd7bcabbd480\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.292512 kubelet[2644]: E0909 05:03:23.292447 2644 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc700a4760a9e6405c0009438a25cd2ed63b517db1f5346996a1fd7bcabbd480\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5878b694cc-lzmv7" Sep 9 05:03:23.292512 kubelet[2644]: E0909 05:03:23.292467 2644 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc700a4760a9e6405c0009438a25cd2ed63b517db1f5346996a1fd7bcabbd480\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5878b694cc-lzmv7" Sep 9 05:03:23.292687 kubelet[2644]: E0909 05:03:23.292549 2644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5878b694cc-lzmv7_calico-apiserver(b0e4dd1b-c7c3-4572-9518-fc34d9799979)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5878b694cc-lzmv7_calico-apiserver(b0e4dd1b-c7c3-4572-9518-fc34d9799979)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fc700a4760a9e6405c0009438a25cd2ed63b517db1f5346996a1fd7bcabbd480\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5878b694cc-lzmv7" podUID="b0e4dd1b-c7c3-4572-9518-fc34d9799979" Sep 9 05:03:23.298394 containerd[1498]: time="2025-09-09T05:03:23.298330881Z" level=error msg="Failed to destroy network for sandbox \"48a38b12c522f5d0c0321eb0f4e347c3bacde80f701939c3eecfb586b35cdc9c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.300915 containerd[1498]: time="2025-09-09T05:03:23.300870933Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-659b96fb96-4mphx,Uid:a1bf6ae3-ed8a-46d3-85b0-a85c680283d4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"48a38b12c522f5d0c0321eb0f4e347c3bacde80f701939c3eecfb586b35cdc9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.301122 kubelet[2644]: E0909 05:03:23.301048 2644 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48a38b12c522f5d0c0321eb0f4e347c3bacde80f701939c3eecfb586b35cdc9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 
05:03:23.301122 kubelet[2644]: E0909 05:03:23.301098 2644 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48a38b12c522f5d0c0321eb0f4e347c3bacde80f701939c3eecfb586b35cdc9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-659b96fb96-4mphx" Sep 9 05:03:23.301273 kubelet[2644]: E0909 05:03:23.301115 2644 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48a38b12c522f5d0c0321eb0f4e347c3bacde80f701939c3eecfb586b35cdc9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-659b96fb96-4mphx" Sep 9 05:03:23.301314 kubelet[2644]: E0909 05:03:23.301277 2644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-659b96fb96-4mphx_calico-system(a1bf6ae3-ed8a-46d3-85b0-a85c680283d4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-659b96fb96-4mphx_calico-system(a1bf6ae3-ed8a-46d3-85b0-a85c680283d4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"48a38b12c522f5d0c0321eb0f4e347c3bacde80f701939c3eecfb586b35cdc9c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-659b96fb96-4mphx" podUID="a1bf6ae3-ed8a-46d3-85b0-a85c680283d4" Sep 9 05:03:23.305181 containerd[1498]: time="2025-09-09T05:03:23.305133993Z" level=error msg="Failed to destroy network for sandbox \"04ae7537d20eaa043f671bf8a8c127fff65a50d354de222a4e78f04b346f6ac8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.306476 containerd[1498]: time="2025-09-09T05:03:23.306421879Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7d46bd58b4-x87dl,Uid:0eb5ed0c-378a-4cfa-9362-7644979ac62c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"04ae7537d20eaa043f671bf8a8c127fff65a50d354de222a4e78f04b346f6ac8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.306710 kubelet[2644]: E0909 05:03:23.306678 2644 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04ae7537d20eaa043f671bf8a8c127fff65a50d354de222a4e78f04b346f6ac8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.306773 kubelet[2644]: E0909 05:03:23.306730 2644 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04ae7537d20eaa043f671bf8a8c127fff65a50d354de222a4e78f04b346f6ac8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7d46bd58b4-x87dl" Sep 9 05:03:23.306773 kubelet[2644]: E0909 05:03:23.306748 2644 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04ae7537d20eaa043f671bf8a8c127fff65a50d354de222a4e78f04b346f6ac8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7d46bd58b4-x87dl" Sep 9 05:03:23.306898 kubelet[2644]: E0909 05:03:23.306794 2644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7d46bd58b4-x87dl_calico-system(0eb5ed0c-378a-4cfa-9362-7644979ac62c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7d46bd58b4-x87dl_calico-system(0eb5ed0c-378a-4cfa-9362-7644979ac62c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"04ae7537d20eaa043f671bf8a8c127fff65a50d354de222a4e78f04b346f6ac8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7d46bd58b4-x87dl" podUID="0eb5ed0c-378a-4cfa-9362-7644979ac62c" Sep 9 05:03:23.325592 containerd[1498]: time="2025-09-09T05:03:23.325547207Z" level=error msg="Failed to destroy network for sandbox \"2dab40124349ab27eec52f267319dba378fb311b4854a941d05492922e4b9f6a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.326555 containerd[1498]: time="2025-09-09T05:03:23.326522252Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-z6qqc,Uid:a7ba1d3b-babe-4d10-8083-57470bbd8f30,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dab40124349ab27eec52f267319dba378fb311b4854a941d05492922e4b9f6a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.326810 kubelet[2644]: E0909 05:03:23.326744 2644 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dab40124349ab27eec52f267319dba378fb311b4854a941d05492922e4b9f6a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:03:23.326899 kubelet[2644]: E0909 05:03:23.326823 2644 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dab40124349ab27eec52f267319dba378fb311b4854a941d05492922e4b9f6a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-z6qqc" Sep 9 05:03:23.326899 kubelet[2644]: E0909 05:03:23.326854 2644 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"2dab40124349ab27eec52f267319dba378fb311b4854a941d05492922e4b9f6a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-z6qqc" Sep 9 05:03:23.326952 kubelet[2644]: E0909 05:03:23.326909 2644 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-z6qqc_calico-system(a7ba1d3b-babe-4d10-8083-57470bbd8f30)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-z6qqc_calico-system(a7ba1d3b-babe-4d10-8083-57470bbd8f30)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2dab40124349ab27eec52f267319dba378fb311b4854a941d05492922e4b9f6a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-z6qqc" podUID="a7ba1d3b-babe-4d10-8083-57470bbd8f30" Sep 9 05:03:23.347259 containerd[1498]: time="2025-09-09T05:03:23.347218388Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 05:03:26.460215 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1574217546.mount: Deactivated successfully. Sep 9 05:03:26.597099 containerd[1498]: time="2025-09-09T05:03:26.597040002Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:26.618519 containerd[1498]: time="2025-09-09T05:03:26.597528524Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 9 05:03:26.618519 containerd[1498]: time="2025-09-09T05:03:26.599061730Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:26.618809 containerd[1498]: time="2025-09-09T05:03:26.611841303Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 3.264580675s" Sep 9 05:03:26.618809 containerd[1498]: time="2025-09-09T05:03:26.618722132Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 9 05:03:26.619096 containerd[1498]: time="2025-09-09T05:03:26.619052254Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:26.628545 containerd[1498]: time="2025-09-09T05:03:26.628503573Z" level=info msg="CreateContainer within sandbox \"68f61b868d7e36753b9ca92035ef9f8d7a2342e7ebad6fea03b2e866fb1e377c\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 05:03:26.642429 containerd[1498]: time="2025-09-09T05:03:26.642384191Z" level=info msg="Container 46c02b4e316b8d01b1461971a0da5028cc60e3283e30bc5eaaf4e9a1cbd36e4a: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:26.660884 
containerd[1498]: time="2025-09-09T05:03:26.660749548Z" level=info msg="CreateContainer within sandbox \"68f61b868d7e36753b9ca92035ef9f8d7a2342e7ebad6fea03b2e866fb1e377c\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"46c02b4e316b8d01b1461971a0da5028cc60e3283e30bc5eaaf4e9a1cbd36e4a\"" Sep 9 05:03:26.661401 containerd[1498]: time="2025-09-09T05:03:26.661374071Z" level=info msg="StartContainer for \"46c02b4e316b8d01b1461971a0da5028cc60e3283e30bc5eaaf4e9a1cbd36e4a\"" Sep 9 05:03:26.662930 containerd[1498]: time="2025-09-09T05:03:26.662829957Z" level=info msg="connecting to shim 46c02b4e316b8d01b1461971a0da5028cc60e3283e30bc5eaaf4e9a1cbd36e4a" address="unix:///run/containerd/s/7a71b9ee8541ff8121720d83121f25ffe07e860fd42a1127011479b3a2d9e31f" protocol=ttrpc version=3 Sep 9 05:03:26.686388 systemd[1]: Started cri-containerd-46c02b4e316b8d01b1461971a0da5028cc60e3283e30bc5eaaf4e9a1cbd36e4a.scope - libcontainer container 46c02b4e316b8d01b1461971a0da5028cc60e3283e30bc5eaaf4e9a1cbd36e4a. Sep 9 05:03:26.720679 containerd[1498]: time="2025-09-09T05:03:26.720585959Z" level=info msg="StartContainer for \"46c02b4e316b8d01b1461971a0da5028cc60e3283e30bc5eaaf4e9a1cbd36e4a\" returns successfully" Sep 9 05:03:26.848126 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 05:03:26.848259 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 9 05:03:27.076420 kubelet[2644]: I0909 05:03:27.076342 2644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4-whisker-ca-bundle\") pod \"a1bf6ae3-ed8a-46d3-85b0-a85c680283d4\" (UID: \"a1bf6ae3-ed8a-46d3-85b0-a85c680283d4\") " Sep 9 05:03:27.076853 kubelet[2644]: I0909 05:03:27.076414 2644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4-whisker-backend-key-pair\") pod \"a1bf6ae3-ed8a-46d3-85b0-a85c680283d4\" (UID: \"a1bf6ae3-ed8a-46d3-85b0-a85c680283d4\") " Sep 9 05:03:27.076853 kubelet[2644]: I0909 05:03:27.076456 2644 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2dmd\" (UniqueName: \"kubernetes.io/projected/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4-kube-api-access-r2dmd\") pod \"a1bf6ae3-ed8a-46d3-85b0-a85c680283d4\" (UID: \"a1bf6ae3-ed8a-46d3-85b0-a85c680283d4\") " Sep 9 05:03:27.085236 kubelet[2644]: I0909 05:03:27.084433 2644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "a1bf6ae3-ed8a-46d3-85b0-a85c680283d4" (UID: "a1bf6ae3-ed8a-46d3-85b0-a85c680283d4"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 9 05:03:27.085236 kubelet[2644]: I0909 05:03:27.084696 2644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4-kube-api-access-r2dmd" (OuterVolumeSpecName: "kube-api-access-r2dmd") pod "a1bf6ae3-ed8a-46d3-85b0-a85c680283d4" (UID: "a1bf6ae3-ed8a-46d3-85b0-a85c680283d4"). InnerVolumeSpecName "kube-api-access-r2dmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 9 05:03:27.086744 kubelet[2644]: I0909 05:03:27.086675 2644 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "a1bf6ae3-ed8a-46d3-85b0-a85c680283d4" (UID: "a1bf6ae3-ed8a-46d3-85b0-a85c680283d4"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 9 05:03:27.177515 kubelet[2644]: I0909 05:03:27.177476 2644 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 9 05:03:27.177515 kubelet[2644]: I0909 05:03:27.177510 2644 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 9 05:03:27.177515 kubelet[2644]: I0909 05:03:27.177520 2644 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2dmd\" (UniqueName: \"kubernetes.io/projected/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4-kube-api-access-r2dmd\") on node \"localhost\" DevicePath \"\"" Sep 9 05:03:27.257697 systemd[1]: Removed slice kubepods-besteffort-poda1bf6ae3_ed8a_46d3_85b0_a85c680283d4.slice - libcontainer container kubepods-besteffort-poda1bf6ae3_ed8a_46d3_85b0_a85c680283d4.slice. Sep 9 05:03:27.394923 kubelet[2644]: I0909 05:03:27.394460 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-nw9hr" podStartSLOduration=1.522354629 podStartE2EDuration="12.394440129s" podCreationTimestamp="2025-09-09 05:03:15 +0000 UTC" firstStartedPulling="2025-09-09 05:03:15.747601356 +0000 UTC m=+20.586817433" lastFinishedPulling="2025-09-09 05:03:26.619686816 +0000 UTC m=+31.458902933" observedRunningTime="2025-09-09 05:03:27.394274448 +0000 UTC m=+32.233490565" watchObservedRunningTime="2025-09-09 05:03:27.394440129 +0000 UTC m=+32.233656206" Sep 9 05:03:27.406142 systemd[1]: Created slice kubepods-besteffort-podc2612133_65b2_4c9f_a1be_0ac0f824dade.slice - libcontainer container kubepods-besteffort-podc2612133_65b2_4c9f_a1be_0ac0f824dade.slice. Sep 9 05:03:27.460301 systemd[1]: var-lib-kubelet-pods-a1bf6ae3\x2ded8a\x2d46d3\x2d85b0\x2da85c680283d4-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dr2dmd.mount: Deactivated successfully. Sep 9 05:03:27.460409 systemd[1]: var-lib-kubelet-pods-a1bf6ae3\x2ded8a\x2d46d3\x2d85b0\x2da85c680283d4-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 9 05:03:27.478684 kubelet[2644]: I0909 05:03:27.478639 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2612133-65b2-4c9f-a1be-0ac0f824dade-whisker-ca-bundle\") pod \"whisker-5b8c5bd7d7-4dxx9\" (UID: \"c2612133-65b2-4c9f-a1be-0ac0f824dade\") " pod="calico-system/whisker-5b8c5bd7d7-4dxx9" Sep 9 05:03:27.478817 kubelet[2644]: I0909 05:03:27.478703 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8wmn\" (UniqueName: \"kubernetes.io/projected/c2612133-65b2-4c9f-a1be-0ac0f824dade-kube-api-access-g8wmn\") pod \"whisker-5b8c5bd7d7-4dxx9\" (UID: \"c2612133-65b2-4c9f-a1be-0ac0f824dade\") " pod="calico-system/whisker-5b8c5bd7d7-4dxx9" Sep 9 05:03:27.478817 kubelet[2644]: I0909 05:03:27.478798 2644 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c2612133-65b2-4c9f-a1be-0ac0f824dade-whisker-backend-key-pair\") pod \"whisker-5b8c5bd7d7-4dxx9\" (UID: \"c2612133-65b2-4c9f-a1be-0ac0f824dade\") " pod="calico-system/whisker-5b8c5bd7d7-4dxx9" Sep 9 05:03:27.709626 containerd[1498]: time="2025-09-09T05:03:27.709525846Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b8c5bd7d7-4dxx9,Uid:c2612133-65b2-4c9f-a1be-0ac0f824dade,Namespace:calico-system,Attempt:0,}" Sep 9 05:03:27.856636 systemd-networkd[1442]: cali9c144cda579: Link UP Sep 9 05:03:27.857267 systemd-networkd[1442]: cali9c144cda579: Gained carrier Sep 9 05:03:27.874972 containerd[1498]: 2025-09-09 05:03:27.730 [INFO][3827] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:03:27.874972 containerd[1498]: 2025-09-09 05:03:27.759 [INFO][3827] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0 whisker-5b8c5bd7d7- calico-system c2612133-65b2-4c9f-a1be-0ac0f824dade 857 0 2025-09-09 05:03:27 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5b8c5bd7d7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-5b8c5bd7d7-4dxx9 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali9c144cda579 [] [] }} ContainerID="13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" Namespace="calico-system" Pod="whisker-5b8c5bd7d7-4dxx9" WorkloadEndpoint="localhost-k8s-whisker--5b8c5bd7d7--4dxx9-" Sep 9 05:03:27.874972 containerd[1498]: 2025-09-09 05:03:27.759 [INFO][3827] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" Namespace="calico-system" Pod="whisker-5b8c5bd7d7-4dxx9" WorkloadEndpoint="localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0" Sep 9 05:03:27.874972 containerd[1498]: 2025-09-09 05:03:27.811 [INFO][3842] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" HandleID="k8s-pod-network.13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" Workload="localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0" Sep 9 05:03:27.875188 containerd[1498]: 2025-09-09 05:03:27.812 [INFO][3842] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" 
HandleID="k8s-pod-network.13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" Workload="localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d910), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-5b8c5bd7d7-4dxx9", "timestamp":"2025-09-09 05:03:27.811976662 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:03:27.875188 containerd[1498]: 2025-09-09 05:03:27.812 [INFO][3842] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:03:27.875188 containerd[1498]: 2025-09-09 05:03:27.812 [INFO][3842] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:03:27.875188 containerd[1498]: 2025-09-09 05:03:27.812 [INFO][3842] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:03:27.875188 containerd[1498]: 2025-09-09 05:03:27.823 [INFO][3842] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" host="localhost" Sep 9 05:03:27.875188 containerd[1498]: 2025-09-09 05:03:27.829 [INFO][3842] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:03:27.875188 containerd[1498]: 2025-09-09 05:03:27.833 [INFO][3842] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:03:27.875188 containerd[1498]: 2025-09-09 05:03:27.835 [INFO][3842] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:27.875188 containerd[1498]: 2025-09-09 05:03:27.836 [INFO][3842] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:27.875188 containerd[1498]: 2025-09-09 05:03:27.836 [INFO][3842] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" host="localhost" Sep 9 05:03:27.875457 containerd[1498]: 2025-09-09 05:03:27.838 [INFO][3842] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a Sep 9 05:03:27.875457 containerd[1498]: 2025-09-09 05:03:27.841 [INFO][3842] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" host="localhost" Sep 9 05:03:27.875457 containerd[1498]: 2025-09-09 05:03:27.846 [INFO][3842] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" host="localhost" Sep 9 05:03:27.875457 containerd[1498]: 2025-09-09 05:03:27.847 [INFO][3842] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" host="localhost" Sep 9 05:03:27.875457 containerd[1498]: 2025-09-09 05:03:27.847 [INFO][3842] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:03:27.875457 containerd[1498]: 2025-09-09 05:03:27.847 [INFO][3842] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" HandleID="k8s-pod-network.13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" Workload="localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0" Sep 9 05:03:27.875566 containerd[1498]: 2025-09-09 05:03:27.849 [INFO][3827] cni-plugin/k8s.go 418: Populated endpoint ContainerID="13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" Namespace="calico-system" Pod="whisker-5b8c5bd7d7-4dxx9" WorkloadEndpoint="localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0", GenerateName:"whisker-5b8c5bd7d7-", Namespace:"calico-system", SelfLink:"", UID:"c2612133-65b2-4c9f-a1be-0ac0f824dade", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5b8c5bd7d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-5b8c5bd7d7-4dxx9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9c144cda579", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:27.875566 containerd[1498]: 2025-09-09 05:03:27.849 [INFO][3827] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" Namespace="calico-system" Pod="whisker-5b8c5bd7d7-4dxx9" WorkloadEndpoint="localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0" Sep 9 05:03:27.875636 containerd[1498]: 2025-09-09 05:03:27.850 [INFO][3827] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9c144cda579 ContainerID="13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" Namespace="calico-system" Pod="whisker-5b8c5bd7d7-4dxx9" WorkloadEndpoint="localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0" Sep 9 05:03:27.875636 containerd[1498]: 2025-09-09 05:03:27.859 [INFO][3827] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" Namespace="calico-system" Pod="whisker-5b8c5bd7d7-4dxx9" WorkloadEndpoint="localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0" Sep 9 05:03:27.875673 containerd[1498]: 2025-09-09 05:03:27.860 [INFO][3827] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" Namespace="calico-system" Pod="whisker-5b8c5bd7d7-4dxx9" WorkloadEndpoint="localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0", GenerateName:"whisker-5b8c5bd7d7-", Namespace:"calico-system", SelfLink:"", UID:"c2612133-65b2-4c9f-a1be-0ac0f824dade", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5b8c5bd7d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a", Pod:"whisker-5b8c5bd7d7-4dxx9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9c144cda579", MAC:"e2:af:9d:59:33:6c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:27.875721 containerd[1498]: 2025-09-09 05:03:27.872 [INFO][3827] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" Namespace="calico-system" Pod="whisker-5b8c5bd7d7-4dxx9" WorkloadEndpoint="localhost-k8s-whisker--5b8c5bd7d7--4dxx9-eth0" Sep 9 05:03:27.935299 containerd[1498]: time="2025-09-09T05:03:27.935238722Z" level=info msg="connecting to shim 13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a" address="unix:///run/containerd/s/cdd21a908ca312d51d0b6508eb636eaa884ab4dd3a6a242b436cd410616154d7" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:03:27.959360 systemd[1]: Started cri-containerd-13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a.scope - libcontainer container 13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a. 
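Each workload container in this log goes through the same containerd sequence: PullImage, CreateContainer within the pod's sandbox, "connecting to shim" over ttrpc, then StartContainer. Outside of CRI, roughly the same steps can be driven directly with containerd's Go client; a minimal sketch assuming the containerd 1.x client library (github.com/containerd/containerd) and the k8s.io namespace the kubelet uses on this host (it creates a standalone container rather than joining an existing sandbox):

package main

import (
	"context"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/cio"
	"github.com/containerd/containerd/namespaces"
	"github.com/containerd/containerd/oci"
)

func main() {
	// Same socket the kubelet's CRI connection uses on this host.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed images and containers live in the k8s.io namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// PullImage
	image, err := client.Pull(ctx, "ghcr.io/flatcar/calico/whisker:v3.30.3", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}

	// CreateContainer (standalone, for illustration only)
	container, err := client.NewContainer(ctx, "whisker-demo",
		containerd.WithNewSnapshot("whisker-demo-snap", image),
		containerd.WithNewSpec(oci.WithImageConfig(image)))
	if err != nil {
		log.Fatal(err)
	}
	defer container.Delete(ctx, containerd.WithSnapshotCleanup)

	// StartContainer: creating the task is what spawns the shim process.
	task, err := container.NewTask(ctx, cio.NewCreator(cio.WithStdio))
	if err != nil {
		log.Fatal(err)
	}
	defer task.Delete(ctx)

	if err := task.Start(ctx); err != nil {
		log.Fatal(err)
	}
	log.Println("started", container.ID())
}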
Sep 9 05:03:27.969839 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:03:27.991875 containerd[1498]: time="2025-09-09T05:03:27.991826671Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b8c5bd7d7-4dxx9,Uid:c2612133-65b2-4c9f-a1be-0ac0f824dade,Namespace:calico-system,Attempt:0,} returns sandbox id \"13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a\"" Sep 9 05:03:27.994320 containerd[1498]: time="2025-09-09T05:03:27.994290041Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 05:03:28.486091 containerd[1498]: time="2025-09-09T05:03:28.486055656Z" level=info msg="TaskExit event in podsandbox handler container_id:\"46c02b4e316b8d01b1461971a0da5028cc60e3283e30bc5eaaf4e9a1cbd36e4a\" id:\"faa9dd23170aba3bd8da5edbec5caa8171eec673b3531539cbe2ee6f95958e43\" pid:4013 exit_status:1 exited_at:{seconds:1757394208 nanos:485762134}" Sep 9 05:03:29.118170 containerd[1498]: time="2025-09-09T05:03:29.118119047Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:29.118622 containerd[1498]: time="2025-09-09T05:03:29.118585889Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 9 05:03:29.119501 containerd[1498]: time="2025-09-09T05:03:29.119480852Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:29.121670 containerd[1498]: time="2025-09-09T05:03:29.121330099Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:29.122133 containerd[1498]: time="2025-09-09T05:03:29.122107742Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.127764901s" Sep 9 05:03:29.122174 containerd[1498]: time="2025-09-09T05:03:29.122139703Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 9 05:03:29.125080 containerd[1498]: time="2025-09-09T05:03:29.124950353Z" level=info msg="CreateContainer within sandbox \"13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 05:03:29.133073 containerd[1498]: time="2025-09-09T05:03:29.133042344Z" level=info msg="Container b24e752b6c633470e8a2b15979e76c2f777abfc039a3d9566546deb84cf8d896: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:29.139635 containerd[1498]: time="2025-09-09T05:03:29.139606769Z" level=info msg="CreateContainer within sandbox \"13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"b24e752b6c633470e8a2b15979e76c2f777abfc039a3d9566546deb84cf8d896\"" Sep 9 05:03:29.140218 containerd[1498]: time="2025-09-09T05:03:29.140166251Z" level=info msg="StartContainer for 
\"b24e752b6c633470e8a2b15979e76c2f777abfc039a3d9566546deb84cf8d896\"" Sep 9 05:03:29.141411 containerd[1498]: time="2025-09-09T05:03:29.141386696Z" level=info msg="connecting to shim b24e752b6c633470e8a2b15979e76c2f777abfc039a3d9566546deb84cf8d896" address="unix:///run/containerd/s/cdd21a908ca312d51d0b6508eb636eaa884ab4dd3a6a242b436cd410616154d7" protocol=ttrpc version=3 Sep 9 05:03:29.160336 systemd[1]: Started cri-containerd-b24e752b6c633470e8a2b15979e76c2f777abfc039a3d9566546deb84cf8d896.scope - libcontainer container b24e752b6c633470e8a2b15979e76c2f777abfc039a3d9566546deb84cf8d896. Sep 9 05:03:29.185329 systemd-networkd[1442]: cali9c144cda579: Gained IPv6LL Sep 9 05:03:29.195935 containerd[1498]: time="2025-09-09T05:03:29.195890704Z" level=info msg="StartContainer for \"b24e752b6c633470e8a2b15979e76c2f777abfc039a3d9566546deb84cf8d896\" returns successfully" Sep 9 05:03:29.196963 containerd[1498]: time="2025-09-09T05:03:29.196940468Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 05:03:29.246813 kubelet[2644]: I0909 05:03:29.246631 2644 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1bf6ae3-ed8a-46d3-85b0-a85c680283d4" path="/var/lib/kubelet/pods/a1bf6ae3-ed8a-46d3-85b0-a85c680283d4/volumes" Sep 9 05:03:29.449997 containerd[1498]: time="2025-09-09T05:03:29.449897993Z" level=info msg="TaskExit event in podsandbox handler container_id:\"46c02b4e316b8d01b1461971a0da5028cc60e3283e30bc5eaaf4e9a1cbd36e4a\" id:\"b6f23ef3a3065153ccff5d489ef0381389e462d1b8dceb830bacc7cb574a50ae\" pid:4099 exit_status:1 exited_at:{seconds:1757394209 nanos:449617752}" Sep 9 05:03:30.686646 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3056783389.mount: Deactivated successfully. Sep 9 05:03:30.728217 containerd[1498]: time="2025-09-09T05:03:30.728159033Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:30.729218 containerd[1498]: time="2025-09-09T05:03:30.728162513Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 9 05:03:30.730898 containerd[1498]: time="2025-09-09T05:03:30.730843163Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:30.736220 containerd[1498]: time="2025-09-09T05:03:30.735535380Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:30.736220 containerd[1498]: time="2025-09-09T05:03:30.736167342Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.539198514s" Sep 9 05:03:30.736220 containerd[1498]: time="2025-09-09T05:03:30.736209143Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 9 05:03:30.740875 containerd[1498]: time="2025-09-09T05:03:30.740826360Z" level=info 
msg="CreateContainer within sandbox \"13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 05:03:30.756460 containerd[1498]: time="2025-09-09T05:03:30.756391817Z" level=info msg="Container 226004fdb550deda8403187f05e0eca33661d2fefb12a841c4f1f044d0be2db7: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:30.762605 containerd[1498]: time="2025-09-09T05:03:30.762574280Z" level=info msg="CreateContainer within sandbox \"13294b5c787f17992ff244bb4f84fb197bbe3ea860fc422e46983f1ceb85cc0a\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"226004fdb550deda8403187f05e0eca33661d2fefb12a841c4f1f044d0be2db7\"" Sep 9 05:03:30.763056 containerd[1498]: time="2025-09-09T05:03:30.763034762Z" level=info msg="StartContainer for \"226004fdb550deda8403187f05e0eca33661d2fefb12a841c4f1f044d0be2db7\"" Sep 9 05:03:30.764387 containerd[1498]: time="2025-09-09T05:03:30.764341567Z" level=info msg="connecting to shim 226004fdb550deda8403187f05e0eca33661d2fefb12a841c4f1f044d0be2db7" address="unix:///run/containerd/s/cdd21a908ca312d51d0b6508eb636eaa884ab4dd3a6a242b436cd410616154d7" protocol=ttrpc version=3 Sep 9 05:03:30.788387 systemd[1]: Started cri-containerd-226004fdb550deda8403187f05e0eca33661d2fefb12a841c4f1f044d0be2db7.scope - libcontainer container 226004fdb550deda8403187f05e0eca33661d2fefb12a841c4f1f044d0be2db7. Sep 9 05:03:30.824872 containerd[1498]: time="2025-09-09T05:03:30.824749631Z" level=info msg="StartContainer for \"226004fdb550deda8403187f05e0eca33661d2fefb12a841c4f1f044d0be2db7\" returns successfully" Sep 9 05:03:31.410910 kubelet[2644]: I0909 05:03:31.410607 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5b8c5bd7d7-4dxx9" podStartSLOduration=1.664264963 podStartE2EDuration="4.410589401s" podCreationTimestamp="2025-09-09 05:03:27 +0000 UTC" firstStartedPulling="2025-09-09 05:03:27.993079516 +0000 UTC m=+32.832295633" lastFinishedPulling="2025-09-09 05:03:30.739403954 +0000 UTC m=+35.578620071" observedRunningTime="2025-09-09 05:03:31.40750051 +0000 UTC m=+36.246716627" watchObservedRunningTime="2025-09-09 05:03:31.410589401 +0000 UTC m=+36.249805478" Sep 9 05:03:32.995413 kubelet[2644]: I0909 05:03:32.995351 2644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:03:33.804569 systemd-networkd[1442]: vxlan.calico: Link UP Sep 9 05:03:33.804575 systemd-networkd[1442]: vxlan.calico: Gained carrier Sep 9 05:03:34.244616 containerd[1498]: time="2025-09-09T05:03:34.244567794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8972p,Uid:9a77ad90-2494-4662-8595-2d49ea8185e1,Namespace:kube-system,Attempt:0,}" Sep 9 05:03:34.245018 containerd[1498]: time="2025-09-09T05:03:34.244570634Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5878b694cc-lzmv7,Uid:b0e4dd1b-c7c3-4572-9518-fc34d9799979,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:03:34.245018 containerd[1498]: time="2025-09-09T05:03:34.244570514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5878b694cc-xhqwp,Uid:0fadd76c-7e28-42cc-9fa8-a89467a3987a,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:03:34.411676 systemd-networkd[1442]: cali55d7f3450d0: Link UP Sep 9 05:03:34.411804 systemd-networkd[1442]: cali55d7f3450d0: Gained carrier Sep 9 05:03:34.424251 containerd[1498]: 2025-09-09 05:03:34.340 [INFO][4386] cni-plugin/plugin.go 340: Calico CNI found existing 
endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0 calico-apiserver-5878b694cc- calico-apiserver b0e4dd1b-c7c3-4572-9518-fc34d9799979 796 0 2025-09-09 05:03:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5878b694cc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5878b694cc-lzmv7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali55d7f3450d0 [] [] }} ContainerID="698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-lzmv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--lzmv7-" Sep 9 05:03:34.424251 containerd[1498]: 2025-09-09 05:03:34.340 [INFO][4386] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-lzmv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0" Sep 9 05:03:34.424251 containerd[1498]: 2025-09-09 05:03:34.369 [INFO][4411] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" HandleID="k8s-pod-network.698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" Workload="localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0" Sep 9 05:03:34.424458 containerd[1498]: 2025-09-09 05:03:34.369 [INFO][4411] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" HandleID="k8s-pod-network.698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" Workload="localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001374f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5878b694cc-lzmv7", "timestamp":"2025-09-09 05:03:34.369732691 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:03:34.424458 containerd[1498]: 2025-09-09 05:03:34.369 [INFO][4411] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:03:34.424458 containerd[1498]: 2025-09-09 05:03:34.369 [INFO][4411] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:03:34.424458 containerd[1498]: 2025-09-09 05:03:34.369 [INFO][4411] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:03:34.424458 containerd[1498]: 2025-09-09 05:03:34.383 [INFO][4411] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" host="localhost" Sep 9 05:03:34.424458 containerd[1498]: 2025-09-09 05:03:34.387 [INFO][4411] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:03:34.424458 containerd[1498]: 2025-09-09 05:03:34.391 [INFO][4411] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:03:34.424458 containerd[1498]: 2025-09-09 05:03:34.394 [INFO][4411] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:34.424458 containerd[1498]: 2025-09-09 05:03:34.396 [INFO][4411] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:34.424458 containerd[1498]: 2025-09-09 05:03:34.396 [INFO][4411] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" host="localhost" Sep 9 05:03:34.424681 containerd[1498]: 2025-09-09 05:03:34.397 [INFO][4411] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6 Sep 9 05:03:34.424681 containerd[1498]: 2025-09-09 05:03:34.401 [INFO][4411] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" host="localhost" Sep 9 05:03:34.424681 containerd[1498]: 2025-09-09 05:03:34.406 [INFO][4411] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" host="localhost" Sep 9 05:03:34.424681 containerd[1498]: 2025-09-09 05:03:34.406 [INFO][4411] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" host="localhost" Sep 9 05:03:34.424681 containerd[1498]: 2025-09-09 05:03:34.406 [INFO][4411] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:03:34.424681 containerd[1498]: 2025-09-09 05:03:34.406 [INFO][4411] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" HandleID="k8s-pod-network.698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" Workload="localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0" Sep 9 05:03:34.424792 containerd[1498]: 2025-09-09 05:03:34.408 [INFO][4386] cni-plugin/k8s.go 418: Populated endpoint ContainerID="698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-lzmv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0", GenerateName:"calico-apiserver-5878b694cc-", Namespace:"calico-apiserver", SelfLink:"", UID:"b0e4dd1b-c7c3-4572-9518-fc34d9799979", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5878b694cc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5878b694cc-lzmv7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali55d7f3450d0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:34.424844 containerd[1498]: 2025-09-09 05:03:34.409 [INFO][4386] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-lzmv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0" Sep 9 05:03:34.424844 containerd[1498]: 2025-09-09 05:03:34.409 [INFO][4386] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali55d7f3450d0 ContainerID="698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-lzmv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0" Sep 9 05:03:34.424844 containerd[1498]: 2025-09-09 05:03:34.410 [INFO][4386] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-lzmv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0" Sep 9 05:03:34.424920 containerd[1498]: 2025-09-09 05:03:34.411 [INFO][4386] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-lzmv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0", GenerateName:"calico-apiserver-5878b694cc-", Namespace:"calico-apiserver", SelfLink:"", UID:"b0e4dd1b-c7c3-4572-9518-fc34d9799979", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5878b694cc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6", Pod:"calico-apiserver-5878b694cc-lzmv7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali55d7f3450d0", MAC:"9e:05:b0:1a:b3:92", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:34.424973 containerd[1498]: 2025-09-09 05:03:34.422 [INFO][4386] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-lzmv7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--lzmv7-eth0" Sep 9 05:03:34.447412 containerd[1498]: time="2025-09-09T05:03:34.447366831Z" level=info msg="connecting to shim 698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6" address="unix:///run/containerd/s/17fbe378be2182709102324d098acd674703f0c05279ab02becb4ebf8d7ba20a" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:03:34.471361 systemd[1]: Started cri-containerd-698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6.scope - libcontainer container 698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6. 
Sep 9 05:03:34.495362 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:03:34.517475 systemd-networkd[1442]: cali628e1ad45c0: Link UP Sep 9 05:03:34.519833 systemd-networkd[1442]: cali628e1ad45c0: Gained carrier Sep 9 05:03:34.535427 containerd[1498]: time="2025-09-09T05:03:34.535381564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5878b694cc-lzmv7,Uid:b0e4dd1b-c7c3-4572-9518-fc34d9799979,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6\"" Sep 9 05:03:34.536238 containerd[1498]: 2025-09-09 05:03:34.337 [INFO][4368] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--8972p-eth0 coredns-7c65d6cfc9- kube-system 9a77ad90-2494-4662-8595-2d49ea8185e1 793 0 2025-09-09 05:03:01 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-8972p eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali628e1ad45c0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8972p" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8972p-" Sep 9 05:03:34.536238 containerd[1498]: 2025-09-09 05:03:34.337 [INFO][4368] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8972p" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8972p-eth0" Sep 9 05:03:34.536238 containerd[1498]: 2025-09-09 05:03:34.372 [INFO][4413] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" HandleID="k8s-pod-network.22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" Workload="localhost-k8s-coredns--7c65d6cfc9--8972p-eth0" Sep 9 05:03:34.536437 containerd[1498]: 2025-09-09 05:03:34.372 [INFO][4413] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" HandleID="k8s-pod-network.22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" Workload="localhost-k8s-coredns--7c65d6cfc9--8972p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c2fa0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-8972p", "timestamp":"2025-09-09 05:03:34.372716941 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:03:34.536437 containerd[1498]: 2025-09-09 05:03:34.373 [INFO][4413] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:03:34.536437 containerd[1498]: 2025-09-09 05:03:34.406 [INFO][4413] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:03:34.536437 containerd[1498]: 2025-09-09 05:03:34.406 [INFO][4413] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:03:34.536437 containerd[1498]: 2025-09-09 05:03:34.483 [INFO][4413] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" host="localhost" Sep 9 05:03:34.536437 containerd[1498]: 2025-09-09 05:03:34.488 [INFO][4413] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:03:34.536437 containerd[1498]: 2025-09-09 05:03:34.492 [INFO][4413] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:03:34.536437 containerd[1498]: 2025-09-09 05:03:34.494 [INFO][4413] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:34.536437 containerd[1498]: 2025-09-09 05:03:34.497 [INFO][4413] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:34.536437 containerd[1498]: 2025-09-09 05:03:34.497 [INFO][4413] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" host="localhost" Sep 9 05:03:34.536661 containerd[1498]: 2025-09-09 05:03:34.498 [INFO][4413] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40 Sep 9 05:03:34.536661 containerd[1498]: 2025-09-09 05:03:34.503 [INFO][4413] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" host="localhost" Sep 9 05:03:34.536661 containerd[1498]: 2025-09-09 05:03:34.508 [INFO][4413] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" host="localhost" Sep 9 05:03:34.536661 containerd[1498]: 2025-09-09 05:03:34.508 [INFO][4413] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" host="localhost" Sep 9 05:03:34.536661 containerd[1498]: 2025-09-09 05:03:34.508 [INFO][4413] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
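Note how the three IPAM requests running concurrently here (handlers [4411], [4413], [4426]) serialize on the host-wide IPAM lock: each acquires it only after the previous one logs "Released host-wide IPAM lock", which is why the pods receive consecutive addresses (.130, .131, and .132 below; .129 went to the whisker pod earlier). Conceptually this is mutual exclusion around a shared allocator; a toy sketch, not Calico's implementation, and the goroutine scheduling (unlike the log) does not guarantee which pod gets which address:

package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// Toy model of a host-wide IPAM lock: concurrent CNI ADDs serialize on one
// mutex, so each request sees the allocations made by the ones before it.
type hostIPAM struct {
	mu   sync.Mutex
	next netip.Addr
}

func (h *hostIPAM) assign() netip.Addr {
	h.mu.Lock() // "Acquired host-wide IPAM lock."
	defer h.mu.Unlock() // "Released host-wide IPAM lock."
	ip := h.next
	h.next = h.next.Next()
	return ip
}

func main() {
	ipam := &hostIPAM{next: netip.MustParseAddr("192.168.88.130")}

	pods := []string{
		"calico-apiserver-5878b694cc-lzmv7",
		"coredns-7c65d6cfc9-8972p",
		"calico-apiserver-5878b694cc-xhqwp",
	}

	var wg sync.WaitGroup
	for _, pod := range pods {
		wg.Add(1)
		go func(pod string) {
			defer wg.Done()
			fmt.Println(pod, "->", ipam.assign())
		}(pod)
	}
	wg.Wait()
}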
Sep 9 05:03:34.536661 containerd[1498]: 2025-09-09 05:03:34.508 [INFO][4413] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" HandleID="k8s-pod-network.22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" Workload="localhost-k8s-coredns--7c65d6cfc9--8972p-eth0" Sep 9 05:03:34.536773 containerd[1498]: 2025-09-09 05:03:34.513 [INFO][4368] cni-plugin/k8s.go 418: Populated endpoint ContainerID="22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8972p" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8972p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--8972p-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"9a77ad90-2494-4662-8595-2d49ea8185e1", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-8972p", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali628e1ad45c0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:34.536838 containerd[1498]: 2025-09-09 05:03:34.513 [INFO][4368] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8972p" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8972p-eth0" Sep 9 05:03:34.536838 containerd[1498]: 2025-09-09 05:03:34.513 [INFO][4368] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali628e1ad45c0 ContainerID="22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8972p" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8972p-eth0" Sep 9 05:03:34.536838 containerd[1498]: 2025-09-09 05:03:34.522 [INFO][4368] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8972p" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8972p-eth0" Sep 9 05:03:34.536919 
containerd[1498]: 2025-09-09 05:03:34.522 [INFO][4368] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8972p" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8972p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--8972p-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"9a77ad90-2494-4662-8595-2d49ea8185e1", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40", Pod:"coredns-7c65d6cfc9-8972p", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali628e1ad45c0", MAC:"42:f1:01:b5:9b:6b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:34.536919 containerd[1498]: 2025-09-09 05:03:34.531 [INFO][4368] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8972p" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8972p-eth0" Sep 9 05:03:34.537898 containerd[1498]: time="2025-09-09T05:03:34.537843852Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:03:34.555731 containerd[1498]: time="2025-09-09T05:03:34.555681512Z" level=info msg="connecting to shim 22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40" address="unix:///run/containerd/s/38924d2a325dfcf2f56786f7d1dbf4ccf6479dc2658aad93cab2dd2d748ecf92" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:03:34.577376 systemd[1]: Started cri-containerd-22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40.scope - libcontainer container 22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40. 
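The WorkloadEndpoint dumped above carries coredns's ports in numorstring form with hexadecimal values: Port:0x35 is 53 (the dns and dns-tcp ports) and Port:0x23c1 is 9153, CoreDNS's default Prometheus metrics port. For reference:

package main

import "fmt"

// The Go struct dump above prints ports in hex; decoded they are the usual
// CoreDNS ports: 53 for dns/dns-tcp and 9153 for the metrics endpoint.
func main() {
	fmt.Println(0x35, 0x23c1) // 53 9153
}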
Sep 9 05:03:34.591011 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:03:34.613708 containerd[1498]: time="2025-09-09T05:03:34.613671745Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8972p,Uid:9a77ad90-2494-4662-8595-2d49ea8185e1,Namespace:kube-system,Attempt:0,} returns sandbox id \"22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40\"" Sep 9 05:03:34.616415 systemd-networkd[1442]: cali1178691fdd6: Link UP Sep 9 05:03:34.618467 systemd-networkd[1442]: cali1178691fdd6: Gained carrier Sep 9 05:03:34.631540 containerd[1498]: time="2025-09-09T05:03:34.631507605Z" level=info msg="CreateContainer within sandbox \"22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.347 [INFO][4380] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0 calico-apiserver-5878b694cc- calico-apiserver 0fadd76c-7e28-42cc-9fa8-a89467a3987a 797 0 2025-09-09 05:03:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5878b694cc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5878b694cc-xhqwp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1178691fdd6 [] [] }} ContainerID="e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-xhqwp" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--xhqwp-" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.348 [INFO][4380] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-xhqwp" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.380 [INFO][4426] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" HandleID="k8s-pod-network.e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" Workload="localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.380 [INFO][4426] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" HandleID="k8s-pod-network.e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" Workload="localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c2fe0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5878b694cc-xhqwp", "timestamp":"2025-09-09 05:03:34.380024886 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.380 [INFO][4426] ipam/ipam_plugin.go 353: 
About to acquire host-wide IPAM lock. Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.508 [INFO][4426] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.510 [INFO][4426] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.584 [INFO][4426] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" host="localhost" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.588 [INFO][4426] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.593 [INFO][4426] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.594 [INFO][4426] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.596 [INFO][4426] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.596 [INFO][4426] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" host="localhost" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.598 [INFO][4426] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681 Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.602 [INFO][4426] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" host="localhost" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.610 [INFO][4426] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" host="localhost" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.611 [INFO][4426] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" host="localhost" Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.611 [INFO][4426] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
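The IPAM sequence just above serializes on a host-wide lock, confirms this node's affinity for the 192.168.88.128/26 block, and claims 192.168.88.132 for calico-apiserver-5878b694cc-xhqwp; the remaining pods later in this section receive .133, .134 and .135 from the same block, after .131 was already handed to coredns-7c65d6cfc9-8972p. A small standard-library sketch, independent of Calico, that checks those addresses against the block:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Block for which this node holds an affinity, per the ipam/ipam.go lines above.
	block := netip.MustParsePrefix("192.168.88.128/26")
	fmt.Printf("%s spans %d addresses (.128-.191)\n", block, 1<<(32-block.Bits()))

	// Pod addresses handed out in this log (coredns-8972p, apiserver-xhqwp,
	// coredns-wgbtl, goldmane, csi-node-driver).
	for _, a := range []string{
		"192.168.88.131", "192.168.88.132", "192.168.88.133",
		"192.168.88.134", "192.168.88.135",
	} {
		addr := netip.MustParseAddr(a)
		fmt.Printf("%s in %s: %v\n", addr, block, block.Contains(addr))
	}
}
```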
Sep 9 05:03:34.634414 containerd[1498]: 2025-09-09 05:03:34.611 [INFO][4426] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" HandleID="k8s-pod-network.e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" Workload="localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0" Sep 9 05:03:34.634895 containerd[1498]: 2025-09-09 05:03:34.613 [INFO][4380] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-xhqwp" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0", GenerateName:"calico-apiserver-5878b694cc-", Namespace:"calico-apiserver", SelfLink:"", UID:"0fadd76c-7e28-42cc-9fa8-a89467a3987a", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5878b694cc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5878b694cc-xhqwp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1178691fdd6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:34.634895 containerd[1498]: 2025-09-09 05:03:34.613 [INFO][4380] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-xhqwp" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0" Sep 9 05:03:34.634895 containerd[1498]: 2025-09-09 05:03:34.613 [INFO][4380] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1178691fdd6 ContainerID="e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-xhqwp" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0" Sep 9 05:03:34.634895 containerd[1498]: 2025-09-09 05:03:34.616 [INFO][4380] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-xhqwp" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0" Sep 9 05:03:34.634895 containerd[1498]: 2025-09-09 05:03:34.617 [INFO][4380] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-xhqwp" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0", GenerateName:"calico-apiserver-5878b694cc-", Namespace:"calico-apiserver", SelfLink:"", UID:"0fadd76c-7e28-42cc-9fa8-a89467a3987a", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5878b694cc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681", Pod:"calico-apiserver-5878b694cc-xhqwp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1178691fdd6", MAC:"2a:bd:48:b2:2b:38", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:34.634895 containerd[1498]: 2025-09-09 05:03:34.631 [INFO][4380] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" Namespace="calico-apiserver" Pod="calico-apiserver-5878b694cc-xhqwp" WorkloadEndpoint="localhost-k8s-calico--apiserver--5878b694cc--xhqwp-eth0" Sep 9 05:03:34.642847 containerd[1498]: time="2025-09-09T05:03:34.642792003Z" level=info msg="Container 14fbb91731cc91dd6ad72116db8fdd2de86c37a9e649db7b9bb823bf344f0256: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:34.650465 containerd[1498]: time="2025-09-09T05:03:34.650426428Z" level=info msg="CreateContainer within sandbox \"22a44c44ea0a1f92014c5bd796fede6eade527cc14e6daa63b311c5e52d55d40\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"14fbb91731cc91dd6ad72116db8fdd2de86c37a9e649db7b9bb823bf344f0256\"" Sep 9 05:03:34.650941 containerd[1498]: time="2025-09-09T05:03:34.650915550Z" level=info msg="StartContainer for \"14fbb91731cc91dd6ad72116db8fdd2de86c37a9e649db7b9bb823bf344f0256\"" Sep 9 05:03:34.651768 containerd[1498]: time="2025-09-09T05:03:34.651737392Z" level=info msg="connecting to shim 14fbb91731cc91dd6ad72116db8fdd2de86c37a9e649db7b9bb823bf344f0256" address="unix:///run/containerd/s/38924d2a325dfcf2f56786f7d1dbf4ccf6479dc2658aad93cab2dd2d748ecf92" protocol=ttrpc version=3 Sep 9 05:03:34.656218 containerd[1498]: time="2025-09-09T05:03:34.656113727Z" level=info msg="connecting to shim e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681" address="unix:///run/containerd/s/088933f691624bdca2d7cf908800f8f56e9f9dc75794f3cca440f112adf03ef4" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:03:34.674338 systemd[1]: Started 
cri-containerd-14fbb91731cc91dd6ad72116db8fdd2de86c37a9e649db7b9bb823bf344f0256.scope - libcontainer container 14fbb91731cc91dd6ad72116db8fdd2de86c37a9e649db7b9bb823bf344f0256. Sep 9 05:03:34.678426 systemd[1]: Started cri-containerd-e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681.scope - libcontainer container e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681. Sep 9 05:03:34.692953 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:03:34.705161 containerd[1498]: time="2025-09-09T05:03:34.705111571Z" level=info msg="StartContainer for \"14fbb91731cc91dd6ad72116db8fdd2de86c37a9e649db7b9bb823bf344f0256\" returns successfully" Sep 9 05:03:34.717881 containerd[1498]: time="2025-09-09T05:03:34.717816853Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5878b694cc-xhqwp,Uid:0fadd76c-7e28-42cc-9fa8-a89467a3987a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681\"" Sep 9 05:03:35.244415 containerd[1498]: time="2025-09-09T05:03:35.244362231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-wgbtl,Uid:47c42195-8a89-46d2-89a4-d9b91ec95439,Namespace:kube-system,Attempt:0,}" Sep 9 05:03:35.245220 containerd[1498]: time="2025-09-09T05:03:35.245180554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-mwh5j,Uid:950a549d-a68c-4846-bd7a-40dfe0e229ea,Namespace:calico-system,Attempt:0,}" Sep 9 05:03:35.366788 systemd-networkd[1442]: cali8f23220c17e: Link UP Sep 9 05:03:35.367516 systemd-networkd[1442]: cali8f23220c17e: Gained carrier Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.288 [INFO][4645] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0 coredns-7c65d6cfc9- kube-system 47c42195-8a89-46d2-89a4-d9b91ec95439 795 0 2025-09-09 05:03:01 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-wgbtl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8f23220c17e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wgbtl" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--wgbtl-" Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.288 [INFO][4645] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wgbtl" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0" Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.323 [INFO][4674] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" HandleID="k8s-pod-network.d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" Workload="localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0" Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.323 [INFO][4674] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" 
HandleID="k8s-pod-network.d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" Workload="localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400018f7b0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-wgbtl", "timestamp":"2025-09-09 05:03:35.323367048 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.324 [INFO][4674] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.324 [INFO][4674] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.324 [INFO][4674] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.334 [INFO][4674] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" host="localhost" Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.339 [INFO][4674] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.342 [INFO][4674] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.344 [INFO][4674] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.346 [INFO][4674] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.346 [INFO][4674] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" host="localhost" Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.347 [INFO][4674] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26 Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.351 [INFO][4674] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" host="localhost" Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.356 [INFO][4674] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" host="localhost" Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.356 [INFO][4674] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" host="localhost" Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.356 [INFO][4674] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:03:35.377876 containerd[1498]: 2025-09-09 05:03:35.356 [INFO][4674] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" HandleID="k8s-pod-network.d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" Workload="localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0" Sep 9 05:03:35.378776 containerd[1498]: 2025-09-09 05:03:35.361 [INFO][4645] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wgbtl" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"47c42195-8a89-46d2-89a4-d9b91ec95439", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-wgbtl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8f23220c17e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:35.378776 containerd[1498]: 2025-09-09 05:03:35.363 [INFO][4645] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wgbtl" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0" Sep 9 05:03:35.378776 containerd[1498]: 2025-09-09 05:03:35.363 [INFO][4645] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8f23220c17e ContainerID="d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wgbtl" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0" Sep 9 05:03:35.378776 containerd[1498]: 2025-09-09 05:03:35.367 [INFO][4645] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wgbtl" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0" Sep 9 05:03:35.378776 
containerd[1498]: 2025-09-09 05:03:35.368 [INFO][4645] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wgbtl" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"47c42195-8a89-46d2-89a4-d9b91ec95439", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26", Pod:"coredns-7c65d6cfc9-wgbtl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8f23220c17e", MAC:"de:c3:33:80:60:b2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:35.378776 containerd[1498]: 2025-09-09 05:03:35.375 [INFO][4645] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-wgbtl" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--wgbtl-eth0" Sep 9 05:03:35.400709 kubelet[2644]: I0909 05:03:35.400526 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-8972p" podStartSLOduration=34.40050938 podStartE2EDuration="34.40050938s" podCreationTimestamp="2025-09-09 05:03:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:03:35.398049092 +0000 UTC m=+40.237265209" watchObservedRunningTime="2025-09-09 05:03:35.40050938 +0000 UTC m=+40.239725497" Sep 9 05:03:35.422207 containerd[1498]: time="2025-09-09T05:03:35.422097770Z" level=info msg="connecting to shim d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26" address="unix:///run/containerd/s/fbbc1ec6dcd61e7352bcd7d3d4f90164f12bfc7233f92ab9b029e5e884c2c939" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:03:35.448661 systemd[1]: Started sshd@7-10.0.0.90:22-10.0.0.1:47636.service - OpenSSH per-connection server daemon (10.0.0.1:47636). 
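The kubelet pod_startup_latency_tracker line above reports podStartSLOduration=34.40050938s for coredns-7c65d6cfc9-8972p with zero-valued pull timestamps; the figure lines up with watchObservedRunningTime minus podCreationTimestamp. A minimal check using the timestamps copied from that line:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the pod_startup_latency_tracker line above.
	// time.Parse accepts the fractional seconds even though the layout omits them.
	const layout = "2006-01-02 15:04:05 -0700 MST"
	created, err := time.Parse(layout, "2025-09-09 05:03:01 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2025-09-09 05:03:35.40050938 +0000 UTC")
	if err != nil {
		panic(err)
	}
	fmt.Println(running.Sub(created)) // 34.40050938s, matching the reported SLO duration
}
```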
Sep 9 05:03:35.477625 systemd[1]: Started cri-containerd-d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26.scope - libcontainer container d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26. Sep 9 05:03:35.498480 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:03:35.501347 systemd-networkd[1442]: cali44ae8616af3: Link UP Sep 9 05:03:35.504301 systemd-networkd[1442]: cali44ae8616af3: Gained carrier Sep 9 05:03:35.535965 containerd[1498]: time="2025-09-09T05:03:35.535929621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-wgbtl,Uid:47c42195-8a89-46d2-89a4-d9b91ec95439,Namespace:kube-system,Attempt:0,} returns sandbox id \"d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26\"" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.300 [INFO][4657] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--7988f88666--mwh5j-eth0 goldmane-7988f88666- calico-system 950a549d-a68c-4846-bd7a-40dfe0e229ea 799 0 2025-09-09 05:03:14 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-7988f88666-mwh5j eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali44ae8616af3 [] [] }} ContainerID="d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" Namespace="calico-system" Pod="goldmane-7988f88666-mwh5j" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mwh5j-" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.304 [INFO][4657] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" Namespace="calico-system" Pod="goldmane-7988f88666-mwh5j" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mwh5j-eth0" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.337 [INFO][4681] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" HandleID="k8s-pod-network.d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" Workload="localhost-k8s-goldmane--7988f88666--mwh5j-eth0" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.337 [INFO][4681] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" HandleID="k8s-pod-network.d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" Workload="localhost-k8s-goldmane--7988f88666--mwh5j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400034bdb0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-7988f88666-mwh5j", "timestamp":"2025-09-09 05:03:35.337387254 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.337 [INFO][4681] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.357 [INFO][4681] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.357 [INFO][4681] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.436 [INFO][4681] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" host="localhost" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.448 [INFO][4681] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.458 [INFO][4681] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.464 [INFO][4681] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.467 [INFO][4681] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.467 [INFO][4681] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" host="localhost" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.472 [INFO][4681] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212 Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.479 [INFO][4681] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" host="localhost" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.491 [INFO][4681] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" host="localhost" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.491 [INFO][4681] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" host="localhost" Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.491 [INFO][4681] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:03:35.544600 containerd[1498]: 2025-09-09 05:03:35.491 [INFO][4681] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" HandleID="k8s-pod-network.d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" Workload="localhost-k8s-goldmane--7988f88666--mwh5j-eth0" Sep 9 05:03:35.545097 containerd[1498]: 2025-09-09 05:03:35.496 [INFO][4657] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" Namespace="calico-system" Pod="goldmane-7988f88666-mwh5j" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mwh5j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--mwh5j-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"950a549d-a68c-4846-bd7a-40dfe0e229ea", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-7988f88666-mwh5j", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali44ae8616af3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:35.545097 containerd[1498]: 2025-09-09 05:03:35.496 [INFO][4657] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" Namespace="calico-system" Pod="goldmane-7988f88666-mwh5j" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mwh5j-eth0" Sep 9 05:03:35.545097 containerd[1498]: 2025-09-09 05:03:35.496 [INFO][4657] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali44ae8616af3 ContainerID="d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" Namespace="calico-system" Pod="goldmane-7988f88666-mwh5j" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mwh5j-eth0" Sep 9 05:03:35.545097 containerd[1498]: 2025-09-09 05:03:35.504 [INFO][4657] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" Namespace="calico-system" Pod="goldmane-7988f88666-mwh5j" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mwh5j-eth0" Sep 9 05:03:35.545097 containerd[1498]: 2025-09-09 05:03:35.505 [INFO][4657] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" Namespace="calico-system" Pod="goldmane-7988f88666-mwh5j" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mwh5j-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--mwh5j-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"950a549d-a68c-4846-bd7a-40dfe0e229ea", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212", Pod:"goldmane-7988f88666-mwh5j", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali44ae8616af3", MAC:"f2:51:bc:84:27:c6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:35.545097 containerd[1498]: 2025-09-09 05:03:35.520 [INFO][4657] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" Namespace="calico-system" Pod="goldmane-7988f88666-mwh5j" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--mwh5j-eth0" Sep 9 05:03:35.581335 containerd[1498]: time="2025-09-09T05:03:35.581185568Z" level=info msg="connecting to shim d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212" address="unix:///run/containerd/s/58c33d87e4718c657b4821acecc3ddd02731737f86536b11af48a227ef5d1918" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:03:35.581875 containerd[1498]: time="2025-09-09T05:03:35.581840291Z" level=info msg="CreateContainer within sandbox \"d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:03:35.584474 systemd-networkd[1442]: vxlan.calico: Gained IPv6LL Sep 9 05:03:35.585917 sshd[4730]: Accepted publickey for core from 10.0.0.1 port 47636 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:03:35.587683 sshd-session[4730]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:03:35.598063 systemd-logind[1481]: New session 8 of user core. Sep 9 05:03:35.599386 containerd[1498]: time="2025-09-09T05:03:35.599247307Z" level=info msg="Container 32ef82975aa3022abb1427c2a659d4e3f73884795836f622ca09281d04ca765d: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:35.603348 systemd[1]: Started session-8.scope - Session 8 of User core. 
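Each sandbox brought up in this log adds a host-side Calico veth that systemd-networkd reports as Link UP, Gained carrier and, later, Gained IPv6LL (cali628e1ad45c0, cali1178691fdd6, cali8f23220c17e, cali44ae8616af3), alongside the vxlan.calico overlay device. A standard-library sketch, assuming it runs on the node itself, that lists those interfaces and their state:

```go
package main

import (
	"fmt"
	"net"
	"strings"
)

func main() {
	// Lists the host-side Calico interfaces whose Link UP / Gained carrier /
	// Gained IPv6LL transitions appear in the systemd-networkd lines above.
	// Names starting with "cali" are the per-pod veths; "vxlan.calico" is the overlay.
	ifaces, err := net.Interfaces()
	if err != nil {
		panic(err)
	}
	for _, ifc := range ifaces {
		if strings.HasPrefix(ifc.Name, "cali") || ifc.Name == "vxlan.calico" {
			fmt.Printf("%-16s up=%-5v mtu=%d\n", ifc.Name, ifc.Flags&net.FlagUp != 0, ifc.MTU)
		}
	}
}
```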
Sep 9 05:03:35.609496 containerd[1498]: time="2025-09-09T05:03:35.609356180Z" level=info msg="CreateContainer within sandbox \"d1e1d774f6607f862a4beabd22ecf98e74c5b44b33e40e71f52d2b04e2f6df26\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"32ef82975aa3022abb1427c2a659d4e3f73884795836f622ca09281d04ca765d\"" Sep 9 05:03:35.610830 containerd[1498]: time="2025-09-09T05:03:35.610745385Z" level=info msg="StartContainer for \"32ef82975aa3022abb1427c2a659d4e3f73884795836f622ca09281d04ca765d\"" Sep 9 05:03:35.616338 containerd[1498]: time="2025-09-09T05:03:35.616307443Z" level=info msg="connecting to shim 32ef82975aa3022abb1427c2a659d4e3f73884795836f622ca09281d04ca765d" address="unix:///run/containerd/s/fbbc1ec6dcd61e7352bcd7d3d4f90164f12bfc7233f92ab9b029e5e884c2c939" protocol=ttrpc version=3 Sep 9 05:03:35.647607 systemd[1]: Started cri-containerd-32ef82975aa3022abb1427c2a659d4e3f73884795836f622ca09281d04ca765d.scope - libcontainer container 32ef82975aa3022abb1427c2a659d4e3f73884795836f622ca09281d04ca765d. Sep 9 05:03:35.650960 systemd[1]: Started cri-containerd-d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212.scope - libcontainer container d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212. Sep 9 05:03:35.665412 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:03:35.695231 containerd[1498]: time="2025-09-09T05:03:35.695129820Z" level=info msg="StartContainer for \"32ef82975aa3022abb1427c2a659d4e3f73884795836f622ca09281d04ca765d\" returns successfully" Sep 9 05:03:35.702772 containerd[1498]: time="2025-09-09T05:03:35.702737044Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-mwh5j,Uid:950a549d-a68c-4846-bd7a-40dfe0e229ea,Namespace:calico-system,Attempt:0,} returns sandbox id \"d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212\"" Sep 9 05:03:35.777318 systemd-networkd[1442]: cali1178691fdd6: Gained IPv6LL Sep 9 05:03:35.885898 sshd[4800]: Connection closed by 10.0.0.1 port 47636 Sep 9 05:03:35.886807 sshd-session[4730]: pam_unix(sshd:session): session closed for user core Sep 9 05:03:35.891015 systemd[1]: sshd@7-10.0.0.90:22-10.0.0.1:47636.service: Deactivated successfully. Sep 9 05:03:35.892769 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 05:03:35.893495 systemd-logind[1481]: Session 8 logged out. Waiting for processes to exit. Sep 9 05:03:35.894863 systemd-logind[1481]: Removed session 8. 
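The "connecting to shim ... protocol=ttrpc version=3" lines above are the CRI plugin attaching to per-container shim sockets under /run/containerd, with everything registered in the "k8s.io" namespace. A hedged sketch of inspecting one of the container IDs from this log with the containerd Go client; it assumes the stock socket path and the containerd 1.x client module path (the 2.x client moved to github.com/containerd/containerd/v2/client), and it has to run on the node itself:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Default containerd socket on a stock Flatcar node (assumption).
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// The CRI plugin uses the "k8s.io" namespace, as the
	// "connecting to shim ... namespace=k8s.io" lines show.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Container ID taken from the StartContainer lines above (the coredns container).
	c, err := client.LoadContainer(ctx, "32ef82975aa3022abb1427c2a659d4e3f73884795836f622ca09281d04ca765d")
	if err != nil {
		log.Fatal(err)
	}
	info, err := c.Info(ctx)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(info.Image, info.Runtime.Name)
}
```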
Sep 9 05:03:35.905392 systemd-networkd[1442]: cali628e1ad45c0: Gained IPv6LL Sep 9 05:03:36.209943 containerd[1498]: time="2025-09-09T05:03:36.209826801Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:36.210715 containerd[1498]: time="2025-09-09T05:03:36.210682844Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 9 05:03:36.211710 containerd[1498]: time="2025-09-09T05:03:36.211658007Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:36.214236 containerd[1498]: time="2025-09-09T05:03:36.214170815Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:36.214781 containerd[1498]: time="2025-09-09T05:03:36.214756817Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 1.676865764s" Sep 9 05:03:36.214851 containerd[1498]: time="2025-09-09T05:03:36.214786257Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 9 05:03:36.215504 containerd[1498]: time="2025-09-09T05:03:36.215477899Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:03:36.217317 containerd[1498]: time="2025-09-09T05:03:36.217287065Z" level=info msg="CreateContainer within sandbox \"698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:03:36.223524 containerd[1498]: time="2025-09-09T05:03:36.223185484Z" level=info msg="Container ed5d07df624d75fb8c86e2a64f386c849889370b1421f72cdf78ca15411fa7ee: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:36.224297 systemd-networkd[1442]: cali55d7f3450d0: Gained IPv6LL Sep 9 05:03:36.229896 containerd[1498]: time="2025-09-09T05:03:36.229831025Z" level=info msg="CreateContainer within sandbox \"698aa6eb9de11815e33dd007689ccff95d51e256b100b42422ee28ff7fddb0b6\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ed5d07df624d75fb8c86e2a64f386c849889370b1421f72cdf78ca15411fa7ee\"" Sep 9 05:03:36.230518 containerd[1498]: time="2025-09-09T05:03:36.230472707Z" level=info msg="StartContainer for \"ed5d07df624d75fb8c86e2a64f386c849889370b1421f72cdf78ca15411fa7ee\"" Sep 9 05:03:36.231527 containerd[1498]: time="2025-09-09T05:03:36.231504870Z" level=info msg="connecting to shim ed5d07df624d75fb8c86e2a64f386c849889370b1421f72cdf78ca15411fa7ee" address="unix:///run/containerd/s/17fbe378be2182709102324d098acd674703f0c05279ab02becb4ebf8d7ba20a" protocol=ttrpc version=3 Sep 9 05:03:36.250474 systemd[1]: Started cri-containerd-ed5d07df624d75fb8c86e2a64f386c849889370b1421f72cdf78ca15411fa7ee.scope - libcontainer container ed5d07df624d75fb8c86e2a64f386c849889370b1421f72cdf78ca15411fa7ee. 
Sep 9 05:03:36.286594 containerd[1498]: time="2025-09-09T05:03:36.286551846Z" level=info msg="StartContainer for \"ed5d07df624d75fb8c86e2a64f386c849889370b1421f72cdf78ca15411fa7ee\" returns successfully" Sep 9 05:03:36.402133 kubelet[2644]: I0909 05:03:36.402065 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5878b694cc-lzmv7" podStartSLOduration=23.723714444 podStartE2EDuration="25.402040653s" podCreationTimestamp="2025-09-09 05:03:11 +0000 UTC" firstStartedPulling="2025-09-09 05:03:34.53700881 +0000 UTC m=+39.376224927" lastFinishedPulling="2025-09-09 05:03:36.215335059 +0000 UTC m=+41.054551136" observedRunningTime="2025-09-09 05:03:36.40105161 +0000 UTC m=+41.240267727" watchObservedRunningTime="2025-09-09 05:03:36.402040653 +0000 UTC m=+41.241256730" Sep 9 05:03:36.417113 kubelet[2644]: I0909 05:03:36.416690 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-wgbtl" podStartSLOduration=35.4166709 podStartE2EDuration="35.4166709s" podCreationTimestamp="2025-09-09 05:03:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:03:36.415506216 +0000 UTC m=+41.254722293" watchObservedRunningTime="2025-09-09 05:03:36.4166709 +0000 UTC m=+41.255887017" Sep 9 05:03:36.504291 containerd[1498]: time="2025-09-09T05:03:36.504243219Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:36.504899 containerd[1498]: time="2025-09-09T05:03:36.504707700Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 05:03:36.506642 containerd[1498]: time="2025-09-09T05:03:36.506591466Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 291.086007ms" Sep 9 05:03:36.506642 containerd[1498]: time="2025-09-09T05:03:36.506628106Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 9 05:03:36.507957 containerd[1498]: time="2025-09-09T05:03:36.507918431Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 05:03:36.510465 containerd[1498]: time="2025-09-09T05:03:36.509935597Z" level=info msg="CreateContainer within sandbox \"e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:03:36.516729 containerd[1498]: time="2025-09-09T05:03:36.516701819Z" level=info msg="Container 5845204d6e7431053a3a51e9e1841fc18c4a0169f670ef0ee74e8ee5265d8af6: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:36.525836 containerd[1498]: time="2025-09-09T05:03:36.525711887Z" level=info msg="CreateContainer within sandbox \"e4b9868d28349f61ece18c41213f092086bd40376c61a6c1474992d32047a681\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5845204d6e7431053a3a51e9e1841fc18c4a0169f670ef0ee74e8ee5265d8af6\"" Sep 9 05:03:36.526508 containerd[1498]: time="2025-09-09T05:03:36.526445770Z" 
level=info msg="StartContainer for \"5845204d6e7431053a3a51e9e1841fc18c4a0169f670ef0ee74e8ee5265d8af6\"" Sep 9 05:03:36.528972 containerd[1498]: time="2025-09-09T05:03:36.528940978Z" level=info msg="connecting to shim 5845204d6e7431053a3a51e9e1841fc18c4a0169f670ef0ee74e8ee5265d8af6" address="unix:///run/containerd/s/088933f691624bdca2d7cf908800f8f56e9f9dc75794f3cca440f112adf03ef4" protocol=ttrpc version=3 Sep 9 05:03:36.552346 systemd[1]: Started cri-containerd-5845204d6e7431053a3a51e9e1841fc18c4a0169f670ef0ee74e8ee5265d8af6.scope - libcontainer container 5845204d6e7431053a3a51e9e1841fc18c4a0169f670ef0ee74e8ee5265d8af6. Sep 9 05:03:36.583988 containerd[1498]: time="2025-09-09T05:03:36.583948033Z" level=info msg="StartContainer for \"5845204d6e7431053a3a51e9e1841fc18c4a0169f670ef0ee74e8ee5265d8af6\" returns successfully" Sep 9 05:03:37.312421 systemd-networkd[1442]: cali44ae8616af3: Gained IPv6LL Sep 9 05:03:37.378230 systemd-networkd[1442]: cali8f23220c17e: Gained IPv6LL Sep 9 05:03:37.400611 kubelet[2644]: I0909 05:03:37.400178 2644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:03:37.430077 kubelet[2644]: I0909 05:03:37.429978 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5878b694cc-xhqwp" podStartSLOduration=24.641511205 podStartE2EDuration="26.429957817s" podCreationTimestamp="2025-09-09 05:03:11 +0000 UTC" firstStartedPulling="2025-09-09 05:03:34.718976737 +0000 UTC m=+39.558192854" lastFinishedPulling="2025-09-09 05:03:36.507423349 +0000 UTC m=+41.346639466" observedRunningTime="2025-09-09 05:03:37.427941531 +0000 UTC m=+42.267157648" watchObservedRunningTime="2025-09-09 05:03:37.429957817 +0000 UTC m=+42.269173974" Sep 9 05:03:38.005167 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3758689840.mount: Deactivated successfully. 
Sep 9 05:03:38.245580 containerd[1498]: time="2025-09-09T05:03:38.245053421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7d46bd58b4-x87dl,Uid:0eb5ed0c-378a-4cfa-9362-7644979ac62c,Namespace:calico-system,Attempt:0,}" Sep 9 05:03:38.246654 containerd[1498]: time="2025-09-09T05:03:38.245855583Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-z6qqc,Uid:a7ba1d3b-babe-4d10-8083-57470bbd8f30,Namespace:calico-system,Attempt:0,}" Sep 9 05:03:38.399266 kubelet[2644]: I0909 05:03:38.398755 2644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:03:38.449969 systemd-networkd[1442]: calidf590878942: Link UP Sep 9 05:03:38.451926 systemd-networkd[1442]: calidf590878942: Gained carrier Sep 9 05:03:38.524980 containerd[1498]: time="2025-09-09T05:03:38.524756154Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:38.525510 containerd[1498]: time="2025-09-09T05:03:38.525413516Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 9 05:03:38.527134 containerd[1498]: time="2025-09-09T05:03:38.527088521Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.352 [INFO][4967] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--z6qqc-eth0 csi-node-driver- calico-system a7ba1d3b-babe-4d10-8083-57470bbd8f30 697 0 2025-09-09 05:03:15 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-z6qqc eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calidf590878942 [] [] }} ContainerID="ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" Namespace="calico-system" Pod="csi-node-driver-z6qqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--z6qqc-" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.353 [INFO][4967] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" Namespace="calico-system" Pod="csi-node-driver-z6qqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--z6qqc-eth0" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.389 [INFO][4997] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" HandleID="k8s-pod-network.ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" Workload="localhost-k8s-csi--node--driver--z6qqc-eth0" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.389 [INFO][4997] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" HandleID="k8s-pod-network.ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" Workload="localhost-k8s-csi--node--driver--z6qqc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400012f740), 
Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-z6qqc", "timestamp":"2025-09-09 05:03:38.38903102 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.389 [INFO][4997] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.389 [INFO][4997] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.389 [INFO][4997] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.404 [INFO][4997] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" host="localhost" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.419 [INFO][4997] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.425 [INFO][4997] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.428 [INFO][4997] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.431 [INFO][4997] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.431 [INFO][4997] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" host="localhost" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.432 [INFO][4997] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99 Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.437 [INFO][4997] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" host="localhost" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.444 [INFO][4997] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" host="localhost" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.444 [INFO][4997] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" host="localhost" Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.444 [INFO][4997] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:03:38.528013 containerd[1498]: 2025-09-09 05:03:38.444 [INFO][4997] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" HandleID="k8s-pod-network.ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" Workload="localhost-k8s-csi--node--driver--z6qqc-eth0" Sep 9 05:03:38.529258 containerd[1498]: 2025-09-09 05:03:38.447 [INFO][4967] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" Namespace="calico-system" Pod="csi-node-driver-z6qqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--z6qqc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--z6qqc-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a7ba1d3b-babe-4d10-8083-57470bbd8f30", ResourceVersion:"697", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-z6qqc", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidf590878942", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:38.529258 containerd[1498]: 2025-09-09 05:03:38.447 [INFO][4967] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" Namespace="calico-system" Pod="csi-node-driver-z6qqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--z6qqc-eth0" Sep 9 05:03:38.529258 containerd[1498]: 2025-09-09 05:03:38.447 [INFO][4967] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidf590878942 ContainerID="ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" Namespace="calico-system" Pod="csi-node-driver-z6qqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--z6qqc-eth0" Sep 9 05:03:38.529258 containerd[1498]: 2025-09-09 05:03:38.450 [INFO][4967] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" Namespace="calico-system" Pod="csi-node-driver-z6qqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--z6qqc-eth0" Sep 9 05:03:38.529258 containerd[1498]: 2025-09-09 05:03:38.502 [INFO][4967] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" Namespace="calico-system" Pod="csi-node-driver-z6qqc" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--z6qqc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--z6qqc-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a7ba1d3b-babe-4d10-8083-57470bbd8f30", ResourceVersion:"697", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99", Pod:"csi-node-driver-z6qqc", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidf590878942", MAC:"a2:75:5b:b0:b3:38", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:38.529258 containerd[1498]: 2025-09-09 05:03:38.520 [INFO][4967] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" Namespace="calico-system" Pod="csi-node-driver-z6qqc" WorkloadEndpoint="localhost-k8s-csi--node--driver--z6qqc-eth0" Sep 9 05:03:38.532108 containerd[1498]: time="2025-09-09T05:03:38.532076536Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:38.535989 containerd[1498]: time="2025-09-09T05:03:38.535949988Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 2.027978597s" Sep 9 05:03:38.535989 containerd[1498]: time="2025-09-09T05:03:38.535987148Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 9 05:03:38.541792 containerd[1498]: time="2025-09-09T05:03:38.541753406Z" level=info msg="CreateContainer within sandbox \"d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 05:03:38.561169 systemd-networkd[1442]: cali589556ae30a: Link UP Sep 9 05:03:38.563372 systemd-networkd[1442]: cali589556ae30a: Gained carrier Sep 9 05:03:38.570360 containerd[1498]: time="2025-09-09T05:03:38.569984572Z" level=info msg="Container 5b1c44003db10bbeb9876c77cce95ad04138cda7e25eb92fd1111fb0c24efb4d: CDI devices from CRI 
Config.CDIDevices: []" Sep 9 05:03:38.579635 containerd[1498]: time="2025-09-09T05:03:38.579586641Z" level=info msg="connecting to shim ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99" address="unix:///run/containerd/s/1f5f59434c918a754c90575bb99b6b2c830a52317e7df247126ffd260c3df6a7" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.352 [INFO][4973] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0 calico-kube-controllers-7d46bd58b4- calico-system 0eb5ed0c-378a-4cfa-9362-7644979ac62c 790 0 2025-09-09 05:03:15 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7d46bd58b4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7d46bd58b4-x87dl eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali589556ae30a [] [] }} ContainerID="cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" Namespace="calico-system" Pod="calico-kube-controllers-7d46bd58b4-x87dl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-" Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.352 [INFO][4973] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" Namespace="calico-system" Pod="calico-kube-controllers-7d46bd58b4-x87dl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0" Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.409 [INFO][4999] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" HandleID="k8s-pod-network.cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" Workload="localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0" Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.409 [INFO][4999] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" HandleID="k8s-pod-network.cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" Workload="localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb820), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7d46bd58b4-x87dl", "timestamp":"2025-09-09 05:03:38.409627723 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.409 [INFO][4999] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.444 [INFO][4999] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
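[Editor's note] The WorkloadEndpoint names logged above ("localhost-k8s-csi--node--driver--z6qqc-eth0", "localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0") follow a visible pattern: node name, the literal "k8s", the pod name with each "-" doubled, then the interface name. The sketch below is only an inference from the two names seen in this log, not Calico's implementation:

    package main

    import (
        "fmt"
        "strings"
    )

    // wepName reproduces the naming pattern visible in the log entries above:
    // <node>-k8s-<pod name with "-" doubled>-<interface>.
    func wepName(node, pod, iface string) string {
        return node + "-k8s-" + strings.ReplaceAll(pod, "-", "--") + "-" + iface
    }

    func main() {
        fmt.Println(wepName("localhost", "csi-node-driver-z6qqc", "eth0"))
        fmt.Println(wepName("localhost", "calico-kube-controllers-7d46bd58b4-x87dl", "eth0"))
        // localhost-k8s-csi--node--driver--z6qqc-eth0
        // localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0
    }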
Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.445 [INFO][4999] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.505 [INFO][4999] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" host="localhost" Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.521 [INFO][4999] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.529 [INFO][4999] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.531 [INFO][4999] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.537 [INFO][4999] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.537 [INFO][4999] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" host="localhost" Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.538 [INFO][4999] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.543 [INFO][4999] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" host="localhost" Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.550 [INFO][4999] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" host="localhost" Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.550 [INFO][4999] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" host="localhost" Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.550 [INFO][4999] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:03:38.580373 containerd[1498]: 2025-09-09 05:03:38.550 [INFO][4999] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" HandleID="k8s-pod-network.cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" Workload="localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0" Sep 9 05:03:38.581772 containerd[1498]: 2025-09-09 05:03:38.556 [INFO][4973] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" Namespace="calico-system" Pod="calico-kube-controllers-7d46bd58b4-x87dl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0", GenerateName:"calico-kube-controllers-7d46bd58b4-", Namespace:"calico-system", SelfLink:"", UID:"0eb5ed0c-378a-4cfa-9362-7644979ac62c", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7d46bd58b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7d46bd58b4-x87dl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali589556ae30a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:38.581772 containerd[1498]: 2025-09-09 05:03:38.557 [INFO][4973] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" Namespace="calico-system" Pod="calico-kube-controllers-7d46bd58b4-x87dl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0" Sep 9 05:03:38.581772 containerd[1498]: 2025-09-09 05:03:38.557 [INFO][4973] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali589556ae30a ContainerID="cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" Namespace="calico-system" Pod="calico-kube-controllers-7d46bd58b4-x87dl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0" Sep 9 05:03:38.581772 containerd[1498]: 2025-09-09 05:03:38.564 [INFO][4973] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" Namespace="calico-system" Pod="calico-kube-controllers-7d46bd58b4-x87dl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0" Sep 9 05:03:38.581772 containerd[1498]: 2025-09-09 05:03:38.564 [INFO][4973] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" Namespace="calico-system" Pod="calico-kube-controllers-7d46bd58b4-x87dl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0", GenerateName:"calico-kube-controllers-7d46bd58b4-", Namespace:"calico-system", SelfLink:"", UID:"0eb5ed0c-378a-4cfa-9362-7644979ac62c", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 3, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7d46bd58b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e", Pod:"calico-kube-controllers-7d46bd58b4-x87dl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali589556ae30a", MAC:"d6:bb:4a:5d:e2:e9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:03:38.581772 containerd[1498]: 2025-09-09 05:03:38.577 [INFO][4973] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" Namespace="calico-system" Pod="calico-kube-controllers-7d46bd58b4-x87dl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7d46bd58b4--x87dl-eth0" Sep 9 05:03:38.597951 containerd[1498]: time="2025-09-09T05:03:38.597911977Z" level=info msg="CreateContainer within sandbox \"d369dc7d6a8d44a242ba1d95f051daa8cc6df0c6b74635e4004b00539280e212\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"5b1c44003db10bbeb9876c77cce95ad04138cda7e25eb92fd1111fb0c24efb4d\"" Sep 9 05:03:38.599883 containerd[1498]: time="2025-09-09T05:03:38.599544102Z" level=info msg="StartContainer for \"5b1c44003db10bbeb9876c77cce95ad04138cda7e25eb92fd1111fb0c24efb4d\"" Sep 9 05:03:38.611468 containerd[1498]: time="2025-09-09T05:03:38.607427326Z" level=info msg="connecting to shim 5b1c44003db10bbeb9876c77cce95ad04138cda7e25eb92fd1111fb0c24efb4d" address="unix:///run/containerd/s/58c33d87e4718c657b4821acecc3ddd02731737f86536b11af48a227ef5d1918" protocol=ttrpc version=3 Sep 9 05:03:38.624411 containerd[1498]: time="2025-09-09T05:03:38.624369098Z" level=info msg="connecting to shim cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e" address="unix:///run/containerd/s/a471dd9f3b4b968f850301c1345b1aed585288ad4075d5a9f9e6678039a18c11" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:03:38.649873 systemd[1]: Started cri-containerd-5b1c44003db10bbeb9876c77cce95ad04138cda7e25eb92fd1111fb0c24efb4d.scope - libcontainer container 
5b1c44003db10bbeb9876c77cce95ad04138cda7e25eb92fd1111fb0c24efb4d. Sep 9 05:03:38.651111 systemd[1]: Started cri-containerd-ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99.scope - libcontainer container ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99. Sep 9 05:03:38.657764 systemd[1]: Started cri-containerd-cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e.scope - libcontainer container cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e. Sep 9 05:03:38.679120 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:03:38.679153 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 05:03:38.712131 containerd[1498]: time="2025-09-09T05:03:38.712072165Z" level=info msg="StartContainer for \"5b1c44003db10bbeb9876c77cce95ad04138cda7e25eb92fd1111fb0c24efb4d\" returns successfully" Sep 9 05:03:38.727834 containerd[1498]: time="2025-09-09T05:03:38.727567413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-z6qqc,Uid:a7ba1d3b-babe-4d10-8083-57470bbd8f30,Namespace:calico-system,Attempt:0,} returns sandbox id \"ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99\"" Sep 9 05:03:38.734483 containerd[1498]: time="2025-09-09T05:03:38.734446074Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 05:03:38.750459 containerd[1498]: time="2025-09-09T05:03:38.750420762Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7d46bd58b4-x87dl,Uid:0eb5ed0c-378a-4cfa-9362-7644979ac62c,Namespace:calico-system,Attempt:0,} returns sandbox id \"cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e\"" Sep 9 05:03:39.418504 kubelet[2644]: I0909 05:03:39.418435 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-mwh5j" podStartSLOduration=22.586320075 podStartE2EDuration="25.418419175s" podCreationTimestamp="2025-09-09 05:03:14 +0000 UTC" firstStartedPulling="2025-09-09 05:03:35.704755811 +0000 UTC m=+40.543971928" lastFinishedPulling="2025-09-09 05:03:38.536854911 +0000 UTC m=+43.376071028" observedRunningTime="2025-09-09 05:03:39.418045494 +0000 UTC m=+44.257261611" watchObservedRunningTime="2025-09-09 05:03:39.418419175 +0000 UTC m=+44.257635252" Sep 9 05:03:39.680372 systemd-networkd[1442]: cali589556ae30a: Gained IPv6LL Sep 9 05:03:39.718849 containerd[1498]: time="2025-09-09T05:03:39.718793673Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:39.720318 containerd[1498]: time="2025-09-09T05:03:39.720286717Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 9 05:03:39.721377 containerd[1498]: time="2025-09-09T05:03:39.721346721Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:39.723551 containerd[1498]: time="2025-09-09T05:03:39.723511567Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:39.724516 containerd[1498]: time="2025-09-09T05:03:39.724489130Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 989.995336ms" Sep 9 05:03:39.724566 containerd[1498]: time="2025-09-09T05:03:39.724522170Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 9 05:03:39.725910 containerd[1498]: time="2025-09-09T05:03:39.725879014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 05:03:39.726655 containerd[1498]: time="2025-09-09T05:03:39.726622816Z" level=info msg="CreateContainer within sandbox \"ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 05:03:39.747658 containerd[1498]: time="2025-09-09T05:03:39.747613279Z" level=info msg="Container 4b93487dc1a098acb83480c87d33d06e27851491453d9b4e6ee2744acf1b8e80: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:39.763427 containerd[1498]: time="2025-09-09T05:03:39.763372726Z" level=info msg="CreateContainer within sandbox \"ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"4b93487dc1a098acb83480c87d33d06e27851491453d9b4e6ee2744acf1b8e80\"" Sep 9 05:03:39.764168 containerd[1498]: time="2025-09-09T05:03:39.764133928Z" level=info msg="StartContainer for \"4b93487dc1a098acb83480c87d33d06e27851491453d9b4e6ee2744acf1b8e80\"" Sep 9 05:03:39.765601 containerd[1498]: time="2025-09-09T05:03:39.765571493Z" level=info msg="connecting to shim 4b93487dc1a098acb83480c87d33d06e27851491453d9b4e6ee2744acf1b8e80" address="unix:///run/containerd/s/1f5f59434c918a754c90575bb99b6b2c830a52317e7df247126ffd260c3df6a7" protocol=ttrpc version=3 Sep 9 05:03:39.787354 systemd[1]: Started cri-containerd-4b93487dc1a098acb83480c87d33d06e27851491453d9b4e6ee2744acf1b8e80.scope - libcontainer container 4b93487dc1a098acb83480c87d33d06e27851491453d9b4e6ee2744acf1b8e80. Sep 9 05:03:39.834869 containerd[1498]: time="2025-09-09T05:03:39.834820900Z" level=info msg="StartContainer for \"4b93487dc1a098acb83480c87d33d06e27851491453d9b4e6ee2744acf1b8e80\" returns successfully" Sep 9 05:03:39.894132 containerd[1498]: time="2025-09-09T05:03:39.894083157Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5b1c44003db10bbeb9876c77cce95ad04138cda7e25eb92fd1111fb0c24efb4d\" id:\"1b0415e16939e7cd7a321919f56040f445f9efa5d1eff125a6c3558fb9f7c3a8\" pid:5212 exited_at:{seconds:1757394219 nanos:893593156}" Sep 9 05:03:40.384423 systemd-networkd[1442]: calidf590878942: Gained IPv6LL Sep 9 05:03:40.471789 containerd[1498]: time="2025-09-09T05:03:40.471596817Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5b1c44003db10bbeb9876c77cce95ad04138cda7e25eb92fd1111fb0c24efb4d\" id:\"fc635503ba24ca6217de8267f8d0c324f887110786c6ec597956d7d63234e502\" pid:5242 exit_status:1 exited_at:{seconds:1757394220 nanos:471316696}" Sep 9 05:03:40.905846 systemd[1]: Started sshd@8-10.0.0.90:22-10.0.0.1:55374.service - OpenSSH per-connection server daemon (10.0.0.1:55374). 
Sep 9 05:03:40.983586 sshd[5258]: Accepted publickey for core from 10.0.0.1 port 55374 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:03:40.985187 sshd-session[5258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:03:40.989792 systemd-logind[1481]: New session 9 of user core. Sep 9 05:03:40.996487 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 9 05:03:41.212686 sshd[5261]: Connection closed by 10.0.0.1 port 55374 Sep 9 05:03:41.213159 sshd-session[5258]: pam_unix(sshd:session): session closed for user core Sep 9 05:03:41.216933 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 05:03:41.217002 systemd-logind[1481]: Session 9 logged out. Waiting for processes to exit. Sep 9 05:03:41.218638 systemd[1]: sshd@8-10.0.0.90:22-10.0.0.1:55374.service: Deactivated successfully. Sep 9 05:03:41.223895 systemd-logind[1481]: Removed session 9. Sep 9 05:03:41.466483 containerd[1498]: time="2025-09-09T05:03:41.466099749Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:41.468345 containerd[1498]: time="2025-09-09T05:03:41.468292395Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 9 05:03:41.470453 containerd[1498]: time="2025-09-09T05:03:41.470408641Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:41.473691 containerd[1498]: time="2025-09-09T05:03:41.473646011Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:41.474921 containerd[1498]: time="2025-09-09T05:03:41.474861414Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 1.74894712s" Sep 9 05:03:41.475064 containerd[1498]: time="2025-09-09T05:03:41.475005895Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 9 05:03:41.478718 containerd[1498]: time="2025-09-09T05:03:41.477438702Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 05:03:41.480692 containerd[1498]: time="2025-09-09T05:03:41.480639351Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5b1c44003db10bbeb9876c77cce95ad04138cda7e25eb92fd1111fb0c24efb4d\" id:\"543a61ea377bde8370ff8a89fe129e3e3c2a2866d52617816e03f4c723814cdd\" pid:5287 exit_status:1 exited_at:{seconds:1757394221 nanos:479422347}" Sep 9 05:03:41.492418 containerd[1498]: time="2025-09-09T05:03:41.491023381Z" level=info msg="CreateContainer within sandbox \"cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 05:03:41.498686 containerd[1498]: time="2025-09-09T05:03:41.498650443Z" level=info msg="Container 
9a5b86330afe5009b5c05c3d7d810173a2f153d2d55e385ec14f9743b23db60e: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:41.505287 containerd[1498]: time="2025-09-09T05:03:41.505247462Z" level=info msg="CreateContainer within sandbox \"cbfd9efde195c1252773603b17c7be6976768ba4a0efcd8c48213acaa89ddb9e\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9a5b86330afe5009b5c05c3d7d810173a2f153d2d55e385ec14f9743b23db60e\"" Sep 9 05:03:41.506717 containerd[1498]: time="2025-09-09T05:03:41.506660306Z" level=info msg="StartContainer for \"9a5b86330afe5009b5c05c3d7d810173a2f153d2d55e385ec14f9743b23db60e\"" Sep 9 05:03:41.508123 containerd[1498]: time="2025-09-09T05:03:41.508082470Z" level=info msg="connecting to shim 9a5b86330afe5009b5c05c3d7d810173a2f153d2d55e385ec14f9743b23db60e" address="unix:///run/containerd/s/a471dd9f3b4b968f850301c1345b1aed585288ad4075d5a9f9e6678039a18c11" protocol=ttrpc version=3 Sep 9 05:03:41.535419 systemd[1]: Started cri-containerd-9a5b86330afe5009b5c05c3d7d810173a2f153d2d55e385ec14f9743b23db60e.scope - libcontainer container 9a5b86330afe5009b5c05c3d7d810173a2f153d2d55e385ec14f9743b23db60e. Sep 9 05:03:41.575750 containerd[1498]: time="2025-09-09T05:03:41.575709184Z" level=info msg="StartContainer for \"9a5b86330afe5009b5c05c3d7d810173a2f153d2d55e385ec14f9743b23db60e\" returns successfully" Sep 9 05:03:42.448695 kubelet[2644]: I0909 05:03:42.448590 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7d46bd58b4-x87dl" podStartSLOduration=24.723318023 podStartE2EDuration="27.448572876s" podCreationTimestamp="2025-09-09 05:03:15 +0000 UTC" firstStartedPulling="2025-09-09 05:03:38.751462846 +0000 UTC m=+43.590678963" lastFinishedPulling="2025-09-09 05:03:41.476717699 +0000 UTC m=+46.315933816" observedRunningTime="2025-09-09 05:03:42.447112471 +0000 UTC m=+47.286328708" watchObservedRunningTime="2025-09-09 05:03:42.448572876 +0000 UTC m=+47.287788993" Sep 9 05:03:42.461023 containerd[1498]: time="2025-09-09T05:03:42.460974391Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9a5b86330afe5009b5c05c3d7d810173a2f153d2d55e385ec14f9743b23db60e\" id:\"9c76893ac87651aaf98e36f693d31574606c4910610d2f331a20e3aac8072a3d\" pid:5361 exited_at:{seconds:1757394222 nanos:459471946}" Sep 9 05:03:42.613375 containerd[1498]: time="2025-09-09T05:03:42.612823780Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:42.614592 containerd[1498]: time="2025-09-09T05:03:42.614472785Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 9 05:03:42.616151 containerd[1498]: time="2025-09-09T05:03:42.615876109Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:42.618538 containerd[1498]: time="2025-09-09T05:03:42.618499836Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:03:42.619603 containerd[1498]: time="2025-09-09T05:03:42.619571279Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id 
\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.141582576s" Sep 9 05:03:42.619977 containerd[1498]: time="2025-09-09T05:03:42.619693680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 9 05:03:42.621429 kubelet[2644]: I0909 05:03:42.621394 2644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:03:42.623093 containerd[1498]: time="2025-09-09T05:03:42.622667648Z" level=info msg="CreateContainer within sandbox \"ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 05:03:42.630746 containerd[1498]: time="2025-09-09T05:03:42.630708351Z" level=info msg="Container 582c499bcd7bd8df58e49337766c8db724b755c5aa3a65f1f12dece1e9c4a140: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:03:42.650446 containerd[1498]: time="2025-09-09T05:03:42.650400247Z" level=info msg="CreateContainer within sandbox \"ccc48281f0d25a6fc19c55284b1bdb7d06288730168121825d2c8cf02f1a1a99\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"582c499bcd7bd8df58e49337766c8db724b755c5aa3a65f1f12dece1e9c4a140\"" Sep 9 05:03:42.651414 containerd[1498]: time="2025-09-09T05:03:42.651324529Z" level=info msg="StartContainer for \"582c499bcd7bd8df58e49337766c8db724b755c5aa3a65f1f12dece1e9c4a140\"" Sep 9 05:03:42.652676 containerd[1498]: time="2025-09-09T05:03:42.652646773Z" level=info msg="connecting to shim 582c499bcd7bd8df58e49337766c8db724b755c5aa3a65f1f12dece1e9c4a140" address="unix:///run/containerd/s/1f5f59434c918a754c90575bb99b6b2c830a52317e7df247126ffd260c3df6a7" protocol=ttrpc version=3 Sep 9 05:03:42.709332 systemd[1]: Started cri-containerd-582c499bcd7bd8df58e49337766c8db724b755c5aa3a65f1f12dece1e9c4a140.scope - libcontainer container 582c499bcd7bd8df58e49337766c8db724b755c5aa3a65f1f12dece1e9c4a140. Sep 9 05:03:42.746700 containerd[1498]: time="2025-09-09T05:03:42.746660599Z" level=info msg="StartContainer for \"582c499bcd7bd8df58e49337766c8db724b755c5aa3a65f1f12dece1e9c4a140\" returns successfully" Sep 9 05:03:43.329491 kubelet[2644]: I0909 05:03:43.329382 2644 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 05:03:43.331243 kubelet[2644]: I0909 05:03:43.331224 2644 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 05:03:46.232384 systemd[1]: Started sshd@9-10.0.0.90:22-10.0.0.1:55384.service - OpenSSH per-connection server daemon (10.0.0.1:55384). Sep 9 05:03:46.310483 sshd[5419]: Accepted publickey for core from 10.0.0.1 port 55384 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:03:46.312154 sshd-session[5419]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:03:46.316334 systemd-logind[1481]: New session 10 of user core. Sep 9 05:03:46.322412 systemd[1]: Started session-10.scope - Session 10 of User core. 
Sep 9 05:03:46.597023 sshd[5422]: Connection closed by 10.0.0.1 port 55384 Sep 9 05:03:46.598167 sshd-session[5419]: pam_unix(sshd:session): session closed for user core Sep 9 05:03:46.610004 systemd[1]: sshd@9-10.0.0.90:22-10.0.0.1:55384.service: Deactivated successfully. Sep 9 05:03:46.611927 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 05:03:46.612682 systemd-logind[1481]: Session 10 logged out. Waiting for processes to exit. Sep 9 05:03:46.615736 systemd[1]: Started sshd@10-10.0.0.90:22-10.0.0.1:55390.service - OpenSSH per-connection server daemon (10.0.0.1:55390). Sep 9 05:03:46.616286 systemd-logind[1481]: Removed session 10. Sep 9 05:03:46.664490 sshd[5442]: Accepted publickey for core from 10.0.0.1 port 55390 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:03:46.666363 sshd-session[5442]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:03:46.672347 systemd-logind[1481]: New session 11 of user core. Sep 9 05:03:46.681497 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 9 05:03:46.880542 sshd[5445]: Connection closed by 10.0.0.1 port 55390 Sep 9 05:03:46.880733 sshd-session[5442]: pam_unix(sshd:session): session closed for user core Sep 9 05:03:46.892939 systemd[1]: sshd@10-10.0.0.90:22-10.0.0.1:55390.service: Deactivated successfully. Sep 9 05:03:46.895881 systemd[1]: session-11.scope: Deactivated successfully. Sep 9 05:03:46.897018 systemd-logind[1481]: Session 11 logged out. Waiting for processes to exit. Sep 9 05:03:46.902742 systemd[1]: Started sshd@11-10.0.0.90:22-10.0.0.1:55394.service - OpenSSH per-connection server daemon (10.0.0.1:55394). Sep 9 05:03:46.904684 systemd-logind[1481]: Removed session 11. Sep 9 05:03:46.955550 sshd[5457]: Accepted publickey for core from 10.0.0.1 port 55394 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:03:46.956990 sshd-session[5457]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:03:46.961186 systemd-logind[1481]: New session 12 of user core. Sep 9 05:03:46.977401 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 9 05:03:47.166487 sshd[5460]: Connection closed by 10.0.0.1 port 55394 Sep 9 05:03:47.166906 sshd-session[5457]: pam_unix(sshd:session): session closed for user core Sep 9 05:03:47.170441 systemd[1]: sshd@11-10.0.0.90:22-10.0.0.1:55394.service: Deactivated successfully. Sep 9 05:03:47.174089 systemd[1]: session-12.scope: Deactivated successfully. Sep 9 05:03:47.175623 systemd-logind[1481]: Session 12 logged out. Waiting for processes to exit. Sep 9 05:03:47.176717 systemd-logind[1481]: Removed session 12. Sep 9 05:03:48.276854 containerd[1498]: time="2025-09-09T05:03:48.276329571Z" level=info msg="TaskExit event in podsandbox handler container_id:\"46c02b4e316b8d01b1461971a0da5028cc60e3283e30bc5eaaf4e9a1cbd36e4a\" id:\"b6a4e24e7c3b2f0a35982d43553e718c139dc48b72e8e97b90813be886f10f19\" pid:5486 exit_status:1 exited_at:{seconds:1757394228 nanos:275993130}" Sep 9 05:03:52.183389 systemd[1]: Started sshd@12-10.0.0.90:22-10.0.0.1:56764.service - OpenSSH per-connection server daemon (10.0.0.1:56764). Sep 9 05:03:52.248879 sshd[5502]: Accepted publickey for core from 10.0.0.1 port 56764 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:03:52.249528 sshd-session[5502]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:03:52.257650 systemd-logind[1481]: New session 13 of user core. 
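[Editor's note] The containerd TaskExit event above carries its exit time as a {seconds, nanos} pair (1757394228 / 275993130). A minimal Go sketch converting that pair back to the wall-clock timestamp shown in the surrounding log lines:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // seconds/nanos copied from the exited_at field in the TaskExit event above.
        exitedAt := time.Unix(1757394228, 275993130).UTC()
        fmt.Println(exitedAt.Format(time.RFC3339Nano)) // 2025-09-09T05:03:48.27599313Z
    }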
Sep 9 05:03:52.266397 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 9 05:03:52.423409 sshd[5505]: Connection closed by 10.0.0.1 port 56764 Sep 9 05:03:52.423935 sshd-session[5502]: pam_unix(sshd:session): session closed for user core Sep 9 05:03:52.434621 systemd[1]: sshd@12-10.0.0.90:22-10.0.0.1:56764.service: Deactivated successfully. Sep 9 05:03:52.437610 systemd[1]: session-13.scope: Deactivated successfully. Sep 9 05:03:52.439108 systemd-logind[1481]: Session 13 logged out. Waiting for processes to exit. Sep 9 05:03:52.442038 systemd[1]: Started sshd@13-10.0.0.90:22-10.0.0.1:56774.service - OpenSSH per-connection server daemon (10.0.0.1:56774). Sep 9 05:03:52.443141 systemd-logind[1481]: Removed session 13. Sep 9 05:03:52.504014 sshd[5519]: Accepted publickey for core from 10.0.0.1 port 56774 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:03:52.505431 sshd-session[5519]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:03:52.509929 systemd-logind[1481]: New session 14 of user core. Sep 9 05:03:52.527407 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 9 05:03:52.774223 sshd[5522]: Connection closed by 10.0.0.1 port 56774 Sep 9 05:03:52.774049 sshd-session[5519]: pam_unix(sshd:session): session closed for user core Sep 9 05:03:52.786905 systemd[1]: sshd@13-10.0.0.90:22-10.0.0.1:56774.service: Deactivated successfully. Sep 9 05:03:52.788787 systemd[1]: session-14.scope: Deactivated successfully. Sep 9 05:03:52.789692 systemd-logind[1481]: Session 14 logged out. Waiting for processes to exit. Sep 9 05:03:52.792741 systemd[1]: Started sshd@14-10.0.0.90:22-10.0.0.1:56790.service - OpenSSH per-connection server daemon (10.0.0.1:56790). Sep 9 05:03:52.794263 systemd-logind[1481]: Removed session 14. Sep 9 05:03:52.844364 sshd[5534]: Accepted publickey for core from 10.0.0.1 port 56790 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:03:52.845858 sshd-session[5534]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:03:52.850883 systemd-logind[1481]: New session 15 of user core. Sep 9 05:03:52.857412 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 9 05:03:54.483154 sshd[5537]: Connection closed by 10.0.0.1 port 56790 Sep 9 05:03:54.482307 sshd-session[5534]: pam_unix(sshd:session): session closed for user core Sep 9 05:03:54.500240 systemd[1]: sshd@14-10.0.0.90:22-10.0.0.1:56790.service: Deactivated successfully. Sep 9 05:03:54.502010 systemd[1]: session-15.scope: Deactivated successfully. Sep 9 05:03:54.502205 systemd[1]: session-15.scope: Consumed 559ms CPU time, 74.5M memory peak. Sep 9 05:03:54.505787 systemd-logind[1481]: Session 15 logged out. Waiting for processes to exit. Sep 9 05:03:54.511252 systemd[1]: Started sshd@15-10.0.0.90:22-10.0.0.1:56796.service - OpenSSH per-connection server daemon (10.0.0.1:56796). Sep 9 05:03:54.513795 systemd-logind[1481]: Removed session 15. Sep 9 05:03:54.573520 sshd[5560]: Accepted publickey for core from 10.0.0.1 port 56796 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:03:54.574909 sshd-session[5560]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:03:54.579004 systemd-logind[1481]: New session 16 of user core. Sep 9 05:03:54.589364 systemd[1]: Started session-16.scope - Session 16 of User core. 
Sep 9 05:03:54.878235 sshd[5568]: Connection closed by 10.0.0.1 port 56796 Sep 9 05:03:54.878708 sshd-session[5560]: pam_unix(sshd:session): session closed for user core Sep 9 05:03:54.890751 systemd[1]: sshd@15-10.0.0.90:22-10.0.0.1:56796.service: Deactivated successfully. Sep 9 05:03:54.892669 systemd[1]: session-16.scope: Deactivated successfully. Sep 9 05:03:54.894434 systemd-logind[1481]: Session 16 logged out. Waiting for processes to exit. Sep 9 05:03:54.898507 systemd[1]: Started sshd@16-10.0.0.90:22-10.0.0.1:56810.service - OpenSSH per-connection server daemon (10.0.0.1:56810). Sep 9 05:03:54.899697 systemd-logind[1481]: Removed session 16. Sep 9 05:03:54.957655 sshd[5579]: Accepted publickey for core from 10.0.0.1 port 56810 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:03:54.959125 sshd-session[5579]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:03:54.965284 systemd-logind[1481]: New session 17 of user core. Sep 9 05:03:54.971408 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 9 05:03:55.105209 sshd[5582]: Connection closed by 10.0.0.1 port 56810 Sep 9 05:03:55.105528 sshd-session[5579]: pam_unix(sshd:session): session closed for user core Sep 9 05:03:55.109128 systemd[1]: sshd@16-10.0.0.90:22-10.0.0.1:56810.service: Deactivated successfully. Sep 9 05:03:55.112821 systemd[1]: session-17.scope: Deactivated successfully. Sep 9 05:03:55.113608 systemd-logind[1481]: Session 17 logged out. Waiting for processes to exit. Sep 9 05:03:55.114993 systemd-logind[1481]: Removed session 17. Sep 9 05:03:59.347927 containerd[1498]: time="2025-09-09T05:03:59.347886147Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9a5b86330afe5009b5c05c3d7d810173a2f153d2d55e385ec14f9743b23db60e\" id:\"de962e9ff12c6be63f2c71652abaf50c09c5f9f2aee932bc00ec52a5b9a1cbde\" pid:5613 exited_at:{seconds:1757394239 nanos:347522659}" Sep 9 05:04:00.127548 systemd[1]: Started sshd@17-10.0.0.90:22-10.0.0.1:49868.service - OpenSSH per-connection server daemon (10.0.0.1:49868). Sep 9 05:04:00.179402 sshd[5624]: Accepted publickey for core from 10.0.0.1 port 49868 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:04:00.180927 sshd-session[5624]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:04:00.184796 systemd-logind[1481]: New session 18 of user core. Sep 9 05:04:00.204416 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 9 05:04:00.346275 sshd[5630]: Connection closed by 10.0.0.1 port 49868 Sep 9 05:04:00.347395 sshd-session[5624]: pam_unix(sshd:session): session closed for user core Sep 9 05:04:00.351112 systemd[1]: sshd@17-10.0.0.90:22-10.0.0.1:49868.service: Deactivated successfully. Sep 9 05:04:00.353051 systemd[1]: session-18.scope: Deactivated successfully. Sep 9 05:04:00.353803 systemd-logind[1481]: Session 18 logged out. Waiting for processes to exit. Sep 9 05:04:00.354944 systemd-logind[1481]: Removed session 18. 
Sep 9 05:04:00.582342 kubelet[2644]: I0909 05:04:00.582143 2644 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:04:00.612521 kubelet[2644]: I0909 05:04:00.612309 2644 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-z6qqc" podStartSLOduration=41.723090558 podStartE2EDuration="45.612293896s" podCreationTimestamp="2025-09-09 05:03:15 +0000 UTC" firstStartedPulling="2025-09-09 05:03:38.731343184 +0000 UTC m=+43.570559301" lastFinishedPulling="2025-09-09 05:03:42.620546522 +0000 UTC m=+47.459762639" observedRunningTime="2025-09-09 05:03:43.43641017 +0000 UTC m=+48.275626327" watchObservedRunningTime="2025-09-09 05:04:00.612293896 +0000 UTC m=+65.451510013" Sep 9 05:04:03.163061 containerd[1498]: time="2025-09-09T05:04:03.163002637Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5b1c44003db10bbeb9876c77cce95ad04138cda7e25eb92fd1111fb0c24efb4d\" id:\"422d2ab3c660a458aaf07431f82804a062fcaab3a79cc0126bd8783b9348738e\" pid:5660 exited_at:{seconds:1757394243 nanos:162448186}" Sep 9 05:04:05.370210 systemd[1]: Started sshd@18-10.0.0.90:22-10.0.0.1:49872.service - OpenSSH per-connection server daemon (10.0.0.1:49872). Sep 9 05:04:05.422204 sshd[5674]: Accepted publickey for core from 10.0.0.1 port 49872 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:04:05.423672 sshd-session[5674]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:04:05.428244 systemd-logind[1481]: New session 19 of user core. Sep 9 05:04:05.436390 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 9 05:04:05.575221 sshd[5677]: Connection closed by 10.0.0.1 port 49872 Sep 9 05:04:05.575434 sshd-session[5674]: pam_unix(sshd:session): session closed for user core Sep 9 05:04:05.579157 systemd[1]: sshd@18-10.0.0.90:22-10.0.0.1:49872.service: Deactivated successfully. Sep 9 05:04:05.581204 systemd[1]: session-19.scope: Deactivated successfully. Sep 9 05:04:05.582076 systemd-logind[1481]: Session 19 logged out. Waiting for processes to exit. Sep 9 05:04:05.583043 systemd-logind[1481]: Removed session 19. Sep 9 05:04:10.586501 systemd[1]: Started sshd@19-10.0.0.90:22-10.0.0.1:47184.service - OpenSSH per-connection server daemon (10.0.0.1:47184). Sep 9 05:04:10.651505 sshd[5692]: Accepted publickey for core from 10.0.0.1 port 47184 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 05:04:10.652916 sshd-session[5692]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:04:10.658923 systemd-logind[1481]: New session 20 of user core. Sep 9 05:04:10.678395 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 9 05:04:10.820990 sshd[5695]: Connection closed by 10.0.0.1 port 47184 Sep 9 05:04:10.821323 sshd-session[5692]: pam_unix(sshd:session): session closed for user core Sep 9 05:04:10.824917 systemd-logind[1481]: Session 20 logged out. Waiting for processes to exit. Sep 9 05:04:10.825477 systemd[1]: sshd@19-10.0.0.90:22-10.0.0.1:47184.service: Deactivated successfully. Sep 9 05:04:10.827176 systemd[1]: session-20.scope: Deactivated successfully. Sep 9 05:04:10.828964 systemd-logind[1481]: Removed session 20.