Sep 9 04:50:39.766106 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 9 04:50:39.766126 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 9 03:38:34 -00 2025
Sep 9 04:50:39.766173 kernel: KASLR enabled
Sep 9 04:50:39.766180 kernel: efi: EFI v2.7 by EDK II
Sep 9 04:50:39.766186 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb832018 ACPI 2.0=0xdbfd0018 RNG=0xdbfd0a18 MEMRESERVE=0xdb838218
Sep 9 04:50:39.766191 kernel: random: crng init done
Sep 9 04:50:39.766198 kernel: secureboot: Secure boot disabled
Sep 9 04:50:39.766204 kernel: ACPI: Early table checksum verification disabled
Sep 9 04:50:39.766209 kernel: ACPI: RSDP 0x00000000DBFD0018 000024 (v02 BOCHS )
Sep 9 04:50:39.766218 kernel: ACPI: XSDT 0x00000000DBFD0F18 000064 (v01 BOCHS BXPC 00000001 01000013)
Sep 9 04:50:39.766224 kernel: ACPI: FACP 0x00000000DBFD0B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:50:39.766230 kernel: ACPI: DSDT 0x00000000DBF0E018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:50:39.766236 kernel: ACPI: APIC 0x00000000DBFD0C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:50:39.766242 kernel: ACPI: PPTT 0x00000000DBFD0098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:50:39.766249 kernel: ACPI: GTDT 0x00000000DBFD0818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:50:39.766256 kernel: ACPI: MCFG 0x00000000DBFD0A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:50:39.766262 kernel: ACPI: SPCR 0x00000000DBFD0918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:50:39.766268 kernel: ACPI: DBG2 0x00000000DBFD0998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:50:39.766274 kernel: ACPI: IORT 0x00000000DBFD0198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:50:39.766280 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600
Sep 9 04:50:39.766286 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 9 04:50:39.766292 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 04:50:39.766298 kernel: NODE_DATA(0) allocated [mem 0xdc965a00-0xdc96cfff]
Sep 9 04:50:39.766304 kernel: Zone ranges:
Sep 9 04:50:39.766310 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 04:50:39.766317 kernel: DMA32 empty
Sep 9 04:50:39.766323 kernel: Normal empty
Sep 9 04:50:39.766329 kernel: Device empty
Sep 9 04:50:39.766335 kernel: Movable zone start for each node
Sep 9 04:50:39.766341 kernel: Early memory node ranges
Sep 9 04:50:39.766347 kernel: node 0: [mem 0x0000000040000000-0x00000000db81ffff]
Sep 9 04:50:39.766352 kernel: node 0: [mem 0x00000000db820000-0x00000000db82ffff]
Sep 9 04:50:39.766358 kernel: node 0: [mem 0x00000000db830000-0x00000000dc09ffff]
Sep 9 04:50:39.766364 kernel: node 0: [mem 0x00000000dc0a0000-0x00000000dc2dffff]
Sep 9 04:50:39.766370 kernel: node 0: [mem 0x00000000dc2e0000-0x00000000dc36ffff]
Sep 9 04:50:39.766376 kernel: node 0: [mem 0x00000000dc370000-0x00000000dc45ffff]
Sep 9 04:50:39.766382 kernel: node 0: [mem 0x00000000dc460000-0x00000000dc52ffff]
Sep 9 04:50:39.766389 kernel: node 0: [mem 0x00000000dc530000-0x00000000dc5cffff]
Sep 9 04:50:39.766395 kernel: node 0: [mem 0x00000000dc5d0000-0x00000000dce1ffff]
Sep 9 04:50:39.766408 kernel: node 0: [mem 0x00000000dce20000-0x00000000dceaffff]
Sep 9 04:50:39.766417 kernel: node 0: [mem 0x00000000dceb0000-0x00000000dcebffff]
Sep 9 04:50:39.766424 kernel: node 0: [mem 0x00000000dcec0000-0x00000000dcfdffff]
Sep 9 04:50:39.766430 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff]
Sep 9 04:50:39.766438 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 04:50:39.766444 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges
Sep 9 04:50:39.766451 kernel: cma: Reserved 16 MiB at 0x00000000d8000000 on node -1
Sep 9 04:50:39.766457 kernel: psci: probing for conduit method from ACPI.
Sep 9 04:50:39.766463 kernel: psci: PSCIv1.1 detected in firmware.
Sep 9 04:50:39.766470 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 9 04:50:39.766476 kernel: psci: Trusted OS migration not required
Sep 9 04:50:39.766483 kernel: psci: SMC Calling Convention v1.1
Sep 9 04:50:39.766489 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 9 04:50:39.766496 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 9 04:50:39.766504 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 9 04:50:39.766511 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Sep 9 04:50:39.766517 kernel: Detected PIPT I-cache on CPU0
Sep 9 04:50:39.766524 kernel: CPU features: detected: GIC system register CPU interface
Sep 9 04:50:39.766530 kernel: CPU features: detected: Spectre-v4
Sep 9 04:50:39.766537 kernel: CPU features: detected: Spectre-BHB
Sep 9 04:50:39.766543 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 9 04:50:39.766549 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 9 04:50:39.766556 kernel: CPU features: detected: ARM erratum 1418040
Sep 9 04:50:39.766562 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 9 04:50:39.766568 kernel: alternatives: applying boot alternatives
Sep 9 04:50:39.766575 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90
Sep 9 04:50:39.766583 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 04:50:39.766590 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 9 04:50:39.766596 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 04:50:39.766602 kernel: Fallback order for Node 0: 0
Sep 9 04:50:39.766609 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072
Sep 9 04:50:39.766616 kernel: Policy zone: DMA
Sep 9 04:50:39.766622 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 04:50:39.766628 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB
Sep 9 04:50:39.766634 kernel: software IO TLB: area num 4.
Sep 9 04:50:39.766641 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB
Sep 9 04:50:39.766648 kernel: software IO TLB: mapped [mem 0x00000000d7c00000-0x00000000d8000000] (4MB)
Sep 9 04:50:39.766655 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 9 04:50:39.766662 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 04:50:39.766669 kernel: rcu: RCU event tracing is enabled.
Sep 9 04:50:39.766676 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 9 04:50:39.766682 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 04:50:39.766689 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 04:50:39.766695 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 04:50:39.766702 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 9 04:50:39.766708 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 04:50:39.766715 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 04:50:39.766721 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 9 04:50:39.766728 kernel: GICv3: 256 SPIs implemented
Sep 9 04:50:39.766735 kernel: GICv3: 0 Extended SPIs implemented
Sep 9 04:50:39.766741 kernel: Root IRQ handler: gic_handle_irq
Sep 9 04:50:39.766747 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 9 04:50:39.766754 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Sep 9 04:50:39.766760 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 9 04:50:39.766767 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 9 04:50:39.766773 kernel: ITS@0x0000000008080000: allocated 8192 Devices @40110000 (indirect, esz 8, psz 64K, shr 1)
Sep 9 04:50:39.766779 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @40120000 (flat, esz 8, psz 64K, shr 1)
Sep 9 04:50:39.766786 kernel: GICv3: using LPI property table @0x0000000040130000
Sep 9 04:50:39.766793 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040140000
Sep 9 04:50:39.766799 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 04:50:39.766807 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:50:39.766813 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 9 04:50:39.766820 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 9 04:50:39.766826 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 9 04:50:39.766833 kernel: arm-pv: using stolen time PV
Sep 9 04:50:39.766840 kernel: Console: colour dummy device 80x25
Sep 9 04:50:39.766846 kernel: ACPI: Core revision 20240827
Sep 9 04:50:39.766853 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 9 04:50:39.766860 kernel: pid_max: default: 32768 minimum: 301
Sep 9 04:50:39.766866 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 04:50:39.766875 kernel: landlock: Up and running.
Sep 9 04:50:39.766881 kernel: SELinux: Initializing.
Sep 9 04:50:39.766888 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 04:50:39.766895 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 04:50:39.766902 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 04:50:39.766908 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 04:50:39.766915 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 04:50:39.766922 kernel: Remapping and enabling EFI services.
Sep 9 04:50:39.766929 kernel: smp: Bringing up secondary CPUs ...
Sep 9 04:50:39.766942 kernel: Detected PIPT I-cache on CPU1
Sep 9 04:50:39.766949 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 9 04:50:39.766956 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040150000
Sep 9 04:50:39.766965 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:50:39.766972 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 9 04:50:39.766979 kernel: Detected PIPT I-cache on CPU2
Sep 9 04:50:39.766986 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Sep 9 04:50:39.766994 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040160000
Sep 9 04:50:39.767003 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:50:39.767010 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Sep 9 04:50:39.767017 kernel: Detected PIPT I-cache on CPU3
Sep 9 04:50:39.767024 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Sep 9 04:50:39.767031 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040170000
Sep 9 04:50:39.767038 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:50:39.767045 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Sep 9 04:50:39.767052 kernel: smp: Brought up 1 node, 4 CPUs
Sep 9 04:50:39.767059 kernel: SMP: Total of 4 processors activated.
Sep 9 04:50:39.767067 kernel: CPU: All CPU(s) started at EL1
Sep 9 04:50:39.767074 kernel: CPU features: detected: 32-bit EL0 Support
Sep 9 04:50:39.767081 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 9 04:50:39.767088 kernel: CPU features: detected: Common not Private translations
Sep 9 04:50:39.767095 kernel: CPU features: detected: CRC32 instructions
Sep 9 04:50:39.767102 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 9 04:50:39.767109 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 9 04:50:39.767116 kernel: CPU features: detected: LSE atomic instructions
Sep 9 04:50:39.767123 kernel: CPU features: detected: Privileged Access Never
Sep 9 04:50:39.767130 kernel: CPU features: detected: RAS Extension Support
Sep 9 04:50:39.767187 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 9 04:50:39.767195 kernel: alternatives: applying system-wide alternatives
Sep 9 04:50:39.767202 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Sep 9 04:50:39.767210 kernel: Memory: 2424480K/2572288K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38976K init, 1038K bss, 125472K reserved, 16384K cma-reserved)
Sep 9 04:50:39.767217 kernel: devtmpfs: initialized
Sep 9 04:50:39.767224 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 04:50:39.767232 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 9 04:50:39.767239 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 9 04:50:39.767248 kernel: 0 pages in range for non-PLT usage
Sep 9 04:50:39.767254 kernel: 508560 pages in range for PLT usage
Sep 9 04:50:39.767261 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 04:50:39.767268 kernel: SMBIOS 3.0.0 present.
Sep 9 04:50:39.767275 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022
Sep 9 04:50:39.767282 kernel: DMI: Memory slots populated: 1/1
Sep 9 04:50:39.767289 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 04:50:39.767296 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 9 04:50:39.767304 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 9 04:50:39.767312 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 9 04:50:39.767319 kernel: audit: initializing netlink subsys (disabled)
Sep 9 04:50:39.767327 kernel: audit: type=2000 audit(0.020:1): state=initialized audit_enabled=0 res=1
Sep 9 04:50:39.767334 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 04:50:39.767341 kernel: cpuidle: using governor menu
Sep 9 04:50:39.767348 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 9 04:50:39.767355 kernel: ASID allocator initialised with 32768 entries
Sep 9 04:50:39.767363 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 04:50:39.767369 kernel: Serial: AMBA PL011 UART driver
Sep 9 04:50:39.767378 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 04:50:39.767385 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 04:50:39.767392 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 9 04:50:39.767403 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 9 04:50:39.767411 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 04:50:39.767418 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 04:50:39.767425 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 9 04:50:39.767432 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 9 04:50:39.767439 kernel: ACPI: Added _OSI(Module Device)
Sep 9 04:50:39.767445 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 04:50:39.767454 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 04:50:39.767461 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 04:50:39.767468 kernel: ACPI: Interpreter enabled
Sep 9 04:50:39.767475 kernel: ACPI: Using GIC for interrupt routing
Sep 9 04:50:39.767481 kernel: ACPI: MCFG table detected, 1 entries
Sep 9 04:50:39.767488 kernel: ACPI: CPU0 has been hot-added
Sep 9 04:50:39.767495 kernel: ACPI: CPU1 has been hot-added
Sep 9 04:50:39.767502 kernel: ACPI: CPU2 has been hot-added
Sep 9 04:50:39.767508 kernel: ACPI: CPU3 has been hot-added
Sep 9 04:50:39.767517 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 9 04:50:39.767524 kernel: printk: legacy console [ttyAMA0] enabled
Sep 9 04:50:39.767531 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 9 04:50:39.767664 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 04:50:39.767728 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 9 04:50:39.767786 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 9 04:50:39.767843 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 9 04:50:39.767902 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 9 04:50:39.767911 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 9 04:50:39.767918 kernel: PCI host bridge to bus 0000:00
Sep 9 04:50:39.767985 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 9 04:50:39.768088 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 9 04:50:39.768155 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 9 04:50:39.768210 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 9 04:50:39.768289 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Sep 9 04:50:39.768363 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 9 04:50:39.768436 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f]
Sep 9 04:50:39.768498 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]
Sep 9 04:50:39.768557 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 9 04:50:39.768617 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Sep 9 04:50:39.768675 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned
Sep 9 04:50:39.768737 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned
Sep 9 04:50:39.768790 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 9 04:50:39.768841 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 9 04:50:39.768893 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 9 04:50:39.768902 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 9 04:50:39.768909 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 9 04:50:39.768916 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 9 04:50:39.768925 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 9 04:50:39.768932 kernel: iommu: Default domain type: Translated
Sep 9 04:50:39.768939 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 9 04:50:39.768946 kernel: efivars: Registered efivars operations
Sep 9 04:50:39.768953 kernel: vgaarb: loaded
Sep 9 04:50:39.768959 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 9 04:50:39.768966 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 04:50:39.768973 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 04:50:39.768980 kernel: pnp: PnP ACPI init
Sep 9 04:50:39.769045 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Sep 9 04:50:39.769055 kernel: pnp: PnP ACPI: found 1 devices
Sep 9 04:50:39.769062 kernel: NET: Registered PF_INET protocol family
Sep 9 04:50:39.769069 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 9 04:50:39.769077 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 9 04:50:39.769084 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 04:50:39.769091 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 9 04:50:39.769098 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 9 04:50:39.769106 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 9 04:50:39.769113 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 04:50:39.769120 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 04:50:39.769127 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 04:50:39.769142 kernel: PCI: CLS 0 bytes, default 64
Sep 9 04:50:39.769150 kernel: kvm [1]: HYP mode not available
Sep 9 04:50:39.769156 kernel: Initialise system trusted keyrings
Sep 9 04:50:39.769163 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 9 04:50:39.769170 kernel: Key type asymmetric registered
Sep 9 04:50:39.769177 kernel: Asymmetric key parser 'x509' registered
Sep 9 04:50:39.769186 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 9 04:50:39.769193 kernel: io scheduler mq-deadline registered
Sep 9 04:50:39.769200 kernel: io scheduler kyber registered
Sep 9 04:50:39.769207 kernel: io scheduler bfq registered
Sep 9 04:50:39.769214 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Sep 9 04:50:39.769221 kernel: ACPI: button: Power Button [PWRB]
Sep 9 04:50:39.769228 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Sep 9 04:50:39.769291 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007)
Sep 9 04:50:39.769300 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 9 04:50:39.769309 kernel: thunder_xcv, ver 1.0
Sep 9 04:50:39.769315 kernel: thunder_bgx, ver 1.0
Sep 9 04:50:39.769322 kernel: nicpf, ver 1.0
Sep 9 04:50:39.769329 kernel: nicvf, ver 1.0
Sep 9 04:50:39.769397 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 9 04:50:39.769465 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-09T04:50:39 UTC (1757393439)
Sep 9 04:50:39.769475 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 9 04:50:39.769482 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 9 04:50:39.769492 kernel: watchdog: NMI not fully supported
Sep 9 04:50:39.769498 kernel: watchdog: Hard watchdog permanently disabled
Sep 9 04:50:39.769505 kernel: NET: Registered PF_INET6 protocol family
Sep 9 04:50:39.769513 kernel: Segment Routing with IPv6
Sep 9 04:50:39.769520 kernel: In-situ OAM (IOAM) with IPv6
Sep 9 04:50:39.769526 kernel: NET: Registered PF_PACKET protocol family
Sep 9 04:50:39.769533 kernel: Key type dns_resolver registered
Sep 9 04:50:39.769540 kernel: registered taskstats version 1
Sep 9 04:50:39.769547 kernel: Loading compiled-in X.509 certificates
Sep 9 04:50:39.769555 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 44d1e8b5c5ffbaa3cedd99c03d41580671fabec5'
Sep 9 04:50:39.769562 kernel: Demotion targets for Node 0: null
Sep 9 04:50:39.769568 kernel: Key type .fscrypt registered
Sep 9 04:50:39.769575 kernel: Key type fscrypt-provisioning registered
Sep 9 04:50:39.769582 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 9 04:50:39.769589 kernel: ima: Allocated hash algorithm: sha1
Sep 9 04:50:39.769596 kernel: ima: No architecture policies found
Sep 9 04:50:39.769602 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 9 04:50:39.769611 kernel: clk: Disabling unused clocks
Sep 9 04:50:39.769617 kernel: PM: genpd: Disabling unused power domains
Sep 9 04:50:39.769624 kernel: Warning: unable to open an initial console.
Sep 9 04:50:39.769632 kernel: Freeing unused kernel memory: 38976K
Sep 9 04:50:39.769639 kernel: Run /init as init process
Sep 9 04:50:39.769645 kernel: with arguments:
Sep 9 04:50:39.769652 kernel: /init
Sep 9 04:50:39.769659 kernel: with environment:
Sep 9 04:50:39.769666 kernel: HOME=/
Sep 9 04:50:39.769672 kernel: TERM=linux
Sep 9 04:50:39.769680 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 9 04:50:39.769688 systemd[1]: Successfully made /usr/ read-only.
Sep 9 04:50:39.769698 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 04:50:39.769706 systemd[1]: Detected virtualization kvm.
Sep 9 04:50:39.769713 systemd[1]: Detected architecture arm64.
Sep 9 04:50:39.769720 systemd[1]: Running in initrd.
Sep 9 04:50:39.769727 systemd[1]: No hostname configured, using default hostname.
Sep 9 04:50:39.769737 systemd[1]: Hostname set to .
Sep 9 04:50:39.769744 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 04:50:39.769751 systemd[1]: Queued start job for default target initrd.target.
Sep 9 04:50:39.769758 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:50:39.769766 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:50:39.769774 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 04:50:39.769782 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 04:50:39.769789 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 04:50:39.769799 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 04:50:39.769807 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 04:50:39.769815 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 04:50:39.769822 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:50:39.769830 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:50:39.769837 systemd[1]: Reached target paths.target - Path Units.
Sep 9 04:50:39.769845 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 04:50:39.769853 systemd[1]: Reached target swap.target - Swaps.
Sep 9 04:50:39.769861 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 04:50:39.769868 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 04:50:39.769875 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 04:50:39.769883 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 04:50:39.769890 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 04:50:39.769898 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:50:39.769905 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:50:39.769914 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:50:39.769921 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 04:50:39.769929 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 04:50:39.769936 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 04:50:39.769944 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 9 04:50:39.769952 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 9 04:50:39.769959 systemd[1]: Starting systemd-fsck-usr.service...
Sep 9 04:50:39.769966 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 04:50:39.769974 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 04:50:39.769983 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:50:39.769991 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 9 04:50:39.769999 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:50:39.770006 systemd[1]: Finished systemd-fsck-usr.service.
Sep 9 04:50:39.770031 systemd-journald[245]: Collecting audit messages is disabled.
Sep 9 04:50:39.770050 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 04:50:39.770058 systemd-journald[245]: Journal started
Sep 9 04:50:39.770077 systemd-journald[245]: Runtime Journal (/run/log/journal/a30fcc5bedb64a51ab5845606243b638) is 6M, max 48.5M, 42.4M free.
Sep 9 04:50:39.763658 systemd-modules-load[246]: Inserted module 'overlay'
Sep 9 04:50:39.772114 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 04:50:39.776796 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 04:50:39.778687 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 9 04:50:39.777954 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:50:39.780819 kernel: Bridge firewalling registered
Sep 9 04:50:39.779968 systemd-modules-load[246]: Inserted module 'br_netfilter'
Sep 9 04:50:39.781171 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 9 04:50:39.782626 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:50:39.787413 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:50:39.788985 systemd-tmpfiles[263]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 9 04:50:39.789452 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 04:50:39.792422 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 04:50:39.793523 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:50:39.804337 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:50:39.807713 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 04:50:39.810229 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:50:39.813123 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 9 04:50:39.815680 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 04:50:39.845455 dracut-cmdline[289]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90
Sep 9 04:50:39.858709 systemd-resolved[290]: Positive Trust Anchors:
Sep 9 04:50:39.858728 systemd-resolved[290]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 04:50:39.858759 systemd-resolved[290]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 04:50:39.863531 systemd-resolved[290]: Defaulting to hostname 'linux'.
Sep 9 04:50:39.864612 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 04:50:39.866338 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:50:39.917163 kernel: SCSI subsystem initialized
Sep 9 04:50:39.922154 kernel: Loading iSCSI transport class v2.0-870.
Sep 9 04:50:39.929167 kernel: iscsi: registered transport (tcp)
Sep 9 04:50:39.942161 kernel: iscsi: registered transport (qla4xxx)
Sep 9 04:50:39.942183 kernel: QLogic iSCSI HBA Driver
Sep 9 04:50:39.958094 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 04:50:39.974259 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:50:39.976466 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 04:50:40.018818 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 9 04:50:40.020956 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 9 04:50:40.084181 kernel: raid6: neonx8 gen() 15748 MB/s
Sep 9 04:50:40.101162 kernel: raid6: neonx4 gen() 15752 MB/s
Sep 9 04:50:40.118165 kernel: raid6: neonx2 gen() 13202 MB/s
Sep 9 04:50:40.135159 kernel: raid6: neonx1 gen() 10417 MB/s
Sep 9 04:50:40.152155 kernel: raid6: int64x8 gen() 6899 MB/s
Sep 9 04:50:40.169172 kernel: raid6: int64x4 gen() 7311 MB/s
Sep 9 04:50:40.186163 kernel: raid6: int64x2 gen() 6083 MB/s
Sep 9 04:50:40.203156 kernel: raid6: int64x1 gen() 5046 MB/s
Sep 9 04:50:40.203181 kernel: raid6: using algorithm neonx4 gen() 15752 MB/s
Sep 9 04:50:40.220175 kernel: raid6: .... xor() 12272 MB/s, rmw enabled
Sep 9 04:50:40.220199 kernel: raid6: using neon recovery algorithm
Sep 9 04:50:40.225156 kernel: xor: measuring software checksum speed
Sep 9 04:50:40.225175 kernel: 8regs : 21647 MB/sec
Sep 9 04:50:40.226169 kernel: 32regs : 19478 MB/sec
Sep 9 04:50:40.226187 kernel: arm64_neon : 28147 MB/sec
Sep 9 04:50:40.226204 kernel: xor: using function: arm64_neon (28147 MB/sec)
Sep 9 04:50:40.277167 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 9 04:50:40.283792 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 04:50:40.286078 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:50:40.313546 systemd-udevd[500]: Using default interface naming scheme 'v255'.
Sep 9 04:50:40.317581 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:50:40.319708 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 9 04:50:40.345416 dracut-pre-trigger[505]: rd.md=0: removing MD RAID activation
Sep 9 04:50:40.368471 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 04:50:40.370511 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 04:50:40.423768 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:50:40.427291 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 9 04:50:40.494521 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues
Sep 9 04:50:40.495248 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 04:50:40.497279 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Sep 9 04:50:40.495367 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:50:40.502866 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 9 04:50:40.502895 kernel: GPT:9289727 != 19775487
Sep 9 04:50:40.502905 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 9 04:50:40.502914 kernel: GPT:9289727 != 19775487
Sep 9 04:50:40.502923 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 9 04:50:40.502931 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 04:50:40.500707 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:50:40.504886 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:50:40.533507 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Sep 9 04:50:40.536087 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:50:40.543829 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Sep 9 04:50:40.545945 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 9 04:50:40.558722 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 9 04:50:40.564805 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Sep 9 04:50:40.565868 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Sep 9 04:50:40.568600 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 04:50:40.570519 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 04:50:40.572234 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 04:50:40.574695 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 9 04:50:40.576307 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 9 04:50:40.589597 disk-uuid[594]: Primary Header is updated.
Sep 9 04:50:40.589597 disk-uuid[594]: Secondary Entries is updated.
Sep 9 04:50:40.589597 disk-uuid[594]: Secondary Header is updated.
Sep 9 04:50:40.592342 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 04:50:40.593459 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 04:50:41.601173 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 04:50:41.601513 disk-uuid[597]: The operation has completed successfully.
Sep 9 04:50:41.627251 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 9 04:50:41.627360 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 9 04:50:41.651266 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 9 04:50:41.667088 sh[614]: Success
Sep 9 04:50:41.679883 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 9 04:50:41.679935 kernel: device-mapper: uevent: version 1.0.3
Sep 9 04:50:41.679946 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 9 04:50:41.687169 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Sep 9 04:50:41.711522 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 9 04:50:41.714060 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 9 04:50:41.728504 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 9 04:50:41.733152 kernel: BTRFS: device fsid 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (626)
Sep 9 04:50:41.734800 kernel: BTRFS info (device dm-0): first mount of filesystem 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364
Sep 9 04:50:41.734830 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 9 04:50:41.738238 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 9 04:50:41.738259 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 9 04:50:41.739254 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 9 04:50:41.740294 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 04:50:41.741545 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 9 04:50:41.742276 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 9 04:50:41.744917 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 9 04:50:41.768327 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (658)
Sep 9 04:50:41.768374 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:50:41.769320 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 04:50:41.772198 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 04:50:41.772238 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 04:50:41.776151 kernel: BTRFS info (device vda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:50:41.776794 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 9 04:50:41.779028 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 9 04:50:41.845861 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 04:50:41.849757 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 04:50:41.881778 ignition[700]: Ignition 2.22.0
Sep 9 04:50:41.881793 ignition[700]: Stage: fetch-offline
Sep 9 04:50:41.881825 ignition[700]: no configs at "/usr/lib/ignition/base.d"
Sep 9 04:50:41.881832 ignition[700]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:50:41.881915 ignition[700]: parsed url from cmdline: ""
Sep 9 04:50:41.881919 ignition[700]: no config URL provided
Sep 9 04:50:41.881923 ignition[700]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 04:50:41.881930 ignition[700]: no config at "/usr/lib/ignition/user.ign"
Sep 9 04:50:41.881950 ignition[700]: op(1): [started] loading QEMU firmware config module
Sep 9 04:50:41.881954 ignition[700]: op(1): executing: "modprobe" "qemu_fw_cfg"
Sep 9 04:50:41.887063 ignition[700]: op(1): [finished] loading QEMU firmware config module
Sep 9 04:50:41.892356 systemd-networkd[805]: lo: Link UP
Sep 9 04:50:41.892370 systemd-networkd[805]: lo: Gained carrier
Sep 9 04:50:41.893093 systemd-networkd[805]: Enumeration completed
Sep 9 04:50:41.893511 systemd-networkd[805]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:50:41.893515 systemd-networkd[805]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 04:50:41.894299 systemd-networkd[805]: eth0: Link UP
Sep 9 04:50:41.894402 systemd-networkd[805]: eth0: Gained carrier
Sep 9 04:50:41.894411 systemd-networkd[805]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:50:41.894785 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 04:50:41.895914 systemd[1]: Reached target network.target - Network.
Sep 9 04:50:41.914177 systemd-networkd[805]: eth0: DHCPv4 address 10.0.0.32/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 9 04:50:41.939884 ignition[700]: parsing config with SHA512: f8abe1ac94e9245011ba3704f32166335443191743da27dbc4cb7fd27d3c61f61299fb2d26b054872eb430ed9aeb882c897c0f01a5abed48a891ad4f59e24232
Sep 9 04:50:41.946599 unknown[700]: fetched base config from "system"
Sep 9 04:50:41.946613 unknown[700]: fetched user config from "qemu"
Sep 9 04:50:41.946968 ignition[700]: fetch-offline: fetch-offline passed
Sep 9 04:50:41.947022 ignition[700]: Ignition finished successfully
Sep 9 04:50:41.951204 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 04:50:41.953201 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Sep 9 04:50:41.955344 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 9 04:50:41.994069 ignition[815]: Ignition 2.22.0
Sep 9 04:50:41.994088 ignition[815]: Stage: kargs
Sep 9 04:50:41.994238 ignition[815]: no configs at "/usr/lib/ignition/base.d"
Sep 9 04:50:41.994248 ignition[815]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:50:41.994990 ignition[815]: kargs: kargs passed
Sep 9 04:50:41.995036 ignition[815]: Ignition finished successfully
Sep 9 04:50:41.999948 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 9 04:50:42.004120 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 9 04:50:42.034862 ignition[823]: Ignition 2.22.0
Sep 9 04:50:42.034877 ignition[823]: Stage: disks
Sep 9 04:50:42.035012 ignition[823]: no configs at "/usr/lib/ignition/base.d"
Sep 9 04:50:42.038019 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 9 04:50:42.035021 ignition[823]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:50:42.039068 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 9 04:50:42.035799 ignition[823]: disks: disks passed
Sep 9 04:50:42.040445 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 9 04:50:42.035846 ignition[823]: Ignition finished successfully
Sep 9 04:50:42.042057 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 04:50:42.043609 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 04:50:42.044812 systemd[1]: Reached target basic.target - Basic System.
Sep 9 04:50:42.047203 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 9 04:50:42.083719 systemd-fsck[833]: ROOT: clean, 15/553520 files, 52789/553472 blocks
Sep 9 04:50:42.087407 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 9 04:50:42.090008 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 9 04:50:42.151163 kernel: EXT4-fs (vda9): mounted filesystem 88574756-967d-44b3-be66-46689c8baf27 r/w with ordered data mode. Quota mode: none.
Sep 9 04:50:42.151243 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 9 04:50:42.152411 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 9 04:50:42.154546 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 04:50:42.156103 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 9 04:50:42.157008 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 9 04:50:42.157050 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 9 04:50:42.157075 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 04:50:42.173338 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 9 04:50:42.176251 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 9 04:50:42.179915 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (841)
Sep 9 04:50:42.179947 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:50:42.179957 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 04:50:42.182352 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 04:50:42.182411 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 04:50:42.183501 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 04:50:42.214086 initrd-setup-root[866]: cut: /sysroot/etc/passwd: No such file or directory
Sep 9 04:50:42.217504 initrd-setup-root[873]: cut: /sysroot/etc/group: No such file or directory
Sep 9 04:50:42.220672 initrd-setup-root[880]: cut: /sysroot/etc/shadow: No such file or directory
Sep 9 04:50:42.223663 initrd-setup-root[887]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 9 04:50:42.293666 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 9 04:50:42.295784 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 9 04:50:42.297176 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 9 04:50:42.315161 kernel: BTRFS info (device vda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:50:42.324487 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 9 04:50:42.350165 ignition[956]: INFO : Ignition 2.22.0
Sep 9 04:50:42.350165 ignition[956]: INFO : Stage: mount
Sep 9 04:50:42.350165 ignition[956]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 04:50:42.350165 ignition[956]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:50:42.353010 ignition[956]: INFO : mount: mount passed
Sep 9 04:50:42.353010 ignition[956]: INFO : Ignition finished successfully
Sep 9 04:50:42.353115 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 9 04:50:42.356254 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 9 04:50:42.871444 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 9 04:50:42.872957 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 04:50:42.892280 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (970)
Sep 9 04:50:42.892316 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:50:42.892327 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 04:50:42.895284 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 04:50:42.895324 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 04:50:42.896641 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 04:50:42.926907 ignition[988]: INFO : Ignition 2.22.0
Sep 9 04:50:42.926907 ignition[988]: INFO : Stage: files
Sep 9 04:50:42.928218 ignition[988]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 04:50:42.928218 ignition[988]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:50:42.928218 ignition[988]: DEBUG : files: compiled without relabeling support, skipping
Sep 9 04:50:42.930849 ignition[988]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 9 04:50:42.930849 ignition[988]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 9 04:50:42.933378 ignition[988]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 9 04:50:42.934353 ignition[988]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 9 04:50:42.934353 ignition[988]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 9 04:50:42.933915 unknown[988]: wrote ssh authorized keys file for user: core
Sep 9 04:50:42.937224 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 9 04:50:42.937224 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1
Sep 9 04:50:42.982601 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 9 04:50:43.227915 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 9 04:50:43.227915 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 9 04:50:43.231257 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 9 04:50:43.231257 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 04:50:43.231257 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 04:50:43.231257 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 04:50:43.231257 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 04:50:43.231257 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 04:50:43.231257 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 04:50:43.231257 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 04:50:43.231257 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 04:50:43.231257 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 04:50:43.244413 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 04:50:43.244413 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 04:50:43.244413 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1
Sep 9 04:50:43.720133 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 9 04:50:43.730302 systemd-networkd[805]: eth0: Gained IPv6LL
Sep 9 04:50:44.520873 ignition[988]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 04:50:44.520873 ignition[988]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 9 04:50:44.524432 ignition[988]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 04:50:44.524432 ignition[988]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 04:50:44.524432 ignition[988]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 9 04:50:44.524432 ignition[988]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 9 04:50:44.524432 ignition[988]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 9 04:50:44.524432 ignition[988]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 9 04:50:44.524432 ignition[988]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 9 04:50:44.524432 ignition[988]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Sep 9 04:50:44.540131 ignition[988]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 9 04:50:44.543245 ignition[988]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 9 04:50:44.545827 ignition[988]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 9 04:50:44.545827 ignition[988]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Sep 9 04:50:44.545827 ignition[988]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Sep 9 04:50:44.545827 ignition[988]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 04:50:44.545827 ignition[988]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 04:50:44.545827 ignition[988]: INFO : files: files passed
Sep 9 04:50:44.545827 ignition[988]: INFO : Ignition finished successfully
Sep 9 04:50:44.547416 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 9 04:50:44.550063 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 9 04:50:44.551621 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 04:50:44.564690 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 9 04:50:44.564810 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 9 04:50:44.567577 initrd-setup-root-after-ignition[1016]: grep: /sysroot/oem/oem-release: No such file or directory
Sep 9 04:50:44.569188 initrd-setup-root-after-ignition[1018]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 04:50:44.569188 initrd-setup-root-after-ignition[1018]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 04:50:44.573231 initrd-setup-root-after-ignition[1022]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 04:50:44.570211 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 04:50:44.571752 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 9 04:50:44.574640 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 9 04:50:44.618020 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 9 04:50:44.618123 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 9 04:50:44.620291 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 9 04:50:44.621932 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 9 04:50:44.623612 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 9 04:50:44.624365 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 9 04:50:44.648017 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 04:50:44.650182 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 9 04:50:44.670102 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:50:44.671980 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 04:50:44.674021 systemd[1]: Stopped target timers.target - Timer Units.
Sep 9 04:50:44.674929 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 9 04:50:44.675053 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 04:50:44.676968 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 9 04:50:44.678610 systemd[1]: Stopped target basic.target - Basic System.
Sep 9 04:50:44.679970 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 9 04:50:44.681388 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 04:50:44.683001 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 9 04:50:44.684700 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 04:50:44.686155 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 9 04:50:44.687729 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 04:50:44.689212 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 9 04:50:44.690995 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 9 04:50:44.692360 systemd[1]: Stopped target swap.target - Swaps.
Sep 9 04:50:44.693573 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 9 04:50:44.693697 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 04:50:44.695700 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:50:44.697175 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:50:44.698810 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 04:50:44.703246 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:50:44.704263 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 04:50:44.704399 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 04:50:44.706674 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 04:50:44.706797 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 04:50:44.708288 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 04:50:44.709551 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 04:50:44.714203 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:50:44.715214 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 04:50:44.716849 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 04:50:44.718021 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 04:50:44.718111 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 04:50:44.719285 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 04:50:44.719362 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 04:50:44.720552 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 04:50:44.720668 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 04:50:44.722064 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 04:50:44.722191 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 04:50:44.724287 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 04:50:44.726150 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 04:50:44.726903 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 04:50:44.727015 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:50:44.728479 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 04:50:44.728597 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 04:50:44.733323 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 04:50:44.734294 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 04:50:44.742368 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 04:50:44.749320 ignition[1042]: INFO : Ignition 2.22.0
Sep 9 04:50:44.749320 ignition[1042]: INFO : Stage: umount
Sep 9 04:50:44.750780 ignition[1042]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 04:50:44.750780 ignition[1042]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:50:44.750780 ignition[1042]: INFO : umount: umount passed
Sep 9 04:50:44.750780 ignition[1042]: INFO : Ignition finished successfully
Sep 9 04:50:44.752921 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 04:50:44.753042 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 04:50:44.754726 systemd[1]: Stopped target network.target - Network.
Sep 9 04:50:44.755915 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 04:50:44.755964 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 04:50:44.757521 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 04:50:44.757558 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 04:50:44.758891 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 04:50:44.758933 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 04:50:44.760267 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 04:50:44.760305 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 04:50:44.761868 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 04:50:44.763388 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 04:50:44.771542 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 04:50:44.772326 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 04:50:44.775289 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 04:50:44.775532 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 04:50:44.775595 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:50:44.780622 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 04:50:44.780836 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 04:50:44.780945 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 04:50:44.784249 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 04:50:44.784603 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 04:50:44.786678 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 04:50:44.786712 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:50:44.789159 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 04:50:44.789816 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 04:50:44.789862 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 04:50:44.791763 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 04:50:44.791801 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:50:44.794182 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 04:50:44.794220 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:50:44.796410 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:50:44.801033 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 04:50:44.811302 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 04:50:44.813292 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 04:50:44.814315 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 04:50:44.814365 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 04:50:44.816010 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 04:50:44.816163 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:50:44.818528 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 04:50:44.818613 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 04:50:44.820098 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 04:50:44.820181 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:50:44.821063 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 04:50:44.821092 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:50:44.822754 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 04:50:44.822803 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 04:50:44.825299 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 04:50:44.825371 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 04:50:44.827604 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 04:50:44.827657 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 04:50:44.830825 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 04:50:44.831901 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 04:50:44.831976 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:50:44.834525 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 04:50:44.834569 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:50:44.836933 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 9 04:50:44.836974 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:50:44.839792 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 04:50:44.839831 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:50:44.841770 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 04:50:44.841806 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:50:44.848990 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 04:50:44.849074 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 04:50:44.850297 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 04:50:44.852416 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 04:50:44.869634 systemd[1]: Switching root.
Sep 9 04:50:44.911994 systemd-journald[245]: Journal stopped
Sep 9 04:50:45.639433 systemd-journald[245]: Received SIGTERM from PID 1 (systemd).
Sep 9 04:50:45.639484 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 04:50:45.639501 kernel: SELinux: policy capability open_perms=1
Sep 9 04:50:45.639512 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 04:50:45.639521 kernel: SELinux: policy capability always_check_network=0
Sep 9 04:50:45.639529 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 04:50:45.639538 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 04:50:45.639547 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 04:50:45.639556 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 04:50:45.639565 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 04:50:45.639578 kernel: audit: type=1403 audit(1757393445.088:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 04:50:45.639593 systemd[1]: Successfully loaded SELinux policy in 58.672ms.
Sep 9 04:50:45.639612 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.254ms.
Sep 9 04:50:45.639622 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 04:50:45.639633 systemd[1]: Detected virtualization kvm.
Sep 9 04:50:45.639643 systemd[1]: Detected architecture arm64.
Sep 9 04:50:45.639652 systemd[1]: Detected first boot.
Sep 9 04:50:45.639662 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 04:50:45.639672 zram_generator::config[1087]: No configuration found.
Sep 9 04:50:45.639682 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 04:50:45.639693 systemd[1]: Populated /etc with preset unit settings.
Sep 9 04:50:45.639703 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 04:50:45.639713 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 04:50:45.639722 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 04:50:45.639732 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 04:50:45.639742 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 04:50:45.639752 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 04:50:45.639762 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 04:50:45.639772 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 04:50:45.639783 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 04:50:45.639793 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 04:50:45.639803 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 04:50:45.639813 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 04:50:45.639823 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:50:45.639833 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:50:45.639843 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 04:50:45.639852 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 04:50:45.639863 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 04:50:45.639874 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 04:50:45.639883 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 9 04:50:45.639893 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:50:45.639906 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:50:45.639916 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 04:50:45.639926 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 04:50:45.639935 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 04:50:45.639946 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 04:50:45.639956 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 04:50:45.639966 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 04:50:45.639975 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 04:50:45.639985 systemd[1]: Reached target swap.target - Swaps.
Sep 9 04:50:45.639995 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 04:50:45.640005 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 04:50:45.640015 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 04:50:45.640024 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:50:45.640035 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:50:45.640045 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:50:45.640055 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 04:50:45.640065 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 04:50:45.640075 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 04:50:45.640084 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 04:50:45.640094 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 04:50:45.640104 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 04:50:45.640114 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 04:50:45.640126 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 04:50:45.640146 systemd[1]: Reached target machines.target - Containers.
Sep 9 04:50:45.640156 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 04:50:45.640166 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:50:45.640177 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 04:50:45.640187 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 04:50:45.640197 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:50:45.640207 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 04:50:45.640217 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:50:45.640229 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 04:50:45.640239 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:50:45.640249 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 04:50:45.640259 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 04:50:45.640269 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 04:50:45.640278 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 04:50:45.640288 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 04:50:45.640298 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:50:45.640311 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 04:50:45.640320 kernel: ACPI: bus type drm_connector registered
Sep 9 04:50:45.640330 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 04:50:45.640340 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 04:50:45.640349 kernel: loop: module loaded
Sep 9 04:50:45.640359 kernel: fuse: init (API version 7.41)
Sep 9 04:50:45.640368 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 04:50:45.640384 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 04:50:45.640395 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 04:50:45.640406 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 04:50:45.640416 systemd[1]: Stopped verity-setup.service.
Sep 9 04:50:45.640426 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 04:50:45.640436 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 04:50:45.640445 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 04:50:45.640456 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 04:50:45.640485 systemd-journald[1159]: Collecting audit messages is disabled.
Sep 9 04:50:45.640513 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 04:50:45.640534 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 04:50:45.640594 systemd-journald[1159]: Journal started
Sep 9 04:50:45.640617 systemd-journald[1159]: Runtime Journal (/run/log/journal/a30fcc5bedb64a51ab5845606243b638) is 6M, max 48.5M, 42.4M free.
Sep 9 04:50:45.440432 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 04:50:45.466015 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 9 04:50:45.466414 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 04:50:45.643152 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 04:50:45.643822 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 04:50:45.645061 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:50:45.646418 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 04:50:45.646574 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 04:50:45.647706 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:50:45.647852 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:50:45.648980 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 04:50:45.649150 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 04:50:45.650277 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:50:45.650441 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:50:45.651708 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 04:50:45.651857 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 04:50:45.652974 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:50:45.653153 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:50:45.654460 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:50:45.655563 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:50:45.656847 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 04:50:45.658337 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 04:50:45.669785 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 04:50:45.671858 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 04:50:45.673647 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 04:50:45.674600 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 04:50:45.674628 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 04:50:45.676233 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 04:50:45.680272 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 04:50:45.681160 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:50:45.682429 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 04:50:45.684160 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 04:50:45.685046 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 04:50:45.688264 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 04:50:45.689417 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 04:50:45.690349 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 04:50:45.692260 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 04:50:45.693898 systemd-journald[1159]: Time spent on flushing to /var/log/journal/a30fcc5bedb64a51ab5845606243b638 is 14.479ms for 886 entries.
Sep 9 04:50:45.693898 systemd-journald[1159]: System Journal (/var/log/journal/a30fcc5bedb64a51ab5845606243b638) is 8M, max 195.6M, 187.6M free.
Sep 9 04:50:45.718196 systemd-journald[1159]: Received client request to flush runtime journal.
Sep 9 04:50:45.718256 kernel: loop0: detected capacity change from 0 to 207008
Sep 9 04:50:45.695279 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 04:50:45.706185 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:50:45.707428 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 04:50:45.709369 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 04:50:45.713750 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 04:50:45.716047 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 04:50:45.720330 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 04:50:45.722223 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 04:50:45.723767 systemd-tmpfiles[1205]: ACLs are not supported, ignoring.
Sep 9 04:50:45.723993 systemd-tmpfiles[1205]: ACLs are not supported, ignoring.
Sep 9 04:50:45.726175 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 04:50:45.728219 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:50:45.730672 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:50:45.741010 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 04:50:45.747232 kernel: loop1: detected capacity change from 0 to 100632
Sep 9 04:50:45.755423 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 04:50:45.764190 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 04:50:45.768606 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 04:50:45.776158 kernel: loop2: detected capacity change from 0 to 119368
Sep 9 04:50:45.787334 systemd-tmpfiles[1225]: ACLs are not supported, ignoring.
Sep 9 04:50:45.787352 systemd-tmpfiles[1225]: ACLs are not supported, ignoring.
Sep 9 04:50:45.790488 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:50:45.805176 kernel: loop3: detected capacity change from 0 to 207008
Sep 9 04:50:45.811160 kernel: loop4: detected capacity change from 0 to 100632
Sep 9 04:50:45.816155 kernel: loop5: detected capacity change from 0 to 119368
Sep 9 04:50:45.819404 (sd-merge)[1230]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Sep 9 04:50:45.819766 (sd-merge)[1230]: Merged extensions into '/usr'.
Sep 9 04:50:45.823257 systemd[1]: Reload requested from client PID 1204 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 04:50:45.823277 systemd[1]: Reloading...
Sep 9 04:50:45.890159 zram_generator::config[1259]: No configuration found.
Sep 9 04:50:45.933842 ldconfig[1199]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 9 04:50:46.024964 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 04:50:46.025074 systemd[1]: Reloading finished in 201 ms.
Sep 9 04:50:46.039763 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 9 04:50:46.042171 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 04:50:46.057352 systemd[1]: Starting ensure-sysext.service...
Sep 9 04:50:46.058965 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 04:50:46.068475 systemd[1]: Reload requested from client PID 1290 ('systemctl') (unit ensure-sysext.service)...
Sep 9 04:50:46.068492 systemd[1]: Reloading...
Sep 9 04:50:46.072090 systemd-tmpfiles[1291]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 04:50:46.072126 systemd-tmpfiles[1291]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 04:50:46.072371 systemd-tmpfiles[1291]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 04:50:46.072568 systemd-tmpfiles[1291]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 04:50:46.073238 systemd-tmpfiles[1291]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 04:50:46.073464 systemd-tmpfiles[1291]: ACLs are not supported, ignoring.
Sep 9 04:50:46.073513 systemd-tmpfiles[1291]: ACLs are not supported, ignoring.
Sep 9 04:50:46.076301 systemd-tmpfiles[1291]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 04:50:46.076311 systemd-tmpfiles[1291]: Skipping /boot
Sep 9 04:50:46.082176 systemd-tmpfiles[1291]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 04:50:46.082190 systemd-tmpfiles[1291]: Skipping /boot
Sep 9 04:50:46.115232 zram_generator::config[1320]: No configuration found.
Sep 9 04:50:46.249775 systemd[1]: Reloading finished in 181 ms.
Sep 9 04:50:46.260641 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 04:50:46.279490 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:50:46.289617 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 04:50:46.293442 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 04:50:46.305045 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 04:50:46.317466 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 04:50:46.327103 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:50:46.333356 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 04:50:46.341360 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 9 04:50:46.343844 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:50:46.345068 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:50:46.347075 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:50:46.350366 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:50:46.351390 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:50:46.351507 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:50:46.354470 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 9 04:50:46.357413 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 04:50:46.359633 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:50:46.360328 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:50:46.367468 augenrules[1385]: No rules
Sep 9 04:50:46.369087 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 04:50:46.369315 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 04:50:46.370988 systemd-udevd[1365]: Using default interface naming scheme 'v255'.
Sep 9 04:50:46.371781 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 9 04:50:46.373461 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 9 04:50:46.376809 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:50:46.376973 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:50:46.379000 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:50:46.379166 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:50:46.382105 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 9 04:50:46.390000 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 04:50:46.390994 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:50:46.392368 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:50:46.395343 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 04:50:46.405436 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:50:46.407826 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:50:46.409417 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:50:46.409555 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:50:46.409662 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 9 04:50:46.410676 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:50:46.412735 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 9 04:50:46.414866 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:50:46.415877 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:50:46.418002 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 04:50:46.420661 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 04:50:46.423473 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:50:46.423624 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:50:46.425058 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:50:46.425249 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:50:46.431536 systemd[1]: Finished ensure-sysext.service.
Sep 9 04:50:46.444964 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 04:50:46.447282 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 04:50:46.447348 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 04:50:46.449021 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 9 04:50:46.459899 augenrules[1397]: /sbin/augenrules: No change
Sep 9 04:50:46.463637 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 9 04:50:46.473096 augenrules[1458]: No rules
Sep 9 04:50:46.476392 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 04:50:46.476606 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 04:50:46.516334 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 9 04:50:46.520866 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 9 04:50:46.546295 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 9 04:50:46.598590 systemd-networkd[1443]: lo: Link UP
Sep 9 04:50:46.598602 systemd-networkd[1443]: lo: Gained carrier
Sep 9 04:50:46.599396 systemd-networkd[1443]: Enumeration completed
Sep 9 04:50:46.599575 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 04:50:46.602002 systemd-networkd[1443]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:50:46.602011 systemd-networkd[1443]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 04:50:46.602266 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 9 04:50:46.603599 systemd-networkd[1443]: eth0: Link UP
Sep 9 04:50:46.603703 systemd-networkd[1443]: eth0: Gained carrier
Sep 9 04:50:46.603718 systemd-networkd[1443]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:50:46.604438 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 9 04:50:46.616117 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:50:46.619201 systemd-networkd[1443]: eth0: DHCPv4 address 10.0.0.32/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 9 04:50:46.626702 systemd-resolved[1358]: Positive Trust Anchors:
Sep 9 04:50:46.627025 systemd-resolved[1358]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 04:50:46.627104 systemd-resolved[1358]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 04:50:46.629624 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 9 04:50:46.637321 systemd-resolved[1358]: Defaulting to hostname 'linux'.
Sep 9 04:50:46.638895 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 04:50:46.640070 systemd[1]: Reached target network.target - Network.
Sep 9 04:50:46.643457 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:50:46.669878 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 9 04:50:46.671078 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 04:50:46.671735 systemd-timesyncd[1446]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Sep 9 04:50:46.671779 systemd-timesyncd[1446]: Initial clock synchronization to Tue 2025-09-09 04:50:46.985698 UTC.
Sep 9 04:50:46.681183 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:50:46.682275 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 04:50:46.683119 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 9 04:50:46.684102 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 9 04:50:46.685394 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 9 04:50:46.686259 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 9 04:50:46.687181 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 9 04:50:46.688074 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 9 04:50:46.688104 systemd[1]: Reached target paths.target - Path Units.
Sep 9 04:50:46.688846 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 04:50:46.690277 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 9 04:50:46.692526 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 9 04:50:46.694983 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 9 04:50:46.696204 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 9 04:50:46.697130 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 9 04:50:46.702082 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 9 04:50:46.703517 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 9 04:50:46.704962 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 9 04:50:46.705953 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 04:50:46.706746 systemd[1]: Reached target basic.target - Basic System.
Sep 9 04:50:46.707513 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 9 04:50:46.707541 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 9 04:50:46.708481 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 9 04:50:46.710151 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 9 04:50:46.711737 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 9 04:50:46.713415 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 9 04:50:46.716112 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 9 04:50:46.716989 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 9 04:50:46.717899 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 9 04:50:46.720642 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 9 04:50:46.724292 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 9 04:50:46.724940 jq[1504]: false
Sep 9 04:50:46.726246 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 9 04:50:46.730013 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 9 04:50:46.731715 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 9 04:50:46.732714 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 9 04:50:46.733693 extend-filesystems[1505]: Found /dev/vda6
Sep 9 04:50:46.733879 systemd[1]: Starting update-engine.service - Update Engine...
Sep 9 04:50:46.738249 extend-filesystems[1505]: Found /dev/vda9
Sep 9 04:50:46.736400 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 9 04:50:46.739054 extend-filesystems[1505]: Checking size of /dev/vda9
Sep 9 04:50:46.739524 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 9 04:50:46.741467 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 9 04:50:46.741991 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 9 04:50:46.742801 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 9 04:50:46.742973 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 9 04:50:46.751785 systemd[1]: motdgen.service: Deactivated successfully.
Sep 9 04:50:46.752044 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 9 04:50:46.758013 jq[1518]: true
Sep 9 04:50:46.766571 extend-filesystems[1505]: Resized partition /dev/vda9
Sep 9 04:50:46.768406 extend-filesystems[1543]: resize2fs 1.47.3 (8-Jul-2025)
Sep 9 04:50:46.771592 tar[1527]: linux-arm64/LICENSE
Sep 9 04:50:46.771592 tar[1527]: linux-arm64/helm
Sep 9 04:50:46.777819 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Sep 9 04:50:46.775457 (ntainerd)[1537]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 9 04:50:46.784940 jq[1539]: true
Sep 9 04:50:46.788902 systemd-logind[1514]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 9 04:50:46.790409 dbus-daemon[1502]: [system] SELinux support is enabled
Sep 9 04:50:46.791477 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 9 04:50:46.791910 systemd-logind[1514]: New seat seat0.
Sep 9 04:50:46.795310 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 9 04:50:46.795344 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 9 04:50:46.797616 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 9 04:50:46.797641 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 9 04:50:46.799127 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 9 04:50:46.808059 update_engine[1516]: I20250909 04:50:46.807780 1516 main.cc:92] Flatcar Update Engine starting
Sep 9 04:50:46.811662 systemd[1]: Started update-engine.service - Update Engine.
Sep 9 04:50:46.815328 update_engine[1516]: I20250909 04:50:46.812813 1516 update_check_scheduler.cc:74] Next update check in 2m11s
Sep 9 04:50:46.818294 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 9 04:50:46.831158 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Sep 9 04:50:46.846702 extend-filesystems[1543]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Sep 9 04:50:46.846702 extend-filesystems[1543]: old_desc_blocks = 1, new_desc_blocks = 1
Sep 9 04:50:46.846702 extend-filesystems[1543]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Sep 9 04:50:46.856277 extend-filesystems[1505]: Resized filesystem in /dev/vda9
Sep 9 04:50:46.857017 bash[1562]: Updated "/home/core/.ssh/authorized_keys"
Sep 9 04:50:46.848356 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 9 04:50:46.850426 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 9 04:50:46.854413 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 9 04:50:46.863184 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 9 04:50:46.916230 locksmithd[1549]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 9 04:50:46.953281 containerd[1537]: time="2025-09-09T04:50:46Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 9 04:50:46.955191 containerd[1537]: time="2025-09-09T04:50:46.955125880Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 9 04:50:46.966149 containerd[1537]: time="2025-09-09T04:50:46.966049240Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.56µs"
Sep 9 04:50:46.966149 containerd[1537]: time="2025-09-09T04:50:46.966090760Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 9 04:50:46.966149 containerd[1537]: time="2025-09-09T04:50:46.966116080Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 9 04:50:46.966592 containerd[1537]: time="2025-09-09T04:50:46.966555840Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 9 04:50:46.966621 containerd[1537]: time="2025-09-09T04:50:46.966605080Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 9 04:50:46.966657 containerd[1537]: time="2025-09-09T04:50:46.966641640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 04:50:46.966726 containerd[1537]: time="2025-09-09T04:50:46.966701320Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 04:50:46.966726 containerd[1537]: time="2025-09-09T04:50:46.966722960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 04:50:46.967004 containerd[1537]: time="2025-09-09T04:50:46.966979920Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 04:50:46.967037 containerd[1537]: time="2025-09-09T04:50:46.967005120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 04:50:46.967037 containerd[1537]: time="2025-09-09T04:50:46.967018360Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 04:50:46.967037 containerd[1537]: time="2025-09-09T04:50:46.967031640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 9 04:50:46.967146 containerd[1537]: time="2025-09-09T04:50:46.967108760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 9 04:50:46.967351 containerd[1537]: time="2025-09-09T04:50:46.967327680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 04:50:46.967400 containerd[1537]: time="2025-09-09T04:50:46.967368520Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 04:50:46.967424 containerd[1537]: time="2025-09-09T04:50:46.967402120Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 9 04:50:46.967478 containerd[1537]: time="2025-09-09T04:50:46.967461520Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 9 04:50:46.967769 containerd[1537]: time="2025-09-09T04:50:46.967745120Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 9 04:50:46.967836 containerd[1537]: time="2025-09-09T04:50:46.967814640Z" level=info msg="metadata content store policy set" policy=shared
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971424360Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971483240Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971501400Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971515360Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971527400Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971541560Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971554920Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971566720Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971582680Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971593800Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971602680Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971615800Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971724560Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 9 04:50:46.972145 containerd[1537]: time="2025-09-09T04:50:46.971743520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 9 04:50:46.972408 containerd[1537]: time="2025-09-09T04:50:46.971758120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 9 04:50:46.972408 containerd[1537]: time="2025-09-09T04:50:46.971773840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 9 04:50:46.972408 containerd[1537]: time="2025-09-09T04:50:46.971783640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 9 04:50:46.972408 containerd[1537]: time="2025-09-09T04:50:46.971794560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 9 04:50:46.972408 containerd[1537]: time="2025-09-09T04:50:46.971805400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 9 04:50:46.972408 containerd[1537]: time="2025-09-09T04:50:46.971815120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 9 04:50:46.972408 containerd[1537]: time="2025-09-09T04:50:46.971825880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 9 04:50:46.972408 containerd[1537]: time="2025-09-09T04:50:46.971836000Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 9 04:50:46.972408 containerd[1537]: time="2025-09-09T04:50:46.971845720Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 9 04:50:46.972408 containerd[1537]: time="2025-09-09T04:50:46.972026040Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 9 04:50:46.972408 containerd[1537]: time="2025-09-09T04:50:46.972040960Z" level=info msg="Start snapshots syncer"
Sep 9 04:50:46.972408 containerd[1537]: time="2025-09-09T04:50:46.972071120Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 9 04:50:46.972595 containerd[1537]: time="2025-09-09T04:50:46.972306080Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 9 04:50:46.972595 containerd[1537]: time="2025-09-09T04:50:46.972352920Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 9 04:50:46.972696 containerd[1537]: time="2025-09-09T04:50:46.972432920Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 9 04:50:46.972696 containerd[1537]: time="2025-09-09T04:50:46.972530240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 9 04:50:46.972696 containerd[1537]: time="2025-09-09T04:50:46.972553000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 9 04:50:46.972696 containerd[1537]: time="2025-09-09T04:50:46.972565680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 9 04:50:46.972696 containerd[1537]: time="2025-09-09T04:50:46.972576560Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 9 04:50:46.972696 containerd[1537]: time="2025-09-09T04:50:46.972587880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 9 04:50:46.972696 containerd[1537]: time="2025-09-09T04:50:46.972598360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 9 04:50:46.972696 containerd[1537]: time="2025-09-09T04:50:46.972608800Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 9 04:50:46.972696 containerd[1537]: time="2025-09-09T04:50:46.972634320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 9 04:50:46.972696 containerd[1537]: time="2025-09-09T04:50:46.972645480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 9 04:50:46.972696 containerd[1537]: time="2025-09-09T04:50:46.972656440Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 9 04:50:46.972696 containerd[1537]: time="2025-09-09T04:50:46.972698040Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 04:50:46.972881 containerd[1537]: time="2025-09-09T04:50:46.972713600Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 04:50:46.972881 containerd[1537]: time="2025-09-09T04:50:46.972723400Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 04:50:46.972881 containerd[1537]: time="2025-09-09T04:50:46.972732120Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 04:50:46.972881 containerd[1537]: time="2025-09-09T04:50:46.972740240Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 9 04:50:46.972881 containerd[1537]: time="2025-09-09T04:50:46.972748840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 9 04:50:46.972881 containerd[1537]: time="2025-09-09T04:50:46.972758600Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 9 04:50:46.972881 containerd[1537]: time="2025-09-09T04:50:46.972834280Z" level=info msg="runtime interface created"
Sep 9 04:50:46.972881 containerd[1537]: time="2025-09-09T04:50:46.972840120Z" level=info msg="created NRI interface"
Sep 9 04:50:46.972881 containerd[1537]: time="2025-09-09T04:50:46.972848120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 9 04:50:46.972881 containerd[1537]: time="2025-09-09T04:50:46.972866920Z" level=info msg="Connect containerd service"
Sep 9 04:50:46.973034 containerd[1537]: time="2025-09-09T04:50:46.972894240Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 9 04:50:46.973625 containerd[1537]: time="2025-09-09T04:50:46.973593480Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 9 04:50:47.053150 containerd[1537]: time="2025-09-09T04:50:47.052973322Z" level=info msg="Start subscribing containerd event"
Sep 9 04:50:47.053150 containerd[1537]: time="2025-09-09T04:50:47.053111376Z" level=info msg="Start recovering state"
Sep 9 04:50:47.053280 containerd[1537]: time="2025-09-09T04:50:47.053255042Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 9 04:50:47.053316 containerd[1537]: time="2025-09-09T04:50:47.053261444Z" level=info msg="Start event monitor"
Sep 9 04:50:47.053341 containerd[1537]: time="2025-09-09T04:50:47.053314529Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 9 04:50:47.053388 containerd[1537]: time="2025-09-09T04:50:47.053366700Z" level=info msg="Start cni network conf syncer for default"
Sep 9 04:50:47.053413 containerd[1537]: time="2025-09-09T04:50:47.053380958Z" level=info msg="Start streaming server"
Sep 9 04:50:47.053413 containerd[1537]: time="2025-09-09T04:50:47.053402616Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 9 04:50:47.053465 containerd[1537]: time="2025-09-09T04:50:47.053411845Z" level=info msg="runtime interface starting up..."
Sep 9 04:50:47.053465 containerd[1537]: time="2025-09-09T04:50:47.053442648Z" level=info msg="starting plugins..."
Sep 9 04:50:47.053465 containerd[1537]: time="2025-09-09T04:50:47.053460773Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 9 04:50:47.053665 containerd[1537]: time="2025-09-09T04:50:47.053644471Z" level=info msg="containerd successfully booted in 0.100729s"
Sep 9 04:50:47.053783 systemd[1]: Started containerd.service - containerd container runtime.
Sep 9 04:50:47.067061 sshd_keygen[1534]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 9 04:50:47.087254 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 9 04:50:47.090002 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 9 04:50:47.106937 systemd[1]: issuegen.service: Deactivated successfully.
Sep 9 04:50:47.107131 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 9 04:50:47.109679 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 9 04:50:47.119202 tar[1527]: linux-arm64/README.md
Sep 9 04:50:47.127873 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 9 04:50:47.130016 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 9 04:50:47.133889 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 9 04:50:47.136794 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Sep 9 04:50:47.138065 systemd[1]: Reached target getty.target - Login Prompts.
Sep 9 04:50:47.957603 systemd-networkd[1443]: eth0: Gained IPv6LL
Sep 9 04:50:47.960374 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 9 04:50:47.961857 systemd[1]: Reached target network-online.target - Network is Online.
Sep 9 04:50:47.965610 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
Sep 9 04:50:47.968135 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:50:47.987452 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 9 04:50:48.013434 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 9 04:50:48.015036 systemd[1]: coreos-metadata.service: Deactivated successfully.
Sep 9 04:50:48.015268 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
Sep 9 04:50:48.018191 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 9 04:50:48.574941 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:50:48.576891 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 9 04:50:48.578698 systemd[1]: Startup finished in 2.002s (kernel) + 5.471s (initrd) + 3.549s (userspace) = 11.024s.
Sep 9 04:50:48.579033 (kubelet)[1635]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:50:48.930394 kubelet[1635]: E0909 04:50:48.930317 1635 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:50:48.932892 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:50:48.933030 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:50:48.933366 systemd[1]: kubelet.service: Consumed 734ms CPU time, 259.4M memory peak.
Sep 9 04:50:52.246395 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 9 04:50:52.247405 systemd[1]: Started sshd@0-10.0.0.32:22-10.0.0.1:58056.service - OpenSSH per-connection server daemon (10.0.0.1:58056).
Sep 9 04:50:52.359861 sshd[1649]: Accepted publickey for core from 10.0.0.1 port 58056 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:50:52.361790 sshd-session[1649]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:50:52.367932 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 9 04:50:52.368812 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 9 04:50:52.373838 systemd-logind[1514]: New session 1 of user core.
Sep 9 04:50:52.395409 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 9 04:50:52.397697 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 9 04:50:52.413189 (systemd)[1654]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 9 04:50:52.415432 systemd-logind[1514]: New session c1 of user core.
Sep 9 04:50:52.527034 systemd[1654]: Queued start job for default target default.target.
Sep 9 04:50:52.546194 systemd[1654]: Created slice app.slice - User Application Slice.
Sep 9 04:50:52.546220 systemd[1654]: Reached target paths.target - Paths.
Sep 9 04:50:52.546257 systemd[1654]: Reached target timers.target - Timers.
Sep 9 04:50:52.547445 systemd[1654]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 9 04:50:52.556953 systemd[1654]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 9 04:50:52.557016 systemd[1654]: Reached target sockets.target - Sockets.
Sep 9 04:50:52.557052 systemd[1654]: Reached target basic.target - Basic System.
Sep 9 04:50:52.557079 systemd[1654]: Reached target default.target - Main User Target.
Sep 9 04:50:52.557104 systemd[1654]: Startup finished in 136ms.
Sep 9 04:50:52.557827 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 9 04:50:52.561056 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 9 04:50:52.618855 systemd[1]: Started sshd@1-10.0.0.32:22-10.0.0.1:58066.service - OpenSSH per-connection server daemon (10.0.0.1:58066).
Sep 9 04:50:52.666358 sshd[1665]: Accepted publickey for core from 10.0.0.1 port 58066 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:50:52.667542 sshd-session[1665]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:50:52.671369 systemd-logind[1514]: New session 2 of user core.
Sep 9 04:50:52.695397 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 9 04:50:52.751015 sshd[1668]: Connection closed by 10.0.0.1 port 58066
Sep 9 04:50:52.752139 sshd-session[1665]: pam_unix(sshd:session): session closed for user core
Sep 9 04:50:52.763081 systemd[1]: sshd@1-10.0.0.32:22-10.0.0.1:58066.service: Deactivated successfully.
Sep 9 04:50:52.764606 systemd[1]: session-2.scope: Deactivated successfully.
Sep 9 04:50:52.766568 systemd-logind[1514]: Session 2 logged out. Waiting for processes to exit.
Sep 9 04:50:52.767514 systemd[1]: Started sshd@2-10.0.0.32:22-10.0.0.1:58070.service - OpenSSH per-connection server daemon (10.0.0.1:58070).
Sep 9 04:50:52.770597 systemd-logind[1514]: Removed session 2.
Sep 9 04:50:52.839087 sshd[1674]: Accepted publickey for core from 10.0.0.1 port 58070 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:50:52.839592 sshd-session[1674]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:50:52.843408 systemd-logind[1514]: New session 3 of user core.
Sep 9 04:50:52.852315 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 9 04:50:52.900103 sshd[1677]: Connection closed by 10.0.0.1 port 58070
Sep 9 04:50:52.900575 sshd-session[1674]: pam_unix(sshd:session): session closed for user core
Sep 9 04:50:52.913998 systemd[1]: sshd@2-10.0.0.32:22-10.0.0.1:58070.service: Deactivated successfully.
Sep 9 04:50:52.915267 systemd[1]: session-3.scope: Deactivated successfully.
Sep 9 04:50:52.915900 systemd-logind[1514]: Session 3 logged out. Waiting for processes to exit.
Sep 9 04:50:52.917773 systemd[1]: Started sshd@3-10.0.0.32:22-10.0.0.1:58082.service - OpenSSH per-connection server daemon (10.0.0.1:58082).
Sep 9 04:50:52.918517 systemd-logind[1514]: Removed session 3.
Sep 9 04:50:52.969825 sshd[1683]: Accepted publickey for core from 10.0.0.1 port 58082 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:50:52.971001 sshd-session[1683]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:50:52.975232 systemd-logind[1514]: New session 4 of user core.
Sep 9 04:50:52.985317 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 9 04:50:53.037174 sshd[1687]: Connection closed by 10.0.0.1 port 58082
Sep 9 04:50:53.037609 sshd-session[1683]: pam_unix(sshd:session): session closed for user core
Sep 9 04:50:53.049031 systemd[1]: sshd@3-10.0.0.32:22-10.0.0.1:58082.service: Deactivated successfully.
Sep 9 04:50:53.051336 systemd[1]: session-4.scope: Deactivated successfully.
Sep 9 04:50:53.053168 systemd-logind[1514]: Session 4 logged out. Waiting for processes to exit.
Sep 9 04:50:53.054119 systemd[1]: Started sshd@4-10.0.0.32:22-10.0.0.1:58086.service - OpenSSH per-connection server daemon (10.0.0.1:58086).
Sep 9 04:50:53.055764 systemd-logind[1514]: Removed session 4.
Sep 9 04:50:53.104288 sshd[1693]: Accepted publickey for core from 10.0.0.1 port 58086 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:50:53.105883 sshd-session[1693]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:50:53.110043 systemd-logind[1514]: New session 5 of user core.
Sep 9 04:50:53.119294 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 9 04:50:53.176306 sudo[1697]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 9 04:50:53.176555 sudo[1697]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:50:53.190059 sudo[1697]: pam_unix(sudo:session): session closed for user root
Sep 9 04:50:53.191864 sshd[1696]: Connection closed by 10.0.0.1 port 58086
Sep 9 04:50:53.191770 sshd-session[1693]: pam_unix(sshd:session): session closed for user core
Sep 9 04:50:53.200073 systemd[1]: sshd@4-10.0.0.32:22-10.0.0.1:58086.service: Deactivated successfully.
Sep 9 04:50:53.203593 systemd[1]: session-5.scope: Deactivated successfully.
Sep 9 04:50:53.204553 systemd-logind[1514]: Session 5 logged out. Waiting for processes to exit.
Sep 9 04:50:53.207658 systemd[1]: Started sshd@5-10.0.0.32:22-10.0.0.1:58092.service - OpenSSH per-connection server daemon (10.0.0.1:58092).
Sep 9 04:50:53.210447 systemd-logind[1514]: Removed session 5.
Sep 9 04:50:53.267574 sshd[1703]: Accepted publickey for core from 10.0.0.1 port 58092 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:50:53.269032 sshd-session[1703]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:50:53.273217 systemd-logind[1514]: New session 6 of user core.
Sep 9 04:50:53.280286 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 9 04:50:53.330701 sudo[1708]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 9 04:50:53.330964 sudo[1708]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:50:53.335379 sudo[1708]: pam_unix(sudo:session): session closed for user root
Sep 9 04:50:53.339498 sudo[1707]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 9 04:50:53.339957 sudo[1707]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:50:53.348000 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 04:50:53.384827 augenrules[1730]: No rules
Sep 9 04:50:53.385453 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 04:50:53.385633 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 04:50:53.386400 sudo[1707]: pam_unix(sudo:session): session closed for user root
Sep 9 04:50:53.387585 sshd[1706]: Connection closed by 10.0.0.1 port 58092
Sep 9 04:50:53.389273 sshd-session[1703]: pam_unix(sshd:session): session closed for user core
Sep 9 04:50:53.400083 systemd[1]: sshd@5-10.0.0.32:22-10.0.0.1:58092.service: Deactivated successfully.
Sep 9 04:50:53.401774 systemd[1]: session-6.scope: Deactivated successfully.
Sep 9 04:50:53.402915 systemd-logind[1514]: Session 6 logged out. Waiting for processes to exit.
Sep 9 04:50:53.405085 systemd[1]: Started sshd@6-10.0.0.32:22-10.0.0.1:58108.service - OpenSSH per-connection server daemon (10.0.0.1:58108).
Sep 9 04:50:53.407069 systemd-logind[1514]: Removed session 6.
Sep 9 04:50:53.458060 sshd[1739]: Accepted publickey for core from 10.0.0.1 port 58108 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:50:53.459992 sshd-session[1739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:50:53.468647 systemd-logind[1514]: New session 7 of user core.
Sep 9 04:50:53.478300 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 9 04:50:53.530075 sudo[1743]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 9 04:50:53.530358 sudo[1743]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:50:53.804912 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 9 04:50:53.828510 (dockerd)[1763]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 9 04:50:54.022184 dockerd[1763]: time="2025-09-09T04:50:54.022110082Z" level=info msg="Starting up"
Sep 9 04:50:54.023325 dockerd[1763]: time="2025-09-09T04:50:54.023289019Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 9 04:50:54.033933 dockerd[1763]: time="2025-09-09T04:50:54.033893197Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 9 04:50:54.133550 dockerd[1763]: time="2025-09-09T04:50:54.133501271Z" level=info msg="Loading containers: start."
Sep 9 04:50:54.142183 kernel: Initializing XFRM netlink socket
Sep 9 04:50:54.331715 systemd-networkd[1443]: docker0: Link UP
Sep 9 04:50:54.335066 dockerd[1763]: time="2025-09-09T04:50:54.335003240Z" level=info msg="Loading containers: done."
Sep 9 04:50:54.376069 dockerd[1763]: time="2025-09-09T04:50:54.376013157Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 9 04:50:54.376247 dockerd[1763]: time="2025-09-09T04:50:54.376107712Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 9 04:50:54.376304 dockerd[1763]: time="2025-09-09T04:50:54.376282891Z" level=info msg="Initializing buildkit"
Sep 9 04:50:54.408967 dockerd[1763]: time="2025-09-09T04:50:54.408861333Z" level=info msg="Completed buildkit initialization"
Sep 9 04:50:54.413843 dockerd[1763]: time="2025-09-09T04:50:54.413799701Z" level=info msg="Daemon has completed initialization"
Sep 9 04:50:54.413979 dockerd[1763]: time="2025-09-09T04:50:54.413871917Z" level=info msg="API listen on /run/docker.sock"
Sep 9 04:50:54.414016 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 9 04:50:55.080693 containerd[1537]: time="2025-09-09T04:50:55.080654265Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\""
Sep 9 04:50:55.665642 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2309747187.mount: Deactivated successfully.
Sep 9 04:50:57.034171 containerd[1537]: time="2025-09-09T04:50:57.034107744Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:57.035094 containerd[1537]: time="2025-09-09T04:50:57.034825722Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=26328359"
Sep 9 04:50:57.035842 containerd[1537]: time="2025-09-09T04:50:57.035804365Z" level=info msg="ImageCreate event name:\"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:57.039227 containerd[1537]: time="2025-09-09T04:50:57.039190859Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:57.041115 containerd[1537]: time="2025-09-09T04:50:57.040629362Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"26325157\" in 1.9599307s"
Sep 9 04:50:57.041115 containerd[1537]: time="2025-09-09T04:50:57.040668926Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\""
Sep 9 04:50:57.041260 containerd[1537]: time="2025-09-09T04:50:57.041235800Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\""
Sep 9 04:50:58.143483 containerd[1537]: time="2025-09-09T04:50:58.143435403Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:58.144362 containerd[1537]: time="2025-09-09T04:50:58.144334615Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=22528554"
Sep 9 04:50:58.144810 containerd[1537]: time="2025-09-09T04:50:58.144789286Z" level=info msg="ImageCreate event name:\"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:58.147762 containerd[1537]: time="2025-09-09T04:50:58.147722392Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:58.149528 containerd[1537]: time="2025-09-09T04:50:58.149409016Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"24065666\" in 1.108073286s"
Sep 9 04:50:58.149528 containerd[1537]: time="2025-09-09T04:50:58.149441385Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\""
Sep 9 04:50:58.149916 containerd[1537]: time="2025-09-09T04:50:58.149893999Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\""
Sep 9 04:50:59.122937 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 9 04:50:59.124794 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:50:59.254457 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:50:59.259036 (kubelet)[2052]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:50:59.316530 kubelet[2052]: E0909 04:50:59.316477 2052 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:50:59.319975 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:50:59.320118 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:50:59.321266 systemd[1]: kubelet.service: Consumed 151ms CPU time, 107.3M memory peak.
Sep 9 04:50:59.618786 containerd[1537]: time="2025-09-09T04:50:59.618668768Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:59.620201 containerd[1537]: time="2025-09-09T04:50:59.620156922Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=17483529"
Sep 9 04:50:59.621308 containerd[1537]: time="2025-09-09T04:50:59.621262353Z" level=info msg="ImageCreate event name:\"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:59.623464 containerd[1537]: time="2025-09-09T04:50:59.623432900Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:59.625286 containerd[1537]: time="2025-09-09T04:50:59.625241124Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"19020659\" in 1.475315695s"
Sep 9 04:50:59.625322 containerd[1537]: time="2025-09-09T04:50:59.625284343Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\""
Sep 9 04:50:59.625698 containerd[1537]: time="2025-09-09T04:50:59.625667065Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\""
Sep 9 04:51:00.620197 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3698012468.mount: Deactivated successfully.
Sep 9 04:51:00.854177 containerd[1537]: time="2025-09-09T04:51:00.854108635Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:00.854776 containerd[1537]: time="2025-09-09T04:51:00.854730427Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=27376726"
Sep 9 04:51:00.855654 containerd[1537]: time="2025-09-09T04:51:00.855629887Z" level=info msg="ImageCreate event name:\"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:00.857572 containerd[1537]: time="2025-09-09T04:51:00.857529659Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:00.858477 containerd[1537]: time="2025-09-09T04:51:00.858444464Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"27375743\" in 1.232745396s"
Sep 9 04:51:00.858513 containerd[1537]: time="2025-09-09T04:51:00.858478982Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\""
Sep 9 04:51:00.859183 containerd[1537]: time="2025-09-09T04:51:00.859159940Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 9 04:51:01.367817 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount698697073.mount: Deactivated successfully.
Sep 9 04:51:02.409191 containerd[1537]: time="2025-09-09T04:51:02.408857859Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:02.410005 containerd[1537]: time="2025-09-09T04:51:02.409487416Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951624"
Sep 9 04:51:02.410212 containerd[1537]: time="2025-09-09T04:51:02.410188067Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:02.412737 containerd[1537]: time="2025-09-09T04:51:02.412701590Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:02.414702 containerd[1537]: time="2025-09-09T04:51:02.414569954Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.555374306s"
Sep 9 04:51:02.414702 containerd[1537]: time="2025-09-09T04:51:02.414612136Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 9 04:51:02.414993 containerd[1537]: time="2025-09-09T04:51:02.414977019Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 9 04:51:02.830916 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount775828093.mount: Deactivated successfully.
Sep 9 04:51:02.836800 containerd[1537]: time="2025-09-09T04:51:02.836113743Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 04:51:02.836800 containerd[1537]: time="2025-09-09T04:51:02.836590455Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705"
Sep 9 04:51:02.837867 containerd[1537]: time="2025-09-09T04:51:02.837839475Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 04:51:02.839996 containerd[1537]: time="2025-09-09T04:51:02.839948145Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 04:51:02.840855 containerd[1537]: time="2025-09-09T04:51:02.840818008Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 425.757912ms"
Sep 9 04:51:02.840893 containerd[1537]: time="2025-09-09T04:51:02.840854762Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 9 04:51:02.841746 containerd[1537]: time="2025-09-09T04:51:02.841721046Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 9 04:51:03.332472 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2240650451.mount: Deactivated successfully.
Sep 9 04:51:05.338765 containerd[1537]: time="2025-09-09T04:51:05.338710130Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:05.339772 containerd[1537]: time="2025-09-09T04:51:05.339516541Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67943167"
Sep 9 04:51:05.340349 containerd[1537]: time="2025-09-09T04:51:05.340313880Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:05.343008 containerd[1537]: time="2025-09-09T04:51:05.342973484Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:05.345176 containerd[1537]: time="2025-09-09T04:51:05.345082822Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.503328655s"
Sep 9 04:51:05.345176 containerd[1537]: time="2025-09-09T04:51:05.345118067Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\""
Sep 9 04:51:09.372730 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 9 04:51:09.374709 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:51:09.548756 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:51:09.559385 (kubelet)[2211]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:51:09.596610 kubelet[2211]: E0909 04:51:09.596563 2211 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:51:09.599410 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:51:09.599542 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:51:09.600053 systemd[1]: kubelet.service: Consumed 136ms CPU time, 107.2M memory peak.
Sep 9 04:51:10.872277 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:51:10.872636 systemd[1]: kubelet.service: Consumed 136ms CPU time, 107.2M memory peak.
Sep 9 04:51:10.874736 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:51:10.895402 systemd[1]: Reload requested from client PID 2226 ('systemctl') (unit session-7.scope)...
Sep 9 04:51:10.895419 systemd[1]: Reloading...
Sep 9 04:51:10.963165 zram_generator::config[2269]: No configuration found.
Sep 9 04:51:11.139244 systemd[1]: Reloading finished in 243 ms.
Sep 9 04:51:11.191543 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:51:11.193803 systemd[1]: kubelet.service: Deactivated successfully.
Sep 9 04:51:11.195194 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:51:11.195245 systemd[1]: kubelet.service: Consumed 92ms CPU time, 95.2M memory peak.
Sep 9 04:51:11.196630 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:51:11.333765 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:51:11.337521 (kubelet)[2316]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 04:51:11.371357 kubelet[2316]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:51:11.371357 kubelet[2316]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 9 04:51:11.371357 kubelet[2316]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:51:11.371696 kubelet[2316]: I0909 04:51:11.371420 2316 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 04:51:12.363728 kubelet[2316]: I0909 04:51:12.363684 2316 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 9 04:51:12.364421 kubelet[2316]: I0909 04:51:12.363877 2316 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 04:51:12.364489 kubelet[2316]: I0909 04:51:12.364434 2316 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 9 04:51:12.388764 kubelet[2316]: E0909 04:51:12.388716 2316 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.32:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.32:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:51:12.390721 kubelet[2316]: I0909 04:51:12.390673 2316 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 04:51:12.398462 kubelet[2316]: I0909 04:51:12.398384 2316 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 04:51:12.401539 kubelet[2316]: I0909 04:51:12.401512 2316 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 04:51:12.402773 kubelet[2316]: I0909 04:51:12.402731 2316 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 04:51:12.402964 kubelet[2316]: I0909 04:51:12.402779 2316 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 04:51:12.403049 kubelet[2316]: I0909 04:51:12.403040 2316 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 04:51:12.403069 kubelet[2316]: I0909 04:51:12.403054 2316 container_manager_linux.go:304] "Creating device plugin manager"
Sep 9 04:51:12.403288 kubelet[2316]: I0909 04:51:12.403273 2316 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:51:12.406521 kubelet[2316]: I0909 04:51:12.406323 2316 kubelet.go:446] "Attempting to sync node with API server"
Sep 9 04:51:12.406521 kubelet[2316]: I0909 04:51:12.406357 2316 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 04:51:12.406521 kubelet[2316]: I0909 04:51:12.406390 2316 kubelet.go:352] "Adding apiserver pod source"
Sep 9 04:51:12.406521 kubelet[2316]: I0909 04:51:12.406406 2316 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 04:51:12.410250 kubelet[2316]: W0909 04:51:12.410192 2316 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.32:6443: connect: connection refused
Sep 9 04:51:12.410331 kubelet[2316]: E0909 04:51:12.410254 2316 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.32:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:51:12.410514 kubelet[2316]: I0909 04:51:12.410490 2316 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 04:51:12.410699 kubelet[2316]: W0909 04:51:12.410652 2316 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.32:6443: connect: connection refused
Sep 9 04:51:12.410748 kubelet[2316]: E0909 04:51:12.410710 2316 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.32:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:51:12.411211 kubelet[2316]: I0909 04:51:12.411189 2316 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 04:51:12.411371 kubelet[2316]: W0909 04:51:12.411354 2316 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 04:51:12.412374 kubelet[2316]: I0909 04:51:12.412352 2316 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 9 04:51:12.412549 kubelet[2316]: I0909 04:51:12.412533 2316 server.go:1287] "Started kubelet"
Sep 9 04:51:12.413638 kubelet[2316]: I0909 04:51:12.413581 2316 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 04:51:12.413929 kubelet[2316]: I0909 04:51:12.413882 2316 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 04:51:12.414176 kubelet[2316]: I0909 04:51:12.414155 2316 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 04:51:12.415897 kubelet[2316]: I0909 04:51:12.415864 2316 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 04:51:12.417360 kubelet[2316]: I0909 04:51:12.417333 2316 server.go:479] "Adding debug handlers to kubelet server"
Sep 9 04:51:12.418199 kubelet[2316]: I0909 04:51:12.418169 2316 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 04:51:12.419212 kubelet[2316]: E0909 04:51:12.418923 2316 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.32:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.32:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1863840d55bac9d3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-09 04:51:12.412371411 +0000 UTC m=+1.072019786,LastTimestamp:2025-09-09 04:51:12.412371411 +0000 UTC m=+1.072019786,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 9 04:51:12.420637 kubelet[2316]: I0909 04:51:12.419304 2316 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 9 04:51:12.420637 kubelet[2316]: E0909 04:51:12.419481 2316 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 04:51:12.420637 kubelet[2316]: I0909 04:51:12.420329 2316 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 04:51:12.420637 kubelet[2316]: I0909 04:51:12.420349 2316 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 9 04:51:12.420637 kubelet[2316]: W0909 04:51:12.420442 2316 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.32:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.32:6443: connect: connection refused
Sep 9 04:51:12.420637 kubelet[2316]: E0909 04:51:12.420487 2316 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.32:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.32:6443: connect: connection refused" logger="UnhandledError"
Sep 9
04:51:12.420909 kubelet[2316]: E0909 04:51:12.420845 2316 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.32:6443: connect: connection refused" interval="200ms" Sep 9 04:51:12.421091 kubelet[2316]: I0909 04:51:12.421070 2316 factory.go:221] Registration of the systemd container factory successfully Sep 9 04:51:12.421272 kubelet[2316]: I0909 04:51:12.421253 2316 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 04:51:12.422194 kubelet[2316]: E0909 04:51:12.422166 2316 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 04:51:12.423162 kubelet[2316]: I0909 04:51:12.422417 2316 factory.go:221] Registration of the containerd container factory successfully Sep 9 04:51:12.431032 kubelet[2316]: I0909 04:51:12.430904 2316 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 04:51:12.432113 kubelet[2316]: I0909 04:51:12.432075 2316 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 9 04:51:12.432242 kubelet[2316]: I0909 04:51:12.432231 2316 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 9 04:51:12.432310 kubelet[2316]: I0909 04:51:12.432298 2316 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 9 04:51:12.432362 kubelet[2316]: I0909 04:51:12.432353 2316 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 04:51:12.432453 kubelet[2316]: E0909 04:51:12.432436 2316 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 04:51:12.435238 kubelet[2316]: W0909 04:51:12.434931 2316 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.32:6443: connect: connection refused Sep 9 04:51:12.435995 kubelet[2316]: E0909 04:51:12.435259 2316 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.32:6443: connect: connection refused" logger="UnhandledError" Sep 9 04:51:12.435995 kubelet[2316]: I0909 04:51:12.435546 2316 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 04:51:12.435995 kubelet[2316]: I0909 04:51:12.435559 2316 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 04:51:12.435995 kubelet[2316]: I0909 04:51:12.435583 2316 state_mem.go:36] "Initialized new in-memory state store" Sep 9 04:51:12.520581 kubelet[2316]: E0909 04:51:12.520539 2316 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 04:51:12.527340 kubelet[2316]: I0909 04:51:12.527308 2316 policy_none.go:49] "None policy: Start" Sep 9 04:51:12.527387 kubelet[2316]: I0909 04:51:12.527350 2316 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 04:51:12.527387 kubelet[2316]: I0909 04:51:12.527364 2316 state_mem.go:35] "Initializing new in-memory state store" Sep 9 04:51:12.532338 systemd[1]: Created slice kubepods.slice - 
libcontainer container kubepods.slice. Sep 9 04:51:12.532608 kubelet[2316]: E0909 04:51:12.532577 2316 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 9 04:51:12.554012 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 9 04:51:12.556717 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 9 04:51:12.573945 kubelet[2316]: I0909 04:51:12.573918 2316 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 04:51:12.574183 kubelet[2316]: I0909 04:51:12.574161 2316 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 04:51:12.574228 kubelet[2316]: I0909 04:51:12.574181 2316 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 04:51:12.574461 kubelet[2316]: I0909 04:51:12.574434 2316 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 04:51:12.575482 kubelet[2316]: E0909 04:51:12.575462 2316 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 9 04:51:12.575554 kubelet[2316]: E0909 04:51:12.575502 2316 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 9 04:51:12.621833 kubelet[2316]: E0909 04:51:12.621774 2316 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.32:6443: connect: connection refused" interval="400ms" Sep 9 04:51:12.676962 kubelet[2316]: I0909 04:51:12.676927 2316 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 04:51:12.677414 kubelet[2316]: E0909 04:51:12.677382 2316 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.32:6443/api/v1/nodes\": dial tcp 10.0.0.32:6443: connect: connection refused" node="localhost" Sep 9 04:51:12.740448 systemd[1]: Created slice kubepods-burstable-poded845e5b3717b5b9e4780b90bf5050ac.slice - libcontainer container kubepods-burstable-poded845e5b3717b5b9e4780b90bf5050ac.slice. Sep 9 04:51:12.756407 kubelet[2316]: E0909 04:51:12.756385 2316 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 04:51:12.759665 systemd[1]: Created slice kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice - libcontainer container kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice. Sep 9 04:51:12.777772 kubelet[2316]: E0909 04:51:12.777744 2316 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 04:51:12.781329 systemd[1]: Created slice kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice - libcontainer container kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice. 
Sep 9 04:51:12.783272 kubelet[2316]: E0909 04:51:12.783091 2316 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 04:51:12.822539 kubelet[2316]: I0909 04:51:12.822503 2316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 04:51:12.822660 kubelet[2316]: I0909 04:51:12.822644 2316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ed845e5b3717b5b9e4780b90bf5050ac-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"ed845e5b3717b5b9e4780b90bf5050ac\") " pod="kube-system/kube-apiserver-localhost" Sep 9 04:51:12.822754 kubelet[2316]: I0909 04:51:12.822741 2316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ed845e5b3717b5b9e4780b90bf5050ac-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"ed845e5b3717b5b9e4780b90bf5050ac\") " pod="kube-system/kube-apiserver-localhost" Sep 9 04:51:12.822825 kubelet[2316]: I0909 04:51:12.822808 2316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ed845e5b3717b5b9e4780b90bf5050ac-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"ed845e5b3717b5b9e4780b90bf5050ac\") " pod="kube-system/kube-apiserver-localhost" Sep 9 04:51:12.822985 kubelet[2316]: I0909 04:51:12.822868 2316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 04:51:12.822985 kubelet[2316]: I0909 04:51:12.822886 2316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 04:51:12.822985 kubelet[2316]: I0909 04:51:12.822903 2316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 04:51:12.822985 kubelet[2316]: I0909 04:51:12.822919 2316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 04:51:12.822985 kubelet[2316]: I0909 04:51:12.822934 2316 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost" Sep 9 04:51:12.879451 kubelet[2316]: I0909 04:51:12.879342 2316 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 04:51:12.879697 kubelet[2316]: 
E0909 04:51:12.879650 2316 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.32:6443/api/v1/nodes\": dial tcp 10.0.0.32:6443: connect: connection refused" node="localhost" Sep 9 04:51:13.022716 kubelet[2316]: E0909 04:51:13.022665 2316 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.32:6443: connect: connection refused" interval="800ms" Sep 9 04:51:13.057754 containerd[1537]: time="2025-09-09T04:51:13.057706093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:ed845e5b3717b5b9e4780b90bf5050ac,Namespace:kube-system,Attempt:0,}" Sep 9 04:51:13.074847 containerd[1537]: time="2025-09-09T04:51:13.074804755Z" level=info msg="connecting to shim d755e0163a2473e6cdea1acc3d26ecac508dfbb23591f9732111e2e0b54a64ed" address="unix:///run/containerd/s/1b5d4f3019ddb118311599c4a22f818f9ca2ea25be782ce1574761681f9b5f7e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:13.081236 containerd[1537]: time="2025-09-09T04:51:13.080940546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,}" Sep 9 04:51:13.084754 containerd[1537]: time="2025-09-09T04:51:13.084717586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,}" Sep 9 04:51:13.101335 systemd[1]: Started cri-containerd-d755e0163a2473e6cdea1acc3d26ecac508dfbb23591f9732111e2e0b54a64ed.scope - libcontainer container d755e0163a2473e6cdea1acc3d26ecac508dfbb23591f9732111e2e0b54a64ed. 
Sep 9 04:51:13.111168 containerd[1537]: time="2025-09-09T04:51:13.110531060Z" level=info msg="connecting to shim ad335dd75a4ff4efa3bfac03dadc981dfed85ab01f8b25de68a4aa057830d5dc" address="unix:///run/containerd/s/ded00dcb2664f1d02a1c7e3394248d9057cf66973806d65e8175b608556e874a" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:13.114097 containerd[1537]: time="2025-09-09T04:51:13.114042777Z" level=info msg="connecting to shim f7bd7d6e06a4c19cd4e6ce440c7a67bb8dfac4d9d1893aa991dea017a98ec76a" address="unix:///run/containerd/s/08822198ebfb8d4528ada4e504aa1b63b64c6f515c46c2c64fb31807c074b2c1" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:13.142319 systemd[1]: Started cri-containerd-ad335dd75a4ff4efa3bfac03dadc981dfed85ab01f8b25de68a4aa057830d5dc.scope - libcontainer container ad335dd75a4ff4efa3bfac03dadc981dfed85ab01f8b25de68a4aa057830d5dc. Sep 9 04:51:13.145630 systemd[1]: Started cri-containerd-f7bd7d6e06a4c19cd4e6ce440c7a67bb8dfac4d9d1893aa991dea017a98ec76a.scope - libcontainer container f7bd7d6e06a4c19cd4e6ce440c7a67bb8dfac4d9d1893aa991dea017a98ec76a. 
Sep 9 04:51:13.155191 containerd[1537]: time="2025-09-09T04:51:13.154712222Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:ed845e5b3717b5b9e4780b90bf5050ac,Namespace:kube-system,Attempt:0,} returns sandbox id \"d755e0163a2473e6cdea1acc3d26ecac508dfbb23591f9732111e2e0b54a64ed\"" Sep 9 04:51:13.159589 containerd[1537]: time="2025-09-09T04:51:13.159187031Z" level=info msg="CreateContainer within sandbox \"d755e0163a2473e6cdea1acc3d26ecac508dfbb23591f9732111e2e0b54a64ed\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 04:51:13.167998 containerd[1537]: time="2025-09-09T04:51:13.167969045Z" level=info msg="Container fda6b5b412ffac5744f95ce3a3a830217d31db43d123848bee520afd9e572ca4: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:13.181612 containerd[1537]: time="2025-09-09T04:51:13.181564481Z" level=info msg="CreateContainer within sandbox \"d755e0163a2473e6cdea1acc3d26ecac508dfbb23591f9732111e2e0b54a64ed\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"fda6b5b412ffac5744f95ce3a3a830217d31db43d123848bee520afd9e572ca4\"" Sep 9 04:51:13.182227 containerd[1537]: time="2025-09-09T04:51:13.182190804Z" level=info msg="StartContainer for \"fda6b5b412ffac5744f95ce3a3a830217d31db43d123848bee520afd9e572ca4\"" Sep 9 04:51:13.183696 containerd[1537]: time="2025-09-09T04:51:13.183634281Z" level=info msg="connecting to shim fda6b5b412ffac5744f95ce3a3a830217d31db43d123848bee520afd9e572ca4" address="unix:///run/containerd/s/1b5d4f3019ddb118311599c4a22f818f9ca2ea25be782ce1574761681f9b5f7e" protocol=ttrpc version=3 Sep 9 04:51:13.190587 containerd[1537]: time="2025-09-09T04:51:13.190548782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,} returns sandbox id \"ad335dd75a4ff4efa3bfac03dadc981dfed85ab01f8b25de68a4aa057830d5dc\"" Sep 9 04:51:13.193074 containerd[1537]: 
time="2025-09-09T04:51:13.193042739Z" level=info msg="CreateContainer within sandbox \"ad335dd75a4ff4efa3bfac03dadc981dfed85ab01f8b25de68a4aa057830d5dc\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 04:51:13.194072 containerd[1537]: time="2025-09-09T04:51:13.194023253Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,} returns sandbox id \"f7bd7d6e06a4c19cd4e6ce440c7a67bb8dfac4d9d1893aa991dea017a98ec76a\"" Sep 9 04:51:13.199764 containerd[1537]: time="2025-09-09T04:51:13.199600044Z" level=info msg="CreateContainer within sandbox \"f7bd7d6e06a4c19cd4e6ce440c7a67bb8dfac4d9d1893aa991dea017a98ec76a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 04:51:13.207324 systemd[1]: Started cri-containerd-fda6b5b412ffac5744f95ce3a3a830217d31db43d123848bee520afd9e572ca4.scope - libcontainer container fda6b5b412ffac5744f95ce3a3a830217d31db43d123848bee520afd9e572ca4. 
Sep 9 04:51:13.211147 containerd[1537]: time="2025-09-09T04:51:13.211090636Z" level=info msg="Container ff23252a35a627ae57cf25650037e0211a99f42d9ff849dfa5544044d5dc2e97: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:13.217746 containerd[1537]: time="2025-09-09T04:51:13.217702368Z" level=info msg="Container 7b3e937d85ed7b38f1984410fe225548b8d51e51a7be15b1df5c7d29a84689a7: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:13.230341 containerd[1537]: time="2025-09-09T04:51:13.230293861Z" level=info msg="CreateContainer within sandbox \"ad335dd75a4ff4efa3bfac03dadc981dfed85ab01f8b25de68a4aa057830d5dc\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"ff23252a35a627ae57cf25650037e0211a99f42d9ff849dfa5544044d5dc2e97\"" Sep 9 04:51:13.230999 containerd[1537]: time="2025-09-09T04:51:13.230966520Z" level=info msg="StartContainer for \"ff23252a35a627ae57cf25650037e0211a99f42d9ff849dfa5544044d5dc2e97\"" Sep 9 04:51:13.232450 containerd[1537]: time="2025-09-09T04:51:13.232416005Z" level=info msg="CreateContainer within sandbox \"f7bd7d6e06a4c19cd4e6ce440c7a67bb8dfac4d9d1893aa991dea017a98ec76a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"7b3e937d85ed7b38f1984410fe225548b8d51e51a7be15b1df5c7d29a84689a7\"" Sep 9 04:51:13.233943 containerd[1537]: time="2025-09-09T04:51:13.233901174Z" level=info msg="connecting to shim ff23252a35a627ae57cf25650037e0211a99f42d9ff849dfa5544044d5dc2e97" address="unix:///run/containerd/s/ded00dcb2664f1d02a1c7e3394248d9057cf66973806d65e8175b608556e874a" protocol=ttrpc version=3 Sep 9 04:51:13.233943 containerd[1537]: time="2025-09-09T04:51:13.233929048Z" level=info msg="StartContainer for \"7b3e937d85ed7b38f1984410fe225548b8d51e51a7be15b1df5c7d29a84689a7\"" Sep 9 04:51:13.236230 containerd[1537]: time="2025-09-09T04:51:13.236104737Z" level=info msg="connecting to shim 7b3e937d85ed7b38f1984410fe225548b8d51e51a7be15b1df5c7d29a84689a7" 
address="unix:///run/containerd/s/08822198ebfb8d4528ada4e504aa1b63b64c6f515c46c2c64fb31807c074b2c1" protocol=ttrpc version=3 Sep 9 04:51:13.249303 containerd[1537]: time="2025-09-09T04:51:13.249248263Z" level=info msg="StartContainer for \"fda6b5b412ffac5744f95ce3a3a830217d31db43d123848bee520afd9e572ca4\" returns successfully" Sep 9 04:51:13.258321 systemd[1]: Started cri-containerd-7b3e937d85ed7b38f1984410fe225548b8d51e51a7be15b1df5c7d29a84689a7.scope - libcontainer container 7b3e937d85ed7b38f1984410fe225548b8d51e51a7be15b1df5c7d29a84689a7. Sep 9 04:51:13.262719 systemd[1]: Started cri-containerd-ff23252a35a627ae57cf25650037e0211a99f42d9ff849dfa5544044d5dc2e97.scope - libcontainer container ff23252a35a627ae57cf25650037e0211a99f42d9ff849dfa5544044d5dc2e97. Sep 9 04:51:13.274714 kubelet[2316]: W0909 04:51:13.274610 2316 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.32:6443: connect: connection refused Sep 9 04:51:13.274714 kubelet[2316]: E0909 04:51:13.274682 2316 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.32:6443: connect: connection refused" logger="UnhandledError" Sep 9 04:51:13.281945 kubelet[2316]: I0909 04:51:13.281914 2316 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 04:51:13.282352 kubelet[2316]: E0909 04:51:13.282315 2316 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.32:6443/api/v1/nodes\": dial tcp 10.0.0.32:6443: connect: connection refused" node="localhost" Sep 9 04:51:13.306168 kubelet[2316]: W0909 04:51:13.304867 2316 reflector.go:569] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.32:6443: connect: connection refused Sep 9 04:51:13.306168 kubelet[2316]: E0909 04:51:13.304944 2316 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.32:6443: connect: connection refused" logger="UnhandledError" Sep 9 04:51:13.312366 containerd[1537]: time="2025-09-09T04:51:13.312326396Z" level=info msg="StartContainer for \"7b3e937d85ed7b38f1984410fe225548b8d51e51a7be15b1df5c7d29a84689a7\" returns successfully" Sep 9 04:51:13.324482 containerd[1537]: time="2025-09-09T04:51:13.324270100Z" level=info msg="StartContainer for \"ff23252a35a627ae57cf25650037e0211a99f42d9ff849dfa5544044d5dc2e97\" returns successfully" Sep 9 04:51:13.443287 kubelet[2316]: E0909 04:51:13.443175 2316 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 04:51:13.445470 kubelet[2316]: E0909 04:51:13.445443 2316 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 04:51:13.447617 kubelet[2316]: E0909 04:51:13.447496 2316 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 04:51:14.084750 kubelet[2316]: I0909 04:51:14.084721 2316 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 04:51:14.450518 kubelet[2316]: E0909 04:51:14.450489 2316 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not 
found" node="localhost" Sep 9 04:51:14.452712 kubelet[2316]: E0909 04:51:14.452679 2316 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 04:51:14.452874 kubelet[2316]: E0909 04:51:14.452727 2316 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 04:51:15.088532 kubelet[2316]: E0909 04:51:15.088486 2316 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 9 04:51:15.217664 kubelet[2316]: I0909 04:51:15.217449 2316 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 9 04:51:15.220943 kubelet[2316]: I0909 04:51:15.220912 2316 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 9 04:51:15.277962 kubelet[2316]: E0909 04:51:15.277791 2316 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 9 04:51:15.278409 kubelet[2316]: I0909 04:51:15.278160 2316 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 9 04:51:15.280654 kubelet[2316]: E0909 04:51:15.280594 2316 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 9 04:51:15.280939 kubelet[2316]: I0909 04:51:15.280794 2316 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 9 04:51:15.283250 kubelet[2316]: E0909 04:51:15.283227 2316 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is 
forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost"
Sep 9 04:51:15.411212 kubelet[2316]: I0909 04:51:15.410839 2316 apiserver.go:52] "Watching apiserver"
Sep 9 04:51:15.420730 kubelet[2316]: I0909 04:51:15.420701 2316 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 9 04:51:15.449597 kubelet[2316]: I0909 04:51:15.449572 2316 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Sep 9 04:51:15.451265 kubelet[2316]: E0909 04:51:15.451235 2316 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost"
Sep 9 04:51:15.967840 kubelet[2316]: I0909 04:51:15.967804 2316 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Sep 9 04:51:15.969903 kubelet[2316]: E0909 04:51:15.969790 2316 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost"
Sep 9 04:51:17.274943 systemd[1]: Reload requested from client PID 2590 ('systemctl') (unit session-7.scope)...
Sep 9 04:51:17.274958 systemd[1]: Reloading...
Sep 9 04:51:17.351206 zram_generator::config[2636]: No configuration found.
Sep 9 04:51:17.513051 systemd[1]: Reloading finished in 237 ms.
Sep 9 04:51:17.544637 kubelet[2316]: I0909 04:51:17.544522 2316 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 04:51:17.544739 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:51:17.566153 systemd[1]: kubelet.service: Deactivated successfully.
Sep 9 04:51:17.568192 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:51:17.568251 systemd[1]: kubelet.service: Consumed 1.432s CPU time, 126.8M memory peak.
Sep 9 04:51:17.569935 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:51:17.725248 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:51:17.729919 (kubelet)[2675]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 04:51:17.775606 kubelet[2675]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:51:17.775606 kubelet[2675]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 9 04:51:17.775606 kubelet[2675]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:51:17.775930 kubelet[2675]: I0909 04:51:17.775657 2675 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 04:51:17.781084 kubelet[2675]: I0909 04:51:17.781052 2675 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 9 04:51:17.781273 kubelet[2675]: I0909 04:51:17.781224 2675 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 04:51:17.781571 kubelet[2675]: I0909 04:51:17.781550 2675 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 9 04:51:17.783188 kubelet[2675]: I0909 04:51:17.783116 2675 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 9 04:51:17.786458 kubelet[2675]: I0909 04:51:17.786419 2675 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 04:51:17.790013 kubelet[2675]: I0909 04:51:17.789958 2675 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 04:51:17.792626 kubelet[2675]: I0909 04:51:17.792610 2675 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 04:51:17.792847 kubelet[2675]: I0909 04:51:17.792812 2675 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 04:51:17.792981 kubelet[2675]: I0909 04:51:17.792836 2675 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 04:51:17.793053 kubelet[2675]: I0909 04:51:17.792990 2675 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 04:51:17.793053 kubelet[2675]: I0909 04:51:17.792999 2675 container_manager_linux.go:304] "Creating device plugin manager"
Sep 9 04:51:17.793053 kubelet[2675]: I0909 04:51:17.793039 2675 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:51:17.793235 kubelet[2675]: I0909 04:51:17.793217 2675 kubelet.go:446] "Attempting to sync node with API server"
Sep 9 04:51:17.793235 kubelet[2675]: I0909 04:51:17.793234 2675 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 04:51:17.793274 kubelet[2675]: I0909 04:51:17.793254 2675 kubelet.go:352] "Adding apiserver pod source"
Sep 9 04:51:17.793363 kubelet[2675]: I0909 04:51:17.793346 2675 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 04:51:17.794010 kubelet[2675]: I0909 04:51:17.793984 2675 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 04:51:17.794585 kubelet[2675]: I0909 04:51:17.794557 2675 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 04:51:17.794998 kubelet[2675]: I0909 04:51:17.794926 2675 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 9 04:51:17.794998 kubelet[2675]: I0909 04:51:17.794960 2675 server.go:1287] "Started kubelet"
Sep 9 04:51:17.797436 kubelet[2675]: I0909 04:51:17.797403 2675 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 04:51:17.798532 kubelet[2675]: I0909 04:51:17.798461 2675 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 04:51:17.799233 kubelet[2675]: I0909 04:51:17.799168 2675 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 04:51:17.799490 kubelet[2675]: I0909 04:51:17.799457 2675 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 04:51:17.799881 kubelet[2675]: I0909 04:51:17.799848 2675 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 04:51:17.799881 kubelet[2675]: I0909 04:51:17.799869 2675 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 9 04:51:17.800167 kubelet[2675]: E0909 04:51:17.800122 2675 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 04:51:17.804154 kubelet[2675]: I0909 04:51:17.801087 2675 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 9 04:51:17.804154 kubelet[2675]: I0909 04:51:17.801230 2675 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 04:51:17.804154 kubelet[2675]: I0909 04:51:17.802980 2675 factory.go:221] Registration of the systemd container factory successfully
Sep 9 04:51:17.804154 kubelet[2675]: I0909 04:51:17.803082 2675 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 04:51:17.806149 kubelet[2675]: I0909 04:51:17.804947 2675 factory.go:221] Registration of the containerd container factory successfully
Sep 9 04:51:17.812264 kubelet[2675]: E0909 04:51:17.811753 2675 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 04:51:17.822941 kubelet[2675]: I0909 04:51:17.822532 2675 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 04:51:17.825663 kubelet[2675]: I0909 04:51:17.825638 2675 server.go:479] "Adding debug handlers to kubelet server"
Sep 9 04:51:17.825752 kubelet[2675]: I0909 04:51:17.825663 2675 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 04:51:17.825752 kubelet[2675]: I0909 04:51:17.825681 2675 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 9 04:51:17.825752 kubelet[2675]: I0909 04:51:17.825699 2675 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 9 04:51:17.825752 kubelet[2675]: I0909 04:51:17.825707 2675 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 9 04:51:17.825752 kubelet[2675]: E0909 04:51:17.825741 2675 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 04:51:17.887149 kubelet[2675]: I0909 04:51:17.887120 2675 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 9 04:51:17.887256 kubelet[2675]: I0909 04:51:17.887164 2675 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 9 04:51:17.887256 kubelet[2675]: I0909 04:51:17.887184 2675 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:51:17.887361 kubelet[2675]: I0909 04:51:17.887343 2675 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 9 04:51:17.887388 kubelet[2675]: I0909 04:51:17.887359 2675 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 9 04:51:17.887388 kubelet[2675]: I0909 04:51:17.887378 2675 policy_none.go:49] "None policy: Start"
Sep 9 04:51:17.887388 kubelet[2675]: I0909 04:51:17.887385 2675 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 9 04:51:17.887452 kubelet[2675]: I0909 04:51:17.887394 2675 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 04:51:17.887498 kubelet[2675]: I0909 04:51:17.887486 2675 state_mem.go:75] "Updated machine memory state"
Sep 9 04:51:17.892646 kubelet[2675]: I0909 04:51:17.892303 2675 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 04:51:17.892646 kubelet[2675]: I0909 04:51:17.892456 2675 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 04:51:17.892646 kubelet[2675]: I0909 04:51:17.892468 2675 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 04:51:17.892757 kubelet[2675]: I0909 04:51:17.892726 2675 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 04:51:17.894778 kubelet[2675]: E0909 04:51:17.894750 2675 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 9 04:51:17.926953 kubelet[2675]: I0909 04:51:17.926922 2675 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Sep 9 04:51:17.927107 kubelet[2675]: I0909 04:51:17.927079 2675 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Sep 9 04:51:17.927177 kubelet[2675]: I0909 04:51:17.927159 2675 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:51:17.994184 kubelet[2675]: I0909 04:51:17.994105 2675 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 9 04:51:18.001823 kubelet[2675]: I0909 04:51:18.001793 2675 kubelet_node_status.go:124] "Node was previously registered" node="localhost"
Sep 9 04:51:18.001930 kubelet[2675]: I0909 04:51:18.001870 2675 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
Sep 9 04:51:18.002713 kubelet[2675]: I0909 04:51:18.002653 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:51:18.002713 kubelet[2675]: I0909 04:51:18.002687 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ed845e5b3717b5b9e4780b90bf5050ac-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"ed845e5b3717b5b9e4780b90bf5050ac\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:51:18.002713 kubelet[2675]: I0909 04:51:18.002709 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ed845e5b3717b5b9e4780b90bf5050ac-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"ed845e5b3717b5b9e4780b90bf5050ac\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:51:18.002998 kubelet[2675]: I0909 04:51:18.002730 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:51:18.002998 kubelet[2675]: I0909 04:51:18.002757 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:51:18.002998 kubelet[2675]: I0909 04:51:18.002772 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ed845e5b3717b5b9e4780b90bf5050ac-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"ed845e5b3717b5b9e4780b90bf5050ac\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:51:18.002998 kubelet[2675]: I0909 04:51:18.002822 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:51:18.002998 kubelet[2675]: I0909 04:51:18.002862 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:51:18.004226 kubelet[2675]: I0909 04:51:18.002884 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost"
Sep 9 04:51:18.793902 kubelet[2675]: I0909 04:51:18.793837 2675 apiserver.go:52] "Watching apiserver"
Sep 9 04:51:18.802318 kubelet[2675]: I0909 04:51:18.802279 2675 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 9 04:51:18.858975 kubelet[2675]: I0909 04:51:18.858779 2675 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Sep 9 04:51:18.859250 kubelet[2675]: I0909 04:51:18.858865 2675 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Sep 9 04:51:18.864247 kubelet[2675]: E0909 04:51:18.864128 2675 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
Sep 9 04:51:18.864566 kubelet[2675]: E0909 04:51:18.864534 2675 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Sep 9 04:51:18.887566 kubelet[2675]: I0909 04:51:18.887387 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.887370549 podStartE2EDuration="1.887370549s" podCreationTimestamp="2025-09-09 04:51:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:51:18.887303301 +0000 UTC m=+1.153464876" watchObservedRunningTime="2025-09-09 04:51:18.887370549 +0000 UTC m=+1.153532124"
Sep 9 04:51:18.887866 kubelet[2675]: I0909 04:51:18.887797 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.887788564 podStartE2EDuration="1.887788564s" podCreationTimestamp="2025-09-09 04:51:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:51:18.881209273 +0000 UTC m=+1.147370848" watchObservedRunningTime="2025-09-09 04:51:18.887788564 +0000 UTC m=+1.153950139"
Sep 9 04:51:18.894679 kubelet[2675]: I0909 04:51:18.894547 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.894538896 podStartE2EDuration="1.894538896s" podCreationTimestamp="2025-09-09 04:51:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:51:18.894473089 +0000 UTC m=+1.160634664" watchObservedRunningTime="2025-09-09 04:51:18.894538896 +0000 UTC m=+1.160700471"
Sep 9 04:51:22.503716 kubelet[2675]: I0909 04:51:22.503665 2675 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 9 04:51:22.504025 containerd[1537]: time="2025-09-09T04:51:22.503943301Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 9 04:51:22.504245 kubelet[2675]: I0909 04:51:22.504216 2675 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 9 04:51:23.503776 systemd[1]: Created slice kubepods-besteffort-pod8e6066a0_2496_4cc0_b397_edb7358bcb98.slice - libcontainer container kubepods-besteffort-pod8e6066a0_2496_4cc0_b397_edb7358bcb98.slice.
Sep 9 04:51:23.539396 kubelet[2675]: I0909 04:51:23.539354 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e6066a0-2496-4cc0-b397-edb7358bcb98-lib-modules\") pod \"kube-proxy-hbz95\" (UID: \"8e6066a0-2496-4cc0-b397-edb7358bcb98\") " pod="kube-system/kube-proxy-hbz95"
Sep 9 04:51:23.539396 kubelet[2675]: I0909 04:51:23.539396 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-684dt\" (UniqueName: \"kubernetes.io/projected/8e6066a0-2496-4cc0-b397-edb7358bcb98-kube-api-access-684dt\") pod \"kube-proxy-hbz95\" (UID: \"8e6066a0-2496-4cc0-b397-edb7358bcb98\") " pod="kube-system/kube-proxy-hbz95"
Sep 9 04:51:23.539715 kubelet[2675]: I0909 04:51:23.539420 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8e6066a0-2496-4cc0-b397-edb7358bcb98-kube-proxy\") pod \"kube-proxy-hbz95\" (UID: \"8e6066a0-2496-4cc0-b397-edb7358bcb98\") " pod="kube-system/kube-proxy-hbz95"
Sep 9 04:51:23.539715 kubelet[2675]: I0909 04:51:23.539434 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8e6066a0-2496-4cc0-b397-edb7358bcb98-xtables-lock\") pod \"kube-proxy-hbz95\" (UID: \"8e6066a0-2496-4cc0-b397-edb7358bcb98\") " pod="kube-system/kube-proxy-hbz95"
Sep 9 04:51:23.580386 systemd[1]: Created slice kubepods-besteffort-pod608da73e_7fb8_4900_9a51_9bda8f5e0819.slice - libcontainer container kubepods-besteffort-pod608da73e_7fb8_4900_9a51_9bda8f5e0819.slice.
Sep 9 04:51:23.640162 kubelet[2675]: I0909 04:51:23.640070 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/608da73e-7fb8-4900-9a51-9bda8f5e0819-var-lib-calico\") pod \"tigera-operator-755d956888-c4bgl\" (UID: \"608da73e-7fb8-4900-9a51-9bda8f5e0819\") " pod="tigera-operator/tigera-operator-755d956888-c4bgl"
Sep 9 04:51:23.640364 kubelet[2675]: I0909 04:51:23.640288 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66p8x\" (UniqueName: \"kubernetes.io/projected/608da73e-7fb8-4900-9a51-9bda8f5e0819-kube-api-access-66p8x\") pod \"tigera-operator-755d956888-c4bgl\" (UID: \"608da73e-7fb8-4900-9a51-9bda8f5e0819\") " pod="tigera-operator/tigera-operator-755d956888-c4bgl"
Sep 9 04:51:23.816745 containerd[1537]: time="2025-09-09T04:51:23.816639880Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hbz95,Uid:8e6066a0-2496-4cc0-b397-edb7358bcb98,Namespace:kube-system,Attempt:0,}"
Sep 9 04:51:23.831307 containerd[1537]: time="2025-09-09T04:51:23.831270934Z" level=info msg="connecting to shim 37484ebad5f5b20de18acfa1ab5d474dfe7c4781a18094a41c07ef3a5832b4c3" address="unix:///run/containerd/s/a0463f69e4a69de716758b4f79b433f95040bb933de2bd7f1758bfd5a7c5fce0" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:51:23.857369 systemd[1]: Started cri-containerd-37484ebad5f5b20de18acfa1ab5d474dfe7c4781a18094a41c07ef3a5832b4c3.scope - libcontainer container 37484ebad5f5b20de18acfa1ab5d474dfe7c4781a18094a41c07ef3a5832b4c3.
Sep 9 04:51:23.876859 containerd[1537]: time="2025-09-09T04:51:23.876821110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-hbz95,Uid:8e6066a0-2496-4cc0-b397-edb7358bcb98,Namespace:kube-system,Attempt:0,} returns sandbox id \"37484ebad5f5b20de18acfa1ab5d474dfe7c4781a18094a41c07ef3a5832b4c3\""
Sep 9 04:51:23.880295 containerd[1537]: time="2025-09-09T04:51:23.880262616Z" level=info msg="CreateContainer within sandbox \"37484ebad5f5b20de18acfa1ab5d474dfe7c4781a18094a41c07ef3a5832b4c3\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 9 04:51:23.882675 containerd[1537]: time="2025-09-09T04:51:23.882634732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-c4bgl,Uid:608da73e-7fb8-4900-9a51-9bda8f5e0819,Namespace:tigera-operator,Attempt:0,}"
Sep 9 04:51:23.898053 containerd[1537]: time="2025-09-09T04:51:23.898017988Z" level=info msg="Container d960b00d64c1f01b2566faefced42927cbf47d264a0de992c8524d14f06da584: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:51:23.900571 containerd[1537]: time="2025-09-09T04:51:23.900540754Z" level=info msg="connecting to shim 6c2192221ec7294554957033aaeffbbbf2658ff95fee55962ac5009bc08456a4" address="unix:///run/containerd/s/4c736db638a25a6d4c6e0f9806ccb4dbc46313fcdd4f1cd287f45cd5c10bcb78" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:51:23.904673 containerd[1537]: time="2025-09-09T04:51:23.904638966Z" level=info msg="CreateContainer within sandbox \"37484ebad5f5b20de18acfa1ab5d474dfe7c4781a18094a41c07ef3a5832b4c3\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d960b00d64c1f01b2566faefced42927cbf47d264a0de992c8524d14f06da584\""
Sep 9 04:51:23.907177 containerd[1537]: time="2025-09-09T04:51:23.905531732Z" level=info msg="StartContainer for \"d960b00d64c1f01b2566faefced42927cbf47d264a0de992c8524d14f06da584\""
Sep 9 04:51:23.907177 containerd[1537]: time="2025-09-09T04:51:23.906849668Z" level=info msg="connecting to shim d960b00d64c1f01b2566faefced42927cbf47d264a0de992c8524d14f06da584" address="unix:///run/containerd/s/a0463f69e4a69de716758b4f79b433f95040bb933de2bd7f1758bfd5a7c5fce0" protocol=ttrpc version=3
Sep 9 04:51:23.921306 systemd[1]: Started cri-containerd-6c2192221ec7294554957033aaeffbbbf2658ff95fee55962ac5009bc08456a4.scope - libcontainer container 6c2192221ec7294554957033aaeffbbbf2658ff95fee55962ac5009bc08456a4.
Sep 9 04:51:23.926598 systemd[1]: Started cri-containerd-d960b00d64c1f01b2566faefced42927cbf47d264a0de992c8524d14f06da584.scope - libcontainer container d960b00d64c1f01b2566faefced42927cbf47d264a0de992c8524d14f06da584.
Sep 9 04:51:23.956766 containerd[1537]: time="2025-09-09T04:51:23.956698014Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-c4bgl,Uid:608da73e-7fb8-4900-9a51-9bda8f5e0819,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"6c2192221ec7294554957033aaeffbbbf2658ff95fee55962ac5009bc08456a4\""
Sep 9 04:51:23.962763 containerd[1537]: time="2025-09-09T04:51:23.962726283Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 9 04:51:23.969416 containerd[1537]: time="2025-09-09T04:51:23.969382322Z" level=info msg="StartContainer for \"d960b00d64c1f01b2566faefced42927cbf47d264a0de992c8524d14f06da584\" returns successfully"
Sep 9 04:51:25.099387 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3179911550.mount: Deactivated successfully.
Sep 9 04:51:25.479029 containerd[1537]: time="2025-09-09T04:51:25.478976566Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:25.479899 containerd[1537]: time="2025-09-09T04:51:25.479467385Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365"
Sep 9 04:51:25.480902 containerd[1537]: time="2025-09-09T04:51:25.480870325Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:25.482771 containerd[1537]: time="2025-09-09T04:51:25.482735108Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:25.484075 containerd[1537]: time="2025-09-09T04:51:25.483938023Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 1.521172557s"
Sep 9 04:51:25.484075 containerd[1537]: time="2025-09-09T04:51:25.483973441Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\""
Sep 9 04:51:25.487012 containerd[1537]: time="2025-09-09T04:51:25.486974384Z" level=info msg="CreateContainer within sandbox \"6c2192221ec7294554957033aaeffbbbf2658ff95fee55962ac5009bc08456a4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 9 04:51:25.495077 containerd[1537]: time="2025-09-09T04:51:25.494498552Z" level=info msg="Container 8faa3bb8ae136a1e9aad63db4affa638a298149f4b1d011388f7ab90e8890749: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:51:25.499836 containerd[1537]: time="2025-09-09T04:51:25.499792224Z" level=info msg="CreateContainer within sandbox \"6c2192221ec7294554957033aaeffbbbf2658ff95fee55962ac5009bc08456a4\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"8faa3bb8ae136a1e9aad63db4affa638a298149f4b1d011388f7ab90e8890749\""
Sep 9 04:51:25.500512 containerd[1537]: time="2025-09-09T04:51:25.500455894Z" level=info msg="StartContainer for \"8faa3bb8ae136a1e9aad63db4affa638a298149f4b1d011388f7ab90e8890749\""
Sep 9 04:51:25.507943 containerd[1537]: time="2025-09-09T04:51:25.507912266Z" level=info msg="connecting to shim 8faa3bb8ae136a1e9aad63db4affa638a298149f4b1d011388f7ab90e8890749" address="unix:///run/containerd/s/4c736db638a25a6d4c6e0f9806ccb4dbc46313fcdd4f1cd287f45cd5c10bcb78" protocol=ttrpc version=3
Sep 9 04:51:25.531306 systemd[1]: Started cri-containerd-8faa3bb8ae136a1e9aad63db4affa638a298149f4b1d011388f7ab90e8890749.scope - libcontainer container 8faa3bb8ae136a1e9aad63db4affa638a298149f4b1d011388f7ab90e8890749.
Sep 9 04:51:25.556173 containerd[1537]: time="2025-09-09T04:51:25.555805404Z" level=info msg="StartContainer for \"8faa3bb8ae136a1e9aad63db4affa638a298149f4b1d011388f7ab90e8890749\" returns successfully"
Sep 9 04:51:25.883634 kubelet[2675]: I0909 04:51:25.883386 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-hbz95" podStartSLOduration=2.883368756 podStartE2EDuration="2.883368756s" podCreationTimestamp="2025-09-09 04:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:51:24.88140758 +0000 UTC m=+7.147569155" watchObservedRunningTime="2025-09-09 04:51:25.883368756 +0000 UTC m=+8.149530331"
Sep 9 04:51:26.868986 kubelet[2675]: I0909 04:51:26.868918 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-c4bgl" podStartSLOduration=2.345105557 podStartE2EDuration="3.8688995s" podCreationTimestamp="2025-09-09 04:51:23 +0000 UTC" firstStartedPulling="2025-09-09 04:51:23.962043481 +0000 UTC m=+6.228205056" lastFinishedPulling="2025-09-09 04:51:25.485837424 +0000 UTC m=+7.751998999" observedRunningTime="2025-09-09 04:51:25.88364102 +0000 UTC m=+8.149802595" watchObservedRunningTime="2025-09-09 04:51:26.8688995 +0000 UTC m=+9.135061075"
Sep 9 04:51:30.767075 sudo[1743]: pam_unix(sudo:session): session closed for user root
Sep 9 04:51:30.771969 sshd[1742]: Connection closed by 10.0.0.1 port 58108
Sep 9 04:51:30.774668 sshd-session[1739]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:30.779612 systemd[1]: sshd@6-10.0.0.32:22-10.0.0.1:58108.service: Deactivated successfully.
Sep 9 04:51:30.784759 systemd[1]: session-7.scope: Deactivated successfully.
Sep 9 04:51:30.785376 systemd[1]: session-7.scope: Consumed 7.211s CPU time, 223M memory peak.
Sep 9 04:51:30.788350 systemd-logind[1514]: Session 7 logged out. Waiting for processes to exit.
Sep 9 04:51:30.790735 systemd-logind[1514]: Removed session 7.
Sep 9 04:51:32.556486 update_engine[1516]: I20250909 04:51:32.555953 1516 update_attempter.cc:509] Updating boot flags...
Sep 9 04:51:35.966440 systemd[1]: Created slice kubepods-besteffort-podef293a02_095f_4fa0_9b38_28b1f61a6cc6.slice - libcontainer container kubepods-besteffort-podef293a02_095f_4fa0_9b38_28b1f61a6cc6.slice.
Sep 9 04:51:36.017521 kubelet[2675]: I0909 04:51:36.017398 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/ef293a02-095f-4fa0-9b38-28b1f61a6cc6-typha-certs\") pod \"calico-typha-5bdbc678d5-89dt4\" (UID: \"ef293a02-095f-4fa0-9b38-28b1f61a6cc6\") " pod="calico-system/calico-typha-5bdbc678d5-89dt4"
Sep 9 04:51:36.017521 kubelet[2675]: I0909 04:51:36.017461 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ck8x\" (UniqueName: \"kubernetes.io/projected/ef293a02-095f-4fa0-9b38-28b1f61a6cc6-kube-api-access-8ck8x\") pod \"calico-typha-5bdbc678d5-89dt4\" (UID: \"ef293a02-095f-4fa0-9b38-28b1f61a6cc6\") " pod="calico-system/calico-typha-5bdbc678d5-89dt4"
Sep 9 04:51:36.017521 kubelet[2675]: I0909 04:51:36.017497 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef293a02-095f-4fa0-9b38-28b1f61a6cc6-tigera-ca-bundle\") pod \"calico-typha-5bdbc678d5-89dt4\" (UID: \"ef293a02-095f-4fa0-9b38-28b1f61a6cc6\") " pod="calico-system/calico-typha-5bdbc678d5-89dt4"
Sep 9 04:51:36.274435 containerd[1537]: time="2025-09-09T04:51:36.274047885Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5bdbc678d5-89dt4,Uid:ef293a02-095f-4fa0-9b38-28b1f61a6cc6,Namespace:calico-system,Attempt:0,}"
Sep 9 04:51:36.278469 systemd[1]: Created slice kubepods-besteffort-poda376ccd4_12af_447b_bc91_50d5fa89b455.slice - libcontainer container kubepods-besteffort-poda376ccd4_12af_447b_bc91_50d5fa89b455.slice.
Sep 9 04:51:36.321209 kubelet[2675]: I0909 04:51:36.321158 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a376ccd4-12af-447b-bc91-50d5fa89b455-xtables-lock\") pod \"calico-node-jjmt7\" (UID: \"a376ccd4-12af-447b-bc91-50d5fa89b455\") " pod="calico-system/calico-node-jjmt7"
Sep 9 04:51:36.321209 kubelet[2675]: I0909 04:51:36.321210 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d566h\" (UniqueName: \"kubernetes.io/projected/a376ccd4-12af-447b-bc91-50d5fa89b455-kube-api-access-d566h\") pod \"calico-node-jjmt7\" (UID: \"a376ccd4-12af-447b-bc91-50d5fa89b455\") " pod="calico-system/calico-node-jjmt7"
Sep 9 04:51:36.322276 kubelet[2675]: I0909 04:51:36.322245 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a376ccd4-12af-447b-bc91-50d5fa89b455-var-lib-calico\") pod \"calico-node-jjmt7\" (UID: \"a376ccd4-12af-447b-bc91-50d5fa89b455\") " pod="calico-system/calico-node-jjmt7"
Sep 9 04:51:36.322376 kubelet[2675]: I0909 04:51:36.322292 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a376ccd4-12af-447b-bc91-50d5fa89b455-tigera-ca-bundle\") pod \"calico-node-jjmt7\" (UID: \"a376ccd4-12af-447b-bc91-50d5fa89b455\") " pod="calico-system/calico-node-jjmt7"
Sep 9 04:51:36.322376 kubelet[2675]: I0909 04:51:36.322312 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a376ccd4-12af-447b-bc91-50d5fa89b455-cni-net-dir\") pod \"calico-node-jjmt7\" (UID: \"a376ccd4-12af-447b-bc91-50d5fa89b455\") " pod="calico-system/calico-node-jjmt7"
Sep 9 04:51:36.322376 kubelet[2675]: I0909 04:51:36.322336 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a376ccd4-12af-447b-bc91-50d5fa89b455-lib-modules\") pod \"calico-node-jjmt7\" (UID: \"a376ccd4-12af-447b-bc91-50d5fa89b455\") " pod="calico-system/calico-node-jjmt7"
Sep 9 04:51:36.322376 kubelet[2675]: I0909 04:51:36.322356 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a376ccd4-12af-447b-bc91-50d5fa89b455-cni-bin-dir\") pod \"calico-node-jjmt7\" (UID: \"a376ccd4-12af-447b-bc91-50d5fa89b455\") " pod="calico-system/calico-node-jjmt7"
Sep 9 04:51:36.322376 kubelet[2675]: I0909 04:51:36.322373 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a376ccd4-12af-447b-bc91-50d5fa89b455-node-certs\") pod \"calico-node-jjmt7\" (UID: \"a376ccd4-12af-447b-bc91-50d5fa89b455\") " pod="calico-system/calico-node-jjmt7"
Sep 9 04:51:36.322478 kubelet[2675]: I0909 04:51:36.322388 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a376ccd4-12af-447b-bc91-50d5fa89b455-var-run-calico\") pod \"calico-node-jjmt7\" (UID: \"a376ccd4-12af-447b-bc91-50d5fa89b455\") " pod="calico-system/calico-node-jjmt7"
Sep 9 04:51:36.322478 kubelet[2675]: I0909 04:51:36.322407 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a376ccd4-12af-447b-bc91-50d5fa89b455-policysync\") pod \"calico-node-jjmt7\" (UID: \"a376ccd4-12af-447b-bc91-50d5fa89b455\") " pod="calico-system/calico-node-jjmt7"
Sep 9 04:51:36.322478 kubelet[2675]: I0909 04:51:36.322424 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a376ccd4-12af-447b-bc91-50d5fa89b455-cni-log-dir\") pod \"calico-node-jjmt7\" (UID: \"a376ccd4-12af-447b-bc91-50d5fa89b455\") " pod="calico-system/calico-node-jjmt7"
Sep 9 04:51:36.322478 kubelet[2675]: I0909 04:51:36.322447 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a376ccd4-12af-447b-bc91-50d5fa89b455-flexvol-driver-host\") pod \"calico-node-jjmt7\" (UID: \"a376ccd4-12af-447b-bc91-50d5fa89b455\") " pod="calico-system/calico-node-jjmt7"
Sep 9 04:51:36.356091 containerd[1537]: time="2025-09-09T04:51:36.354580089Z" level=info msg="connecting to shim 59bd2829b4a1f6865808eb56f4dfec9edaff22aece7368fe39dac92ad9ace14e" address="unix:///run/containerd/s/6d984e0bb4a5b9c3c2b137d569a26b801ae5c21a6f6c71b42d692a0d92a22417" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:51:36.406455 systemd[1]: Started cri-containerd-59bd2829b4a1f6865808eb56f4dfec9edaff22aece7368fe39dac92ad9ace14e.scope - libcontainer container 59bd2829b4a1f6865808eb56f4dfec9edaff22aece7368fe39dac92ad9ace14e.
Sep 9 04:51:36.439513 kubelet[2675]: E0909 04:51:36.439477 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:51:36.439513 kubelet[2675]: W0909 04:51:36.439504 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:51:36.440024 kubelet[2675]: E0909 04:51:36.440003 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 9 04:51:36.441082 kubelet[2675]: E0909 04:51:36.441056 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.441148 kubelet[2675]: W0909 04:51:36.441081 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.441148 kubelet[2675]: E0909 04:51:36.441114 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.470677 containerd[1537]: time="2025-09-09T04:51:36.470633708Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5bdbc678d5-89dt4,Uid:ef293a02-095f-4fa0-9b38-28b1f61a6cc6,Namespace:calico-system,Attempt:0,} returns sandbox id \"59bd2829b4a1f6865808eb56f4dfec9edaff22aece7368fe39dac92ad9ace14e\"" Sep 9 04:51:36.484583 containerd[1537]: time="2025-09-09T04:51:36.484540935Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 04:51:36.514016 kubelet[2675]: E0909 04:51:36.513959 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pz4n4" podUID="093f1e96-6646-4411-bb5b-eecbd26e4d17" Sep 9 04:51:36.515906 kubelet[2675]: E0909 04:51:36.515851 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.515906 kubelet[2675]: W0909 04:51:36.515877 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in 
$PATH, output: "" Sep 9 04:51:36.515906 kubelet[2675]: E0909 04:51:36.515898 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.516236 kubelet[2675]: E0909 04:51:36.516110 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.516236 kubelet[2675]: W0909 04:51:36.516123 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.516236 kubelet[2675]: E0909 04:51:36.516172 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.517251 kubelet[2675]: E0909 04:51:36.517216 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.517251 kubelet[2675]: W0909 04:51:36.517235 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.517251 kubelet[2675]: E0909 04:51:36.517251 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.518075 kubelet[2675]: E0909 04:51:36.518049 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.518075 kubelet[2675]: W0909 04:51:36.518067 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.518075 kubelet[2675]: E0909 04:51:36.518081 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.518321 kubelet[2675]: E0909 04:51:36.518297 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.518321 kubelet[2675]: W0909 04:51:36.518310 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.518321 kubelet[2675]: E0909 04:51:36.518320 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.518786 kubelet[2675]: E0909 04:51:36.518765 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.518786 kubelet[2675]: W0909 04:51:36.518782 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.518993 kubelet[2675]: E0909 04:51:36.518795 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.519490 kubelet[2675]: E0909 04:51:36.519459 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.519490 kubelet[2675]: W0909 04:51:36.519479 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.519490 kubelet[2675]: E0909 04:51:36.519491 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.519748 kubelet[2675]: E0909 04:51:36.519723 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.519748 kubelet[2675]: W0909 04:51:36.519739 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.519836 kubelet[2675]: E0909 04:51:36.519752 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.520010 kubelet[2675]: E0909 04:51:36.519953 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.520010 kubelet[2675]: W0909 04:51:36.519966 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.520010 kubelet[2675]: E0909 04:51:36.519975 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.520274 kubelet[2675]: E0909 04:51:36.520251 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.520274 kubelet[2675]: W0909 04:51:36.520265 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.520274 kubelet[2675]: E0909 04:51:36.520275 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.520480 kubelet[2675]: E0909 04:51:36.520461 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.520480 kubelet[2675]: W0909 04:51:36.520473 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.520545 kubelet[2675]: E0909 04:51:36.520482 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.520785 kubelet[2675]: E0909 04:51:36.520755 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.520785 kubelet[2675]: W0909 04:51:36.520771 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.520785 kubelet[2675]: E0909 04:51:36.520782 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.521084 kubelet[2675]: E0909 04:51:36.521058 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.521084 kubelet[2675]: W0909 04:51:36.521073 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.521084 kubelet[2675]: E0909 04:51:36.521084 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.521481 kubelet[2675]: E0909 04:51:36.521456 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.521481 kubelet[2675]: W0909 04:51:36.521474 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.521481 kubelet[2675]: E0909 04:51:36.521486 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.522332 kubelet[2675]: E0909 04:51:36.522299 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.522752 kubelet[2675]: W0909 04:51:36.522702 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.522752 kubelet[2675]: E0909 04:51:36.522740 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.523377 kubelet[2675]: E0909 04:51:36.523348 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.523377 kubelet[2675]: W0909 04:51:36.523364 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.523377 kubelet[2675]: E0909 04:51:36.523378 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.525384 kubelet[2675]: E0909 04:51:36.524239 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.525384 kubelet[2675]: W0909 04:51:36.524258 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.525384 kubelet[2675]: E0909 04:51:36.524272 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.525935 kubelet[2675]: E0909 04:51:36.525733 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.525935 kubelet[2675]: W0909 04:51:36.525748 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.525935 kubelet[2675]: E0909 04:51:36.525763 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.526695 kubelet[2675]: E0909 04:51:36.526299 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.527046 kubelet[2675]: W0909 04:51:36.526866 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.527046 kubelet[2675]: E0909 04:51:36.526895 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.527795 kubelet[2675]: E0909 04:51:36.527416 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.527934 kubelet[2675]: W0909 04:51:36.527913 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.528001 kubelet[2675]: E0909 04:51:36.527988 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.528732 kubelet[2675]: E0909 04:51:36.528698 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.529204 kubelet[2675]: W0909 04:51:36.528827 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.529204 kubelet[2675]: E0909 04:51:36.528848 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.529204 kubelet[2675]: I0909 04:51:36.528878 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/093f1e96-6646-4411-bb5b-eecbd26e4d17-socket-dir\") pod \"csi-node-driver-pz4n4\" (UID: \"093f1e96-6646-4411-bb5b-eecbd26e4d17\") " pod="calico-system/csi-node-driver-pz4n4" Sep 9 04:51:36.529998 kubelet[2675]: E0909 04:51:36.529520 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.529998 kubelet[2675]: W0909 04:51:36.529540 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.529998 kubelet[2675]: E0909 04:51:36.529560 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.529998 kubelet[2675]: I0909 04:51:36.529580 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/093f1e96-6646-4411-bb5b-eecbd26e4d17-varrun\") pod \"csi-node-driver-pz4n4\" (UID: \"093f1e96-6646-4411-bb5b-eecbd26e4d17\") " pod="calico-system/csi-node-driver-pz4n4" Sep 9 04:51:36.530243 kubelet[2675]: E0909 04:51:36.530220 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.530313 kubelet[2675]: W0909 04:51:36.530300 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.530400 kubelet[2675]: E0909 04:51:36.530379 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.530825 kubelet[2675]: E0909 04:51:36.530597 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.530825 kubelet[2675]: W0909 04:51:36.530819 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.530910 kubelet[2675]: E0909 04:51:36.530836 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.530959 kubelet[2675]: I0909 04:51:36.530797 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhzqs\" (UniqueName: \"kubernetes.io/projected/093f1e96-6646-4411-bb5b-eecbd26e4d17-kube-api-access-lhzqs\") pod \"csi-node-driver-pz4n4\" (UID: \"093f1e96-6646-4411-bb5b-eecbd26e4d17\") " pod="calico-system/csi-node-driver-pz4n4" Sep 9 04:51:36.531578 kubelet[2675]: E0909 04:51:36.531542 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.531578 kubelet[2675]: W0909 04:51:36.531571 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.532259 kubelet[2675]: E0909 04:51:36.532233 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.532364 kubelet[2675]: E0909 04:51:36.532350 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.532420 kubelet[2675]: W0909 04:51:36.532362 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.532420 kubelet[2675]: E0909 04:51:36.532378 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.532560 kubelet[2675]: E0909 04:51:36.532549 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.532560 kubelet[2675]: W0909 04:51:36.532559 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.532628 kubelet[2675]: E0909 04:51:36.532602 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.532753 kubelet[2675]: E0909 04:51:36.532742 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.532785 kubelet[2675]: W0909 04:51:36.532755 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.532854 kubelet[2675]: E0909 04:51:36.532837 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.532918 kubelet[2675]: E0909 04:51:36.532892 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.532971 kubelet[2675]: W0909 04:51:36.532959 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.533025 kubelet[2675]: E0909 04:51:36.533015 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.533086 kubelet[2675]: I0909 04:51:36.533075 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/093f1e96-6646-4411-bb5b-eecbd26e4d17-registration-dir\") pod \"csi-node-driver-pz4n4\" (UID: \"093f1e96-6646-4411-bb5b-eecbd26e4d17\") " pod="calico-system/csi-node-driver-pz4n4" Sep 9 04:51:36.533208 kubelet[2675]: E0909 04:51:36.533194 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.533208 kubelet[2675]: W0909 04:51:36.533206 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.533284 kubelet[2675]: E0909 04:51:36.533217 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.534292 kubelet[2675]: E0909 04:51:36.534265 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.534292 kubelet[2675]: W0909 04:51:36.534285 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.534420 kubelet[2675]: E0909 04:51:36.534304 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.534420 kubelet[2675]: I0909 04:51:36.534324 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/093f1e96-6646-4411-bb5b-eecbd26e4d17-kubelet-dir\") pod \"csi-node-driver-pz4n4\" (UID: \"093f1e96-6646-4411-bb5b-eecbd26e4d17\") " pod="calico-system/csi-node-driver-pz4n4" Sep 9 04:51:36.535719 kubelet[2675]: E0909 04:51:36.535688 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.535719 kubelet[2675]: W0909 04:51:36.535714 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.535837 kubelet[2675]: E0909 04:51:36.535743 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.536295 kubelet[2675]: E0909 04:51:36.536268 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.536295 kubelet[2675]: W0909 04:51:36.536284 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.536775 kubelet[2675]: E0909 04:51:36.536744 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.537299 kubelet[2675]: E0909 04:51:36.537272 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.537299 kubelet[2675]: W0909 04:51:36.537292 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.537416 kubelet[2675]: E0909 04:51:36.537307 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.538031 kubelet[2675]: E0909 04:51:36.537990 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.538031 kubelet[2675]: W0909 04:51:36.538010 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.538031 kubelet[2675]: E0909 04:51:36.538023 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.583702 containerd[1537]: time="2025-09-09T04:51:36.583658015Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jjmt7,Uid:a376ccd4-12af-447b-bc91-50d5fa89b455,Namespace:calico-system,Attempt:0,}" Sep 9 04:51:36.635828 kubelet[2675]: E0909 04:51:36.635765 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.635828 kubelet[2675]: W0909 04:51:36.635797 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.635828 kubelet[2675]: E0909 04:51:36.635817 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.641272 containerd[1537]: time="2025-09-09T04:51:36.639311730Z" level=info msg="connecting to shim 65f2e2b0ae6c4048d60b34fb7a8515627f38611a0a64540afc3ec13aea74dce6" address="unix:///run/containerd/s/c8dd0f4cd34e4ed93921efff1f8f0b30717fa69248bc7b466bf799b5dad39901" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:36.641307 kubelet[2675]: E0909 04:51:36.639919 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.641307 kubelet[2675]: W0909 04:51:36.639929 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.641307 kubelet[2675]: E0909 04:51:36.639967 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.641307 kubelet[2675]: E0909 04:51:36.640113 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.641307 kubelet[2675]: W0909 04:51:36.640121 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.641307 kubelet[2675]: E0909 04:51:36.640174 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:36.658720 kubelet[2675]: E0909 04:51:36.658693 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:36.658720 kubelet[2675]: W0909 04:51:36.658711 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:36.658720 kubelet[2675]: E0909 04:51:36.658729 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:36.673371 systemd[1]: Started cri-containerd-65f2e2b0ae6c4048d60b34fb7a8515627f38611a0a64540afc3ec13aea74dce6.scope - libcontainer container 65f2e2b0ae6c4048d60b34fb7a8515627f38611a0a64540afc3ec13aea74dce6. Sep 9 04:51:36.708710 containerd[1537]: time="2025-09-09T04:51:36.708619515Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jjmt7,Uid:a376ccd4-12af-447b-bc91-50d5fa89b455,Namespace:calico-system,Attempt:0,} returns sandbox id \"65f2e2b0ae6c4048d60b34fb7a8515627f38611a0a64540afc3ec13aea74dce6\"" Sep 9 04:51:37.430893 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount353440125.mount: Deactivated successfully. 
Sep 9 04:51:37.764127 containerd[1537]: time="2025-09-09T04:51:37.763400711Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:37.764127 containerd[1537]: time="2025-09-09T04:51:37.763888372Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 9 04:51:37.764801 containerd[1537]: time="2025-09-09T04:51:37.764781268Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:37.766605 containerd[1537]: time="2025-09-09T04:51:37.766562420Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:37.767697 containerd[1537]: time="2025-09-09T04:51:37.767563108Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.282788903s" Sep 9 04:51:37.767697 containerd[1537]: time="2025-09-09T04:51:37.767600718Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 9 04:51:37.769360 containerd[1537]: time="2025-09-09T04:51:37.769318172Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 04:51:37.783327 containerd[1537]: time="2025-09-09T04:51:37.783191159Z" level=info msg="CreateContainer within sandbox \"59bd2829b4a1f6865808eb56f4dfec9edaff22aece7368fe39dac92ad9ace14e\" for container 
&ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 04:51:37.791377 containerd[1537]: time="2025-09-09T04:51:37.791332778Z" level=info msg="Container 475613fe78bb37c438e1f6ce6e761d26fe22c690314aea903914600df3e84890: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:37.815313 containerd[1537]: time="2025-09-09T04:51:37.815262135Z" level=info msg="CreateContainer within sandbox \"59bd2829b4a1f6865808eb56f4dfec9edaff22aece7368fe39dac92ad9ace14e\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"475613fe78bb37c438e1f6ce6e761d26fe22c690314aea903914600df3e84890\"" Sep 9 04:51:37.816116 containerd[1537]: time="2025-09-09T04:51:37.816080530Z" level=info msg="StartContainer for \"475613fe78bb37c438e1f6ce6e761d26fe22c690314aea903914600df3e84890\"" Sep 9 04:51:37.819150 containerd[1537]: time="2025-09-09T04:51:37.819103559Z" level=info msg="connecting to shim 475613fe78bb37c438e1f6ce6e761d26fe22c690314aea903914600df3e84890" address="unix:///run/containerd/s/6d984e0bb4a5b9c3c2b137d569a26b801ae5c21a6f6c71b42d692a0d92a22417" protocol=ttrpc version=3 Sep 9 04:51:37.826527 kubelet[2675]: E0909 04:51:37.826476 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pz4n4" podUID="093f1e96-6646-4411-bb5b-eecbd26e4d17" Sep 9 04:51:37.857344 systemd[1]: Started cri-containerd-475613fe78bb37c438e1f6ce6e761d26fe22c690314aea903914600df3e84890.scope - libcontainer container 475613fe78bb37c438e1f6ce6e761d26fe22c690314aea903914600df3e84890. 
Sep 9 04:51:37.898273 containerd[1537]: time="2025-09-09T04:51:37.898225056Z" level=info msg="StartContainer for \"475613fe78bb37c438e1f6ce6e761d26fe22c690314aea903914600df3e84890\" returns successfully" Sep 9 04:51:37.937014 kubelet[2675]: E0909 04:51:37.936961 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.937014 kubelet[2675]: W0909 04:51:37.937002 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.937246 kubelet[2675]: E0909 04:51:37.937037 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:37.937246 kubelet[2675]: E0909 04:51:37.937207 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.937246 kubelet[2675]: W0909 04:51:37.937215 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.937246 kubelet[2675]: E0909 04:51:37.937243 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:37.956219 kubelet[2675]: E0909 04:51:37.956193 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.956433 kubelet[2675]: W0909 04:51:37.956290 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.956433 kubelet[2675]: E0909 04:51:37.956338 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:37.956605 kubelet[2675]: E0909 04:51:37.956591 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.956660 kubelet[2675]: W0909 04:51:37.956649 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.956779 kubelet[2675]: E0909 04:51:37.956755 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:37.956997 kubelet[2675]: E0909 04:51:37.956983 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.957163 kubelet[2675]: W0909 04:51:37.957047 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.957163 kubelet[2675]: E0909 04:51:37.957079 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:37.957305 kubelet[2675]: E0909 04:51:37.957291 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.957355 kubelet[2675]: W0909 04:51:37.957345 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.957506 kubelet[2675]: E0909 04:51:37.957416 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:37.957643 kubelet[2675]: E0909 04:51:37.957609 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.957697 kubelet[2675]: W0909 04:51:37.957686 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.957761 kubelet[2675]: E0909 04:51:37.957745 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:37.958042 kubelet[2675]: E0909 04:51:37.958003 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.958042 kubelet[2675]: W0909 04:51:37.958021 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.958042 kubelet[2675]: E0909 04:51:37.958038 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:37.958345 kubelet[2675]: E0909 04:51:37.958302 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.958345 kubelet[2675]: W0909 04:51:37.958316 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.958345 kubelet[2675]: E0909 04:51:37.958329 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:37.958660 kubelet[2675]: E0909 04:51:37.958638 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.958700 kubelet[2675]: W0909 04:51:37.958658 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.958700 kubelet[2675]: E0909 04:51:37.958684 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:37.959038 kubelet[2675]: E0909 04:51:37.959021 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.959107 kubelet[2675]: W0909 04:51:37.959095 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.959184 kubelet[2675]: E0909 04:51:37.959173 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:37.959400 kubelet[2675]: E0909 04:51:37.959369 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.959400 kubelet[2675]: W0909 04:51:37.959387 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.959473 kubelet[2675]: E0909 04:51:37.959414 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:51:37.959845 kubelet[2675]: E0909 04:51:37.959827 2675 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:51:37.959917 kubelet[2675]: W0909 04:51:37.959904 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:37.959970 kubelet[2675]: E0909 04:51:37.959958 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:38.907156 kubelet[2675]: I0909 04:51:38.907114 2675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:51:38.907891 containerd[1537]: time="2025-09-09T04:51:38.907851759Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:38.908500 containerd[1537]: time="2025-09-09T04:51:38.908444401Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 9 04:51:38.909316 containerd[1537]: time="2025-09-09T04:51:38.909289714Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:38.911199 containerd[1537]: time="2025-09-09T04:51:38.911169990Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:38.911947 containerd[1537]: time="2025-09-09T04:51:38.911908552Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with 
image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.142549169s" Sep 9 04:51:38.911981 containerd[1537]: time="2025-09-09T04:51:38.911945603Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 9 04:51:38.914189 containerd[1537]: time="2025-09-09T04:51:38.914132443Z" level=info msg="CreateContainer within sandbox \"65f2e2b0ae6c4048d60b34fb7a8515627f38611a0a64540afc3ec13aea74dce6\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 04:51:38.923387 containerd[1537]: time="2025-09-09T04:51:38.922274638Z" level=info msg="Container e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:38.930729 containerd[1537]: time="2025-09-09T04:51:38.930669863Z" level=info msg="CreateContainer within sandbox \"65f2e2b0ae6c4048d60b34fb7a8515627f38611a0a64540afc3ec13aea74dce6\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970\"" Sep 9 04:51:38.931332 containerd[1537]: time="2025-09-09T04:51:38.931273989Z" level=info msg="StartContainer for \"e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970\"" Sep 9 04:51:38.933191 containerd[1537]: time="2025-09-09T04:51:38.933150024Z" level=info msg="connecting to shim e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970" address="unix:///run/containerd/s/c8dd0f4cd34e4ed93921efff1f8f0b30717fa69248bc7b466bf799b5dad39901" protocol=ttrpc version=3 Sep 9 04:51:38.949056 kubelet[2675]: E0909 04:51:38.949027 2675 driver-call.go:262] Failed to unmarshal output for command: 
init, output: "", error: unexpected end of JSON input Sep 9 04:51:38.949056 kubelet[2675]: W0909 04:51:38.949049 2675 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:51:38.949268 kubelet[2675]: E0909 04:51:38.949079 2675 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:51:38.961312 systemd[1]: Started cri-containerd-e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970.scope - libcontainer container e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970. Sep 9 04:51:39.009098 containerd[1537]: time="2025-09-09T04:51:39.009020559Z" level=info msg="StartContainer for \"e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970\" returns successfully" Sep 9 04:51:39.021611 systemd[1]: cri-containerd-e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970.scope: Deactivated successfully. 
Sep 9 04:51:39.039993 containerd[1537]: time="2025-09-09T04:51:39.039872218Z" level=info msg="received exit event container_id:\"e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970\" id:\"e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970\" pid:3404 exited_at:{seconds:1757393499 nanos:32233092}" Sep 9 04:51:39.040131 containerd[1537]: time="2025-09-09T04:51:39.039952399Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970\" id:\"e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970\" pid:3404 exited_at:{seconds:1757393499 nanos:32233092}" Sep 9 04:51:39.076683 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e796c038120bf03e01292afabfd6255a25e8fd2e08aa00767a97179931fca970-rootfs.mount: Deactivated successfully. Sep 9 04:51:39.826534 kubelet[2675]: E0909 04:51:39.826476 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pz4n4" podUID="093f1e96-6646-4411-bb5b-eecbd26e4d17" Sep 9 04:51:39.912578 containerd[1537]: time="2025-09-09T04:51:39.912329527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 04:51:39.931500 kubelet[2675]: I0909 04:51:39.931436 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5bdbc678d5-89dt4" podStartSLOduration=3.63947433 podStartE2EDuration="4.931420178s" podCreationTimestamp="2025-09-09 04:51:35 +0000 UTC" firstStartedPulling="2025-09-09 04:51:36.476384639 +0000 UTC m=+18.742546215" lastFinishedPulling="2025-09-09 04:51:37.768330528 +0000 UTC m=+20.034492063" observedRunningTime="2025-09-09 04:51:37.920874045 +0000 UTC m=+20.187035620" watchObservedRunningTime="2025-09-09 04:51:39.931420178 +0000 UTC m=+22.197581713" 
Sep 9 04:51:41.827068 kubelet[2675]: E0909 04:51:41.826710 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pz4n4" podUID="093f1e96-6646-4411-bb5b-eecbd26e4d17" Sep 9 04:51:43.425987 containerd[1537]: time="2025-09-09T04:51:43.425404875Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:43.427050 containerd[1537]: time="2025-09-09T04:51:43.426982624Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 9 04:51:43.427642 containerd[1537]: time="2025-09-09T04:51:43.427606522Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:43.430116 containerd[1537]: time="2025-09-09T04:51:43.430080110Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:43.432117 containerd[1537]: time="2025-09-09T04:51:43.432076072Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.519702775s" Sep 9 04:51:43.432117 containerd[1537]: time="2025-09-09T04:51:43.432115441Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" 
Sep 9 04:51:43.438973 containerd[1537]: time="2025-09-09T04:51:43.438931830Z" level=info msg="CreateContainer within sandbox \"65f2e2b0ae6c4048d60b34fb7a8515627f38611a0a64540afc3ec13aea74dce6\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 04:51:43.446891 containerd[1537]: time="2025-09-09T04:51:43.445811834Z" level=info msg="Container fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:43.454232 containerd[1537]: time="2025-09-09T04:51:43.454190530Z" level=info msg="CreateContainer within sandbox \"65f2e2b0ae6c4048d60b34fb7a8515627f38611a0a64540afc3ec13aea74dce6\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c\"" Sep 9 04:51:43.454652 containerd[1537]: time="2025-09-09T04:51:43.454633388Z" level=info msg="StartContainer for \"fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c\"" Sep 9 04:51:43.456054 containerd[1537]: time="2025-09-09T04:51:43.456016534Z" level=info msg="connecting to shim fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c" address="unix:///run/containerd/s/c8dd0f4cd34e4ed93921efff1f8f0b30717fa69248bc7b466bf799b5dad39901" protocol=ttrpc version=3 Sep 9 04:51:43.479524 systemd[1]: Started cri-containerd-fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c.scope - libcontainer container fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c. 
Sep 9 04:51:43.514089 containerd[1537]: time="2025-09-09T04:51:43.513956845Z" level=info msg="StartContainer for \"fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c\" returns successfully" Sep 9 04:51:43.826763 kubelet[2675]: E0909 04:51:43.826640 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pz4n4" podUID="093f1e96-6646-4411-bb5b-eecbd26e4d17" Sep 9 04:51:44.166787 systemd[1]: cri-containerd-fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c.scope: Deactivated successfully. Sep 9 04:51:44.167538 systemd[1]: cri-containerd-fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c.scope: Consumed 457ms CPU time, 172.5M memory peak, 1.6M read from disk, 165.8M written to disk. Sep 9 04:51:44.182233 containerd[1537]: time="2025-09-09T04:51:44.182153434Z" level=info msg="received exit event container_id:\"fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c\" id:\"fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c\" pid:3464 exited_at:{seconds:1757393504 nanos:181792997}" Sep 9 04:51:44.184518 containerd[1537]: time="2025-09-09T04:51:44.182247494Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c\" id:\"fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c\" pid:3464 exited_at:{seconds:1757393504 nanos:181792997}" Sep 9 04:51:44.204475 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fe2bd6a287c8590ec2fd2971c4685da8f783a467939711c556c0c21b39842a2c-rootfs.mount: Deactivated successfully. 
Sep 9 04:51:44.222183 kubelet[2675]: I0909 04:51:44.221764 2675 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 9 04:51:44.298407 systemd[1]: Created slice kubepods-burstable-podc6c8756f_3aea_476b_8d23_25f7676c20c7.slice - libcontainer container kubepods-burstable-podc6c8756f_3aea_476b_8d23_25f7676c20c7.slice. Sep 9 04:51:44.308558 systemd[1]: Created slice kubepods-besteffort-pod46872eaa_5736_4fd1_abbb_412aa5718fa0.slice - libcontainer container kubepods-besteffort-pod46872eaa_5736_4fd1_abbb_412aa5718fa0.slice. Sep 9 04:51:44.314443 systemd[1]: Created slice kubepods-besteffort-podc8a892bb_e9a9_4c59_b890_d0eb092f4236.slice - libcontainer container kubepods-besteffort-podc8a892bb_e9a9_4c59_b890_d0eb092f4236.slice. Sep 9 04:51:44.320366 systemd[1]: Created slice kubepods-besteffort-pod21127a09_5bac_4c83_afd6_43950e57c418.slice - libcontainer container kubepods-besteffort-pod21127a09_5bac_4c83_afd6_43950e57c418.slice. Sep 9 04:51:44.323906 systemd[1]: Created slice kubepods-besteffort-pod8ba30964_2505_4075_b0a9_71a192936193.slice - libcontainer container kubepods-besteffort-pod8ba30964_2505_4075_b0a9_71a192936193.slice. Sep 9 04:51:44.332558 systemd[1]: Created slice kubepods-burstable-podf1c0f5dc_c087_4532_a9e5_5f9167f6f542.slice - libcontainer container kubepods-burstable-podf1c0f5dc_c087_4532_a9e5_5f9167f6f542.slice. Sep 9 04:51:44.336164 systemd[1]: Created slice kubepods-besteffort-podb37d8bb0_5597_4873_af34_4b17c8a8ba2d.slice - libcontainer container kubepods-besteffort-podb37d8bb0_5597_4873_af34_4b17c8a8ba2d.slice. 
Sep 9 04:51:44.400936 kubelet[2675]: I0909 04:51:44.400887 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9fw5\" (UniqueName: \"kubernetes.io/projected/b37d8bb0-5597-4873-af34-4b17c8a8ba2d-kube-api-access-w9fw5\") pod \"calico-apiserver-c8977fc6c-t4qjf\" (UID: \"b37d8bb0-5597-4873-af34-4b17c8a8ba2d\") " pod="calico-apiserver/calico-apiserver-c8977fc6c-t4qjf" Sep 9 04:51:44.401224 kubelet[2675]: I0909 04:51:44.401202 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21127a09-5bac-4c83-afd6-43950e57c418-tigera-ca-bundle\") pod \"calico-kube-controllers-75b67558cc-6p5fq\" (UID: \"21127a09-5bac-4c83-afd6-43950e57c418\") " pod="calico-system/calico-kube-controllers-75b67558cc-6p5fq" Sep 9 04:51:44.401326 kubelet[2675]: I0909 04:51:44.401315 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46872eaa-5736-4fd1-abbb-412aa5718fa0-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-xmwf5\" (UID: \"46872eaa-5736-4fd1-abbb-412aa5718fa0\") " pod="calico-system/goldmane-54d579b49d-xmwf5" Sep 9 04:51:44.401472 kubelet[2675]: I0909 04:51:44.401399 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8ba30964-2505-4075-b0a9-71a192936193-calico-apiserver-certs\") pod \"calico-apiserver-c8977fc6c-ltdk2\" (UID: \"8ba30964-2505-4075-b0a9-71a192936193\") " pod="calico-apiserver/calico-apiserver-c8977fc6c-ltdk2" Sep 9 04:51:44.401530 kubelet[2675]: I0909 04:51:44.401518 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p9zm\" (UniqueName: \"kubernetes.io/projected/c6c8756f-3aea-476b-8d23-25f7676c20c7-kube-api-access-5p9zm\") 
pod \"coredns-668d6bf9bc-j8z42\" (UID: \"c6c8756f-3aea-476b-8d23-25f7676c20c7\") " pod="kube-system/coredns-668d6bf9bc-j8z42" Sep 9 04:51:44.401638 kubelet[2675]: I0909 04:51:44.401625 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b37d8bb0-5597-4873-af34-4b17c8a8ba2d-calico-apiserver-certs\") pod \"calico-apiserver-c8977fc6c-t4qjf\" (UID: \"b37d8bb0-5597-4873-af34-4b17c8a8ba2d\") " pod="calico-apiserver/calico-apiserver-c8977fc6c-t4qjf" Sep 9 04:51:44.401709 kubelet[2675]: I0909 04:51:44.401698 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46872eaa-5736-4fd1-abbb-412aa5718fa0-config\") pod \"goldmane-54d579b49d-xmwf5\" (UID: \"46872eaa-5736-4fd1-abbb-412aa5718fa0\") " pod="calico-system/goldmane-54d579b49d-xmwf5" Sep 9 04:51:44.401986 kubelet[2675]: I0909 04:51:44.401872 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1c0f5dc-c087-4532-a9e5-5f9167f6f542-config-volume\") pod \"coredns-668d6bf9bc-nsjpv\" (UID: \"f1c0f5dc-c087-4532-a9e5-5f9167f6f542\") " pod="kube-system/coredns-668d6bf9bc-nsjpv" Sep 9 04:51:44.401986 kubelet[2675]: I0909 04:51:44.401903 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp8cw\" (UniqueName: \"kubernetes.io/projected/8ba30964-2505-4075-b0a9-71a192936193-kube-api-access-rp8cw\") pod \"calico-apiserver-c8977fc6c-ltdk2\" (UID: \"8ba30964-2505-4075-b0a9-71a192936193\") " pod="calico-apiserver/calico-apiserver-c8977fc6c-ltdk2" Sep 9 04:51:44.402152 kubelet[2675]: I0909 04:51:44.402081 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: 
\"kubernetes.io/secret/46872eaa-5736-4fd1-abbb-412aa5718fa0-goldmane-key-pair\") pod \"goldmane-54d579b49d-xmwf5\" (UID: \"46872eaa-5736-4fd1-abbb-412aa5718fa0\") " pod="calico-system/goldmane-54d579b49d-xmwf5" Sep 9 04:51:44.402152 kubelet[2675]: I0909 04:51:44.402116 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4shk9\" (UniqueName: \"kubernetes.io/projected/f1c0f5dc-c087-4532-a9e5-5f9167f6f542-kube-api-access-4shk9\") pod \"coredns-668d6bf9bc-nsjpv\" (UID: \"f1c0f5dc-c087-4532-a9e5-5f9167f6f542\") " pod="kube-system/coredns-668d6bf9bc-nsjpv" Sep 9 04:51:44.402384 kubelet[2675]: I0909 04:51:44.402365 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6c8756f-3aea-476b-8d23-25f7676c20c7-config-volume\") pod \"coredns-668d6bf9bc-j8z42\" (UID: \"c6c8756f-3aea-476b-8d23-25f7676c20c7\") " pod="kube-system/coredns-668d6bf9bc-j8z42" Sep 9 04:51:44.402508 kubelet[2675]: I0909 04:51:44.402475 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c8a892bb-e9a9-4c59-b890-d0eb092f4236-whisker-backend-key-pair\") pod \"whisker-5f66d6c59f-64tqj\" (UID: \"c8a892bb-e9a9-4c59-b890-d0eb092f4236\") " pod="calico-system/whisker-5f66d6c59f-64tqj" Sep 9 04:51:44.402619 kubelet[2675]: I0909 04:51:44.402595 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc2rx\" (UniqueName: \"kubernetes.io/projected/c8a892bb-e9a9-4c59-b890-d0eb092f4236-kube-api-access-kc2rx\") pod \"whisker-5f66d6c59f-64tqj\" (UID: \"c8a892bb-e9a9-4c59-b890-d0eb092f4236\") " pod="calico-system/whisker-5f66d6c59f-64tqj" Sep 9 04:51:44.402707 kubelet[2675]: I0909 04:51:44.402694 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-6xxcw\" (UniqueName: \"kubernetes.io/projected/46872eaa-5736-4fd1-abbb-412aa5718fa0-kube-api-access-6xxcw\") pod \"goldmane-54d579b49d-xmwf5\" (UID: \"46872eaa-5736-4fd1-abbb-412aa5718fa0\") " pod="calico-system/goldmane-54d579b49d-xmwf5" Sep 9 04:51:44.402783 kubelet[2675]: I0909 04:51:44.402770 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr27s\" (UniqueName: \"kubernetes.io/projected/21127a09-5bac-4c83-afd6-43950e57c418-kube-api-access-mr27s\") pod \"calico-kube-controllers-75b67558cc-6p5fq\" (UID: \"21127a09-5bac-4c83-afd6-43950e57c418\") " pod="calico-system/calico-kube-controllers-75b67558cc-6p5fq" Sep 9 04:51:44.402919 kubelet[2675]: I0909 04:51:44.402851 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8a892bb-e9a9-4c59-b890-d0eb092f4236-whisker-ca-bundle\") pod \"whisker-5f66d6c59f-64tqj\" (UID: \"c8a892bb-e9a9-4c59-b890-d0eb092f4236\") " pod="calico-system/whisker-5f66d6c59f-64tqj" Sep 9 04:51:44.606890 containerd[1537]: time="2025-09-09T04:51:44.605957675Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j8z42,Uid:c6c8756f-3aea-476b-8d23-25f7676c20c7,Namespace:kube-system,Attempt:0,}" Sep 9 04:51:44.612852 containerd[1537]: time="2025-09-09T04:51:44.612786328Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xmwf5,Uid:46872eaa-5736-4fd1-abbb-412aa5718fa0,Namespace:calico-system,Attempt:0,}" Sep 9 04:51:44.618677 containerd[1537]: time="2025-09-09T04:51:44.618573999Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5f66d6c59f-64tqj,Uid:c8a892bb-e9a9-4c59-b890-d0eb092f4236,Namespace:calico-system,Attempt:0,}" Sep 9 04:51:44.623602 containerd[1537]: time="2025-09-09T04:51:44.623484684Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-75b67558cc-6p5fq,Uid:21127a09-5bac-4c83-afd6-43950e57c418,Namespace:calico-system,Attempt:0,}" Sep 9 04:51:44.629108 containerd[1537]: time="2025-09-09T04:51:44.629045547Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c8977fc6c-ltdk2,Uid:8ba30964-2505-4075-b0a9-71a192936193,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:51:44.636523 containerd[1537]: time="2025-09-09T04:51:44.636460324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nsjpv,Uid:f1c0f5dc-c087-4532-a9e5-5f9167f6f542,Namespace:kube-system,Attempt:0,}" Sep 9 04:51:44.639869 containerd[1537]: time="2025-09-09T04:51:44.639830041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c8977fc6c-t4qjf,Uid:b37d8bb0-5597-4873-af34-4b17c8a8ba2d,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:51:44.769833 containerd[1537]: time="2025-09-09T04:51:44.769763883Z" level=error msg="Failed to destroy network for sandbox \"edc6d3ddfe6649697bc2e3ec8c96f1cac3f66d7485b4ad77c623e713c4a43697\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.775717 containerd[1537]: time="2025-09-09T04:51:44.775653816Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j8z42,Uid:c6c8756f-3aea-476b-8d23-25f7676c20c7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"edc6d3ddfe6649697bc2e3ec8c96f1cac3f66d7485b4ad77c623e713c4a43697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.776074 kubelet[2675]: E0909 04:51:44.776022 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown 
desc = failed to setup network for sandbox \"edc6d3ddfe6649697bc2e3ec8c96f1cac3f66d7485b4ad77c623e713c4a43697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.776240 kubelet[2675]: E0909 04:51:44.776213 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"edc6d3ddfe6649697bc2e3ec8c96f1cac3f66d7485b4ad77c623e713c4a43697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-j8z42" Sep 9 04:51:44.776287 kubelet[2675]: E0909 04:51:44.776242 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"edc6d3ddfe6649697bc2e3ec8c96f1cac3f66d7485b4ad77c623e713c4a43697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-j8z42" Sep 9 04:51:44.776335 kubelet[2675]: E0909 04:51:44.776302 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-j8z42_kube-system(c6c8756f-3aea-476b-8d23-25f7676c20c7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-j8z42_kube-system(c6c8756f-3aea-476b-8d23-25f7676c20c7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"edc6d3ddfe6649697bc2e3ec8c96f1cac3f66d7485b4ad77c623e713c4a43697\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-668d6bf9bc-j8z42" podUID="c6c8756f-3aea-476b-8d23-25f7676c20c7" Sep 9 04:51:44.803900 containerd[1537]: time="2025-09-09T04:51:44.803842493Z" level=error msg="Failed to destroy network for sandbox \"e370a8de38035b44188852fd53be8559c1c4214d72c05df62c3ceb1242f5d1a0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.806877 containerd[1537]: time="2025-09-09T04:51:44.806811205Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c8977fc6c-t4qjf,Uid:b37d8bb0-5597-4873-af34-4b17c8a8ba2d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e370a8de38035b44188852fd53be8559c1c4214d72c05df62c3ceb1242f5d1a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.807164 kubelet[2675]: E0909 04:51:44.807113 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e370a8de38035b44188852fd53be8559c1c4214d72c05df62c3ceb1242f5d1a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.807256 kubelet[2675]: E0909 04:51:44.807232 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e370a8de38035b44188852fd53be8559c1c4214d72c05df62c3ceb1242f5d1a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-c8977fc6c-t4qjf" Sep 9 04:51:44.807301 kubelet[2675]: E0909 04:51:44.807260 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e370a8de38035b44188852fd53be8559c1c4214d72c05df62c3ceb1242f5d1a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c8977fc6c-t4qjf" Sep 9 04:51:44.809024 kubelet[2675]: E0909 04:51:44.807336 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-c8977fc6c-t4qjf_calico-apiserver(b37d8bb0-5597-4873-af34-4b17c8a8ba2d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-c8977fc6c-t4qjf_calico-apiserver(b37d8bb0-5597-4873-af34-4b17c8a8ba2d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e370a8de38035b44188852fd53be8559c1c4214d72c05df62c3ceb1242f5d1a0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-c8977fc6c-t4qjf" podUID="b37d8bb0-5597-4873-af34-4b17c8a8ba2d" Sep 9 04:51:44.811425 containerd[1537]: time="2025-09-09T04:51:44.811383337Z" level=error msg="Failed to destroy network for sandbox \"c8b5f8adcf18a99abf9f63c08bdf579faad9344dbd477ff16c27df9f2b6732b8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.815165 containerd[1537]: time="2025-09-09T04:51:44.814834032Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-c8977fc6c-ltdk2,Uid:8ba30964-2505-4075-b0a9-71a192936193,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c8b5f8adcf18a99abf9f63c08bdf579faad9344dbd477ff16c27df9f2b6732b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.815331 kubelet[2675]: E0909 04:51:44.815069 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c8b5f8adcf18a99abf9f63c08bdf579faad9344dbd477ff16c27df9f2b6732b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.815331 kubelet[2675]: E0909 04:51:44.815120 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c8b5f8adcf18a99abf9f63c08bdf579faad9344dbd477ff16c27df9f2b6732b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c8977fc6c-ltdk2" Sep 9 04:51:44.815331 kubelet[2675]: E0909 04:51:44.815151 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c8b5f8adcf18a99abf9f63c08bdf579faad9344dbd477ff16c27df9f2b6732b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c8977fc6c-ltdk2" Sep 9 04:51:44.816235 kubelet[2675]: E0909 04:51:44.815189 2675 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-c8977fc6c-ltdk2_calico-apiserver(8ba30964-2505-4075-b0a9-71a192936193)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-c8977fc6c-ltdk2_calico-apiserver(8ba30964-2505-4075-b0a9-71a192936193)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c8b5f8adcf18a99abf9f63c08bdf579faad9344dbd477ff16c27df9f2b6732b8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-c8977fc6c-ltdk2" podUID="8ba30964-2505-4075-b0a9-71a192936193" Sep 9 04:51:44.817309 containerd[1537]: time="2025-09-09T04:51:44.817270550Z" level=error msg="Failed to destroy network for sandbox \"48494a5e0fb6f36bb4a9981b83b75bae6ee34de8741e2347a0fafc881955bd5a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.818737 containerd[1537]: time="2025-09-09T04:51:44.818697694Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nsjpv,Uid:f1c0f5dc-c087-4532-a9e5-5f9167f6f542,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"48494a5e0fb6f36bb4a9981b83b75bae6ee34de8741e2347a0fafc881955bd5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.819206 kubelet[2675]: E0909 04:51:44.819167 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48494a5e0fb6f36bb4a9981b83b75bae6ee34de8741e2347a0fafc881955bd5a\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.819265 kubelet[2675]: E0909 04:51:44.819223 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48494a5e0fb6f36bb4a9981b83b75bae6ee34de8741e2347a0fafc881955bd5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-nsjpv" Sep 9 04:51:44.819265 kubelet[2675]: E0909 04:51:44.819242 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48494a5e0fb6f36bb4a9981b83b75bae6ee34de8741e2347a0fafc881955bd5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-nsjpv" Sep 9 04:51:44.819400 kubelet[2675]: E0909 04:51:44.819283 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-nsjpv_kube-system(f1c0f5dc-c087-4532-a9e5-5f9167f6f542)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-nsjpv_kube-system(f1c0f5dc-c087-4532-a9e5-5f9167f6f542)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"48494a5e0fb6f36bb4a9981b83b75bae6ee34de8741e2347a0fafc881955bd5a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-nsjpv" podUID="f1c0f5dc-c087-4532-a9e5-5f9167f6f542" Sep 9 04:51:44.825331 containerd[1537]: 
time="2025-09-09T04:51:44.825284215Z" level=error msg="Failed to destroy network for sandbox \"a52c56b4a989541a47ccc72613539a2adeb27e2292a60d4020e9aeffbb66769c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.826855 containerd[1537]: time="2025-09-09T04:51:44.826813380Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xmwf5,Uid:46872eaa-5736-4fd1-abbb-412aa5718fa0,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a52c56b4a989541a47ccc72613539a2adeb27e2292a60d4020e9aeffbb66769c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.827094 kubelet[2675]: E0909 04:51:44.827036 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a52c56b4a989541a47ccc72613539a2adeb27e2292a60d4020e9aeffbb66769c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.827965 kubelet[2675]: E0909 04:51:44.827113 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a52c56b4a989541a47ccc72613539a2adeb27e2292a60d4020e9aeffbb66769c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-xmwf5" Sep 9 04:51:44.827965 kubelet[2675]: E0909 04:51:44.827132 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"a52c56b4a989541a47ccc72613539a2adeb27e2292a60d4020e9aeffbb66769c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-xmwf5" Sep 9 04:51:44.827965 kubelet[2675]: E0909 04:51:44.827185 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-xmwf5_calico-system(46872eaa-5736-4fd1-abbb-412aa5718fa0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-xmwf5_calico-system(46872eaa-5736-4fd1-abbb-412aa5718fa0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a52c56b4a989541a47ccc72613539a2adeb27e2292a60d4020e9aeffbb66769c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-xmwf5" podUID="46872eaa-5736-4fd1-abbb-412aa5718fa0" Sep 9 04:51:44.838530 containerd[1537]: time="2025-09-09T04:51:44.838487384Z" level=error msg="Failed to destroy network for sandbox \"de353d6580695ac83ce3a9549df2d908b69ffc9cf3eca9dd8750b499d79de5bb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.838771 containerd[1537]: time="2025-09-09T04:51:44.838495785Z" level=error msg="Failed to destroy network for sandbox \"b0ae847fec4ec1fcaaf2657e5af401c62e5bec2fa0651932bc9ad309762efeeb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.839527 containerd[1537]: 
time="2025-09-09T04:51:44.839492157Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75b67558cc-6p5fq,Uid:21127a09-5bac-4c83-afd6-43950e57c418,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"de353d6580695ac83ce3a9549df2d908b69ffc9cf3eca9dd8750b499d79de5bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.839784 kubelet[2675]: E0909 04:51:44.839712 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de353d6580695ac83ce3a9549df2d908b69ffc9cf3eca9dd8750b499d79de5bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.839784 kubelet[2675]: E0909 04:51:44.839779 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de353d6580695ac83ce3a9549df2d908b69ffc9cf3eca9dd8750b499d79de5bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-75b67558cc-6p5fq" Sep 9 04:51:44.839870 kubelet[2675]: E0909 04:51:44.839799 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de353d6580695ac83ce3a9549df2d908b69ffc9cf3eca9dd8750b499d79de5bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-75b67558cc-6p5fq" Sep 9 04:51:44.839870 kubelet[2675]: E0909 04:51:44.839836 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-75b67558cc-6p5fq_calico-system(21127a09-5bac-4c83-afd6-43950e57c418)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-75b67558cc-6p5fq_calico-system(21127a09-5bac-4c83-afd6-43950e57c418)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"de353d6580695ac83ce3a9549df2d908b69ffc9cf3eca9dd8750b499d79de5bb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-75b67558cc-6p5fq" podUID="21127a09-5bac-4c83-afd6-43950e57c418" Sep 9 04:51:44.840459 containerd[1537]: time="2025-09-09T04:51:44.840418554Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5f66d6c59f-64tqj,Uid:c8a892bb-e9a9-4c59-b890-d0eb092f4236,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b0ae847fec4ec1fcaaf2657e5af401c62e5bec2fa0651932bc9ad309762efeeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.840772 kubelet[2675]: E0909 04:51:44.840731 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b0ae847fec4ec1fcaaf2657e5af401c62e5bec2fa0651932bc9ad309762efeeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:44.840817 kubelet[2675]: E0909 04:51:44.840781 2675 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b0ae847fec4ec1fcaaf2657e5af401c62e5bec2fa0651932bc9ad309762efeeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5f66d6c59f-64tqj" Sep 9 04:51:44.841263 kubelet[2675]: E0909 04:51:44.841219 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b0ae847fec4ec1fcaaf2657e5af401c62e5bec2fa0651932bc9ad309762efeeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5f66d6c59f-64tqj" Sep 9 04:51:44.841312 kubelet[2675]: E0909 04:51:44.841284 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5f66d6c59f-64tqj_calico-system(c8a892bb-e9a9-4c59-b890-d0eb092f4236)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5f66d6c59f-64tqj_calico-system(c8a892bb-e9a9-4c59-b890-d0eb092f4236)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b0ae847fec4ec1fcaaf2657e5af401c62e5bec2fa0651932bc9ad309762efeeb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5f66d6c59f-64tqj" podUID="c8a892bb-e9a9-4c59-b890-d0eb092f4236" Sep 9 04:51:44.930205 containerd[1537]: time="2025-09-09T04:51:44.929688866Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 04:51:45.850495 systemd[1]: Created slice kubepods-besteffort-pod093f1e96_6646_4411_bb5b_eecbd26e4d17.slice - libcontainer 
container kubepods-besteffort-pod093f1e96_6646_4411_bb5b_eecbd26e4d17.slice. Sep 9 04:51:45.853581 containerd[1537]: time="2025-09-09T04:51:45.853541814Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pz4n4,Uid:093f1e96-6646-4411-bb5b-eecbd26e4d17,Namespace:calico-system,Attempt:0,}" Sep 9 04:51:45.925589 containerd[1537]: time="2025-09-09T04:51:45.925537102Z" level=error msg="Failed to destroy network for sandbox \"8e858d9ebb22c2ea3f0c21d8cf99c76feccad205c139d944580f4d074cd02b70\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:45.927412 systemd[1]: run-netns-cni\x2d093bc8bd\x2d76df\x2d24a9\x2d67df\x2d96fadbd67bfd.mount: Deactivated successfully. Sep 9 04:51:45.933059 containerd[1537]: time="2025-09-09T04:51:45.933002949Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pz4n4,Uid:093f1e96-6646-4411-bb5b-eecbd26e4d17,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e858d9ebb22c2ea3f0c21d8cf99c76feccad205c139d944580f4d074cd02b70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:45.933276 kubelet[2675]: E0909 04:51:45.933235 2675 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e858d9ebb22c2ea3f0c21d8cf99c76feccad205c139d944580f4d074cd02b70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:51:45.933547 kubelet[2675]: E0909 04:51:45.933294 2675 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code 
= Unknown desc = failed to setup network for sandbox \"8e858d9ebb22c2ea3f0c21d8cf99c76feccad205c139d944580f4d074cd02b70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-pz4n4" Sep 9 04:51:45.933547 kubelet[2675]: E0909 04:51:45.933323 2675 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e858d9ebb22c2ea3f0c21d8cf99c76feccad205c139d944580f4d074cd02b70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-pz4n4" Sep 9 04:51:45.933547 kubelet[2675]: E0909 04:51:45.933369 2675 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-pz4n4_calico-system(093f1e96-6646-4411-bb5b-eecbd26e4d17)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-pz4n4_calico-system(093f1e96-6646-4411-bb5b-eecbd26e4d17)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8e858d9ebb22c2ea3f0c21d8cf99c76feccad205c139d944580f4d074cd02b70\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-pz4n4" podUID="093f1e96-6646-4411-bb5b-eecbd26e4d17" Sep 9 04:51:47.898589 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3648338360.mount: Deactivated successfully. 
Sep 9 04:51:48.213438 containerd[1537]: time="2025-09-09T04:51:48.213304413Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:48.214284 containerd[1537]: time="2025-09-09T04:51:48.214202337Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 9 04:51:48.214730 containerd[1537]: time="2025-09-09T04:51:48.214705869Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:48.216827 containerd[1537]: time="2025-09-09T04:51:48.216785330Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:48.217499 containerd[1537]: time="2025-09-09T04:51:48.217449612Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 3.287136453s" Sep 9 04:51:48.217737 containerd[1537]: time="2025-09-09T04:51:48.217501981Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 9 04:51:48.233170 containerd[1537]: time="2025-09-09T04:51:48.231479739Z" level=info msg="CreateContainer within sandbox \"65f2e2b0ae6c4048d60b34fb7a8515627f38611a0a64540afc3ec13aea74dce6\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 04:51:48.251706 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2267628095.mount: Deactivated 
successfully. Sep 9 04:51:48.252037 containerd[1537]: time="2025-09-09T04:51:48.251990493Z" level=info msg="Container da39557d4f0fdb8c571d6ae045641f3da76b4469821c49c874539c17932f697e: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:48.269148 containerd[1537]: time="2025-09-09T04:51:48.269069738Z" level=info msg="CreateContainer within sandbox \"65f2e2b0ae6c4048d60b34fb7a8515627f38611a0a64540afc3ec13aea74dce6\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"da39557d4f0fdb8c571d6ae045641f3da76b4469821c49c874539c17932f697e\"" Sep 9 04:51:48.272164 containerd[1537]: time="2025-09-09T04:51:48.272002075Z" level=info msg="StartContainer for \"da39557d4f0fdb8c571d6ae045641f3da76b4469821c49c874539c17932f697e\"" Sep 9 04:51:48.273712 containerd[1537]: time="2025-09-09T04:51:48.273680382Z" level=info msg="connecting to shim da39557d4f0fdb8c571d6ae045641f3da76b4469821c49c874539c17932f697e" address="unix:///run/containerd/s/c8dd0f4cd34e4ed93921efff1f8f0b30717fa69248bc7b466bf799b5dad39901" protocol=ttrpc version=3 Sep 9 04:51:48.291307 systemd[1]: Started cri-containerd-da39557d4f0fdb8c571d6ae045641f3da76b4469821c49c874539c17932f697e.scope - libcontainer container da39557d4f0fdb8c571d6ae045641f3da76b4469821c49c874539c17932f697e. Sep 9 04:51:48.329507 containerd[1537]: time="2025-09-09T04:51:48.329462151Z" level=info msg="StartContainer for \"da39557d4f0fdb8c571d6ae045641f3da76b4469821c49c874539c17932f697e\" returns successfully" Sep 9 04:51:48.444505 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 04:51:48.444603 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 04:51:48.639300 kubelet[2675]: I0909 04:51:48.639251 2675 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c8a892bb-e9a9-4c59-b890-d0eb092f4236-whisker-backend-key-pair\") pod \"c8a892bb-e9a9-4c59-b890-d0eb092f4236\" (UID: \"c8a892bb-e9a9-4c59-b890-d0eb092f4236\") " Sep 9 04:51:48.639662 kubelet[2675]: I0909 04:51:48.639637 2675 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc2rx\" (UniqueName: \"kubernetes.io/projected/c8a892bb-e9a9-4c59-b890-d0eb092f4236-kube-api-access-kc2rx\") pod \"c8a892bb-e9a9-4c59-b890-d0eb092f4236\" (UID: \"c8a892bb-e9a9-4c59-b890-d0eb092f4236\") " Sep 9 04:51:48.639697 kubelet[2675]: I0909 04:51:48.639674 2675 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8a892bb-e9a9-4c59-b890-d0eb092f4236-whisker-ca-bundle\") pod \"c8a892bb-e9a9-4c59-b890-d0eb092f4236\" (UID: \"c8a892bb-e9a9-4c59-b890-d0eb092f4236\") " Sep 9 04:51:48.642722 kubelet[2675]: I0909 04:51:48.642680 2675 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a892bb-e9a9-4c59-b890-d0eb092f4236-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "c8a892bb-e9a9-4c59-b890-d0eb092f4236" (UID: "c8a892bb-e9a9-4c59-b890-d0eb092f4236"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 04:51:48.643616 kubelet[2675]: I0909 04:51:48.643490 2675 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8a892bb-e9a9-4c59-b890-d0eb092f4236-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "c8a892bb-e9a9-4c59-b890-d0eb092f4236" (UID: "c8a892bb-e9a9-4c59-b890-d0eb092f4236"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 04:51:48.646427 kubelet[2675]: I0909 04:51:48.646391 2675 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a892bb-e9a9-4c59-b890-d0eb092f4236-kube-api-access-kc2rx" (OuterVolumeSpecName: "kube-api-access-kc2rx") pod "c8a892bb-e9a9-4c59-b890-d0eb092f4236" (UID: "c8a892bb-e9a9-4c59-b890-d0eb092f4236"). InnerVolumeSpecName "kube-api-access-kc2rx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 04:51:48.740437 kubelet[2675]: I0909 04:51:48.740388 2675 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c8a892bb-e9a9-4c59-b890-d0eb092f4236-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 9 04:51:48.740437 kubelet[2675]: I0909 04:51:48.740421 2675 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kc2rx\" (UniqueName: \"kubernetes.io/projected/c8a892bb-e9a9-4c59-b890-d0eb092f4236-kube-api-access-kc2rx\") on node \"localhost\" DevicePath \"\"" Sep 9 04:51:48.740437 kubelet[2675]: I0909 04:51:48.740430 2675 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8a892bb-e9a9-4c59-b890-d0eb092f4236-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 9 04:51:48.899360 systemd[1]: var-lib-kubelet-pods-c8a892bb\x2de9a9\x2d4c59\x2db890\x2dd0eb092f4236-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dkc2rx.mount: Deactivated successfully. Sep 9 04:51:48.899454 systemd[1]: var-lib-kubelet-pods-c8a892bb\x2de9a9\x2d4c59\x2db890\x2dd0eb092f4236-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 04:51:48.956909 systemd[1]: Removed slice kubepods-besteffort-podc8a892bb_e9a9_4c59_b890_d0eb092f4236.slice - libcontainer container kubepods-besteffort-podc8a892bb_e9a9_4c59_b890_d0eb092f4236.slice. 
Sep 9 04:51:49.074987 containerd[1537]: time="2025-09-09T04:51:49.074942346Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da39557d4f0fdb8c571d6ae045641f3da76b4469821c49c874539c17932f697e\" id:\"d0692c9752d78f95d6165192738c3537577e76cc21f5cedef0f09b371e4678be\" pid:3847 exit_status:1 exited_at:{seconds:1757393509 nanos:74578042}" Sep 9 04:51:49.147785 kubelet[2675]: I0909 04:51:49.147706 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-jjmt7" podStartSLOduration=1.639711549 podStartE2EDuration="13.14769004s" podCreationTimestamp="2025-09-09 04:51:36 +0000 UTC" firstStartedPulling="2025-09-09 04:51:36.710057148 +0000 UTC m=+18.976218723" lastFinishedPulling="2025-09-09 04:51:48.218035639 +0000 UTC m=+30.484197214" observedRunningTime="2025-09-09 04:51:49.146752915 +0000 UTC m=+31.412914490" watchObservedRunningTime="2025-09-09 04:51:49.14769004 +0000 UTC m=+31.413851615" Sep 9 04:51:49.166554 systemd[1]: Created slice kubepods-besteffort-pod2b83e013_d851_44a0_bec0_401bd52e7319.slice - libcontainer container kubepods-besteffort-pod2b83e013_d851_44a0_bec0_401bd52e7319.slice. 
Sep 9 04:51:49.244011 kubelet[2675]: I0909 04:51:49.243965 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b83e013-d851-44a0-bec0-401bd52e7319-whisker-ca-bundle\") pod \"whisker-5699f8795d-t8zgr\" (UID: \"2b83e013-d851-44a0-bec0-401bd52e7319\") " pod="calico-system/whisker-5699f8795d-t8zgr" Sep 9 04:51:49.244011 kubelet[2675]: I0909 04:51:49.244009 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhxvj\" (UniqueName: \"kubernetes.io/projected/2b83e013-d851-44a0-bec0-401bd52e7319-kube-api-access-hhxvj\") pod \"whisker-5699f8795d-t8zgr\" (UID: \"2b83e013-d851-44a0-bec0-401bd52e7319\") " pod="calico-system/whisker-5699f8795d-t8zgr" Sep 9 04:51:49.244208 kubelet[2675]: I0909 04:51:49.244041 2675 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2b83e013-d851-44a0-bec0-401bd52e7319-whisker-backend-key-pair\") pod \"whisker-5699f8795d-t8zgr\" (UID: \"2b83e013-d851-44a0-bec0-401bd52e7319\") " pod="calico-system/whisker-5699f8795d-t8zgr" Sep 9 04:51:49.480173 containerd[1537]: time="2025-09-09T04:51:49.479971153Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5699f8795d-t8zgr,Uid:2b83e013-d851-44a0-bec0-401bd52e7319,Namespace:calico-system,Attempt:0,}" Sep 9 04:51:49.655829 systemd-networkd[1443]: cali13bb5b132e6: Link UP Sep 9 04:51:49.656339 systemd-networkd[1443]: cali13bb5b132e6: Gained carrier Sep 9 04:51:49.668725 containerd[1537]: 2025-09-09 04:51:49.505 [INFO][3861] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:51:49.668725 containerd[1537]: 2025-09-09 04:51:49.544 [INFO][3861] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--5699f8795d--t8zgr-eth0 
whisker-5699f8795d- calico-system 2b83e013-d851-44a0-bec0-401bd52e7319 853 0 2025-09-09 04:51:48 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5699f8795d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-5699f8795d-t8zgr eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali13bb5b132e6 [] [] }} ContainerID="c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" Namespace="calico-system" Pod="whisker-5699f8795d-t8zgr" WorkloadEndpoint="localhost-k8s-whisker--5699f8795d--t8zgr-" Sep 9 04:51:49.668725 containerd[1537]: 2025-09-09 04:51:49.545 [INFO][3861] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" Namespace="calico-system" Pod="whisker-5699f8795d-t8zgr" WorkloadEndpoint="localhost-k8s-whisker--5699f8795d--t8zgr-eth0" Sep 9 04:51:49.668725 containerd[1537]: 2025-09-09 04:51:49.614 [INFO][3875] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" HandleID="k8s-pod-network.c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" Workload="localhost-k8s-whisker--5699f8795d--t8zgr-eth0" Sep 9 04:51:49.669019 containerd[1537]: 2025-09-09 04:51:49.614 [INFO][3875] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" HandleID="k8s-pod-network.c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" Workload="localhost-k8s-whisker--5699f8795d--t8zgr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400059ba50), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-5699f8795d-t8zgr", "timestamp":"2025-09-09 04:51:49.614711001 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:49.669019 containerd[1537]: 2025-09-09 04:51:49.614 [INFO][3875] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:49.669019 containerd[1537]: 2025-09-09 04:51:49.615 [INFO][3875] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:51:49.669019 containerd[1537]: 2025-09-09 04:51:49.615 [INFO][3875] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:49.669019 containerd[1537]: 2025-09-09 04:51:49.624 [INFO][3875] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" host="localhost" Sep 9 04:51:49.669019 containerd[1537]: 2025-09-09 04:51:49.629 [INFO][3875] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:49.669019 containerd[1537]: 2025-09-09 04:51:49.633 [INFO][3875] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:49.669019 containerd[1537]: 2025-09-09 04:51:49.635 [INFO][3875] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:49.669019 containerd[1537]: 2025-09-09 04:51:49.637 [INFO][3875] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:49.669019 containerd[1537]: 2025-09-09 04:51:49.637 [INFO][3875] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" host="localhost" Sep 9 04:51:49.669260 containerd[1537]: 2025-09-09 04:51:49.638 [INFO][3875] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed Sep 9 04:51:49.669260 containerd[1537]: 
2025-09-09 04:51:49.642 [INFO][3875] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" host="localhost" Sep 9 04:51:49.669260 containerd[1537]: 2025-09-09 04:51:49.646 [INFO][3875] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" host="localhost" Sep 9 04:51:49.669260 containerd[1537]: 2025-09-09 04:51:49.646 [INFO][3875] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" host="localhost" Sep 9 04:51:49.669260 containerd[1537]: 2025-09-09 04:51:49.646 [INFO][3875] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:51:49.669260 containerd[1537]: 2025-09-09 04:51:49.646 [INFO][3875] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" HandleID="k8s-pod-network.c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" Workload="localhost-k8s-whisker--5699f8795d--t8zgr-eth0" Sep 9 04:51:49.669388 containerd[1537]: 2025-09-09 04:51:49.649 [INFO][3861] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" Namespace="calico-system" Pod="whisker-5699f8795d-t8zgr" WorkloadEndpoint="localhost-k8s-whisker--5699f8795d--t8zgr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5699f8795d--t8zgr-eth0", GenerateName:"whisker-5699f8795d-", Namespace:"calico-system", SelfLink:"", UID:"2b83e013-d851-44a0-bec0-401bd52e7319", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.September, 
9, 4, 51, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5699f8795d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-5699f8795d-t8zgr", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali13bb5b132e6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:49.669388 containerd[1537]: 2025-09-09 04:51:49.649 [INFO][3861] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" Namespace="calico-system" Pod="whisker-5699f8795d-t8zgr" WorkloadEndpoint="localhost-k8s-whisker--5699f8795d--t8zgr-eth0" Sep 9 04:51:49.669456 containerd[1537]: 2025-09-09 04:51:49.649 [INFO][3861] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali13bb5b132e6 ContainerID="c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" Namespace="calico-system" Pod="whisker-5699f8795d-t8zgr" WorkloadEndpoint="localhost-k8s-whisker--5699f8795d--t8zgr-eth0" Sep 9 04:51:49.669456 containerd[1537]: 2025-09-09 04:51:49.656 [INFO][3861] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" Namespace="calico-system" Pod="whisker-5699f8795d-t8zgr" 
WorkloadEndpoint="localhost-k8s-whisker--5699f8795d--t8zgr-eth0" Sep 9 04:51:49.669495 containerd[1537]: 2025-09-09 04:51:49.657 [INFO][3861] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" Namespace="calico-system" Pod="whisker-5699f8795d-t8zgr" WorkloadEndpoint="localhost-k8s-whisker--5699f8795d--t8zgr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5699f8795d--t8zgr-eth0", GenerateName:"whisker-5699f8795d-", Namespace:"calico-system", SelfLink:"", UID:"2b83e013-d851-44a0-bec0-401bd52e7319", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5699f8795d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed", Pod:"whisker-5699f8795d-t8zgr", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali13bb5b132e6", MAC:"52:9e:ed:26:db:ca", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:49.669539 containerd[1537]: 2025-09-09 04:51:49.666 [INFO][3861] cni-plugin/k8s.go 532: 
Wrote updated endpoint to datastore ContainerID="c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" Namespace="calico-system" Pod="whisker-5699f8795d-t8zgr" WorkloadEndpoint="localhost-k8s-whisker--5699f8795d--t8zgr-eth0" Sep 9 04:51:49.712689 containerd[1537]: time="2025-09-09T04:51:49.712649707Z" level=info msg="connecting to shim c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed" address="unix:///run/containerd/s/87c64d764be5b9db33cc7d99b8e923eba003be632f9350995891a655ddb66dc4" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:49.737325 systemd[1]: Started cri-containerd-c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed.scope - libcontainer container c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed. Sep 9 04:51:49.756173 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:49.780780 containerd[1537]: time="2025-09-09T04:51:49.780690249Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5699f8795d-t8zgr,Uid:2b83e013-d851-44a0-bec0-401bd52e7319,Namespace:calico-system,Attempt:0,} returns sandbox id \"c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed\"" Sep 9 04:51:49.783182 containerd[1537]: time="2025-09-09T04:51:49.783044865Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 04:51:49.830209 kubelet[2675]: I0909 04:51:49.830131 2675 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a892bb-e9a9-4c59-b890-d0eb092f4236" path="/var/lib/kubelet/pods/c8a892bb-e9a9-4c59-b890-d0eb092f4236/volumes" Sep 9 04:51:50.041344 containerd[1537]: time="2025-09-09T04:51:50.038912092Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da39557d4f0fdb8c571d6ae045641f3da76b4469821c49c874539c17932f697e\" id:\"a3221b7485fda1bc59870788ff81c66ea9a00076cb36dbd630199c7d350f544f\" pid:4045 exit_status:1 exited_at:{seconds:1757393510 nanos:38391804}" 
Sep 9 04:51:50.689175 containerd[1537]: time="2025-09-09T04:51:50.688496549Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:50.689175 containerd[1537]: time="2025-09-09T04:51:50.688980952Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 9 04:51:50.689843 containerd[1537]: time="2025-09-09T04:51:50.689814974Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:50.697854 containerd[1537]: time="2025-09-09T04:51:50.696609334Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:50.697854 containerd[1537]: time="2025-09-09T04:51:50.697516489Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 914.428697ms" Sep 9 04:51:50.697854 containerd[1537]: time="2025-09-09T04:51:50.697555816Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 9 04:51:50.700437 containerd[1537]: time="2025-09-09T04:51:50.700402382Z" level=info msg="CreateContainer within sandbox \"c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 04:51:50.708396 containerd[1537]: time="2025-09-09T04:51:50.708350060Z" level=info 
msg="Container 5971f31eed9df808bfecbec57bbeead6f87c73c7411959f9cb7d6a12ec0c2872: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:50.716129 containerd[1537]: time="2025-09-09T04:51:50.716086341Z" level=info msg="CreateContainer within sandbox \"c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"5971f31eed9df808bfecbec57bbeead6f87c73c7411959f9cb7d6a12ec0c2872\"" Sep 9 04:51:50.716554 containerd[1537]: time="2025-09-09T04:51:50.716521455Z" level=info msg="StartContainer for \"5971f31eed9df808bfecbec57bbeead6f87c73c7411959f9cb7d6a12ec0c2872\"" Sep 9 04:51:50.717541 containerd[1537]: time="2025-09-09T04:51:50.717504303Z" level=info msg="connecting to shim 5971f31eed9df808bfecbec57bbeead6f87c73c7411959f9cb7d6a12ec0c2872" address="unix:///run/containerd/s/87c64d764be5b9db33cc7d99b8e923eba003be632f9350995891a655ddb66dc4" protocol=ttrpc version=3 Sep 9 04:51:50.739365 systemd[1]: Started cri-containerd-5971f31eed9df808bfecbec57bbeead6f87c73c7411959f9cb7d6a12ec0c2872.scope - libcontainer container 5971f31eed9df808bfecbec57bbeead6f87c73c7411959f9cb7d6a12ec0c2872. Sep 9 04:51:50.773037 containerd[1537]: time="2025-09-09T04:51:50.772802627Z" level=info msg="StartContainer for \"5971f31eed9df808bfecbec57bbeead6f87c73c7411959f9cb7d6a12ec0c2872\" returns successfully" Sep 9 04:51:50.775207 containerd[1537]: time="2025-09-09T04:51:50.775181273Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 04:51:50.930291 systemd-networkd[1443]: cali13bb5b132e6: Gained IPv6LL Sep 9 04:51:51.955575 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4289371700.mount: Deactivated successfully. 
Sep 9 04:51:51.970697 containerd[1537]: time="2025-09-09T04:51:51.970307722Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:51.970996 containerd[1537]: time="2025-09-09T04:51:51.970746195Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 9 04:51:51.971474 containerd[1537]: time="2025-09-09T04:51:51.971444070Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:51.974220 containerd[1537]: time="2025-09-09T04:51:51.974191164Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:51.975243 containerd[1537]: time="2025-09-09T04:51:51.975216694Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.200002375s" Sep 9 04:51:51.975412 containerd[1537]: time="2025-09-09T04:51:51.975328312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 9 04:51:51.978794 containerd[1537]: time="2025-09-09T04:51:51.978764000Z" level=info msg="CreateContainer within sandbox \"c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 04:51:51.986364 
containerd[1537]: time="2025-09-09T04:51:51.984694900Z" level=info msg="Container effe71aaff8654c0abfaba537e9eb831761101ba325ec2ba49ecca3ec9d90b11: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:51.992605 containerd[1537]: time="2025-09-09T04:51:51.992559079Z" level=info msg="CreateContainer within sandbox \"c89895016474ca21f78e071da0e28fc9e94894cc3c93bb84d0255995e64cc5ed\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"effe71aaff8654c0abfaba537e9eb831761101ba325ec2ba49ecca3ec9d90b11\"" Sep 9 04:51:51.993088 containerd[1537]: time="2025-09-09T04:51:51.993057041Z" level=info msg="StartContainer for \"effe71aaff8654c0abfaba537e9eb831761101ba325ec2ba49ecca3ec9d90b11\"" Sep 9 04:51:51.995311 containerd[1537]: time="2025-09-09T04:51:51.995281289Z" level=info msg="connecting to shim effe71aaff8654c0abfaba537e9eb831761101ba325ec2ba49ecca3ec9d90b11" address="unix:///run/containerd/s/87c64d764be5b9db33cc7d99b8e923eba003be632f9350995891a655ddb66dc4" protocol=ttrpc version=3 Sep 9 04:51:52.020318 systemd[1]: Started cri-containerd-effe71aaff8654c0abfaba537e9eb831761101ba325ec2ba49ecca3ec9d90b11.scope - libcontainer container effe71aaff8654c0abfaba537e9eb831761101ba325ec2ba49ecca3ec9d90b11. 
Sep 9 04:51:52.070261 containerd[1537]: time="2025-09-09T04:51:52.070222273Z" level=info msg="StartContainer for \"effe71aaff8654c0abfaba537e9eb831761101ba325ec2ba49ecca3ec9d90b11\" returns successfully" Sep 9 04:51:53.009632 kubelet[2675]: I0909 04:51:53.009566 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5699f8795d-t8zgr" podStartSLOduration=2.816069308 podStartE2EDuration="5.009548999s" podCreationTimestamp="2025-09-09 04:51:48 +0000 UTC" firstStartedPulling="2025-09-09 04:51:49.78244784 +0000 UTC m=+32.048609415" lastFinishedPulling="2025-09-09 04:51:51.975927531 +0000 UTC m=+34.242089106" observedRunningTime="2025-09-09 04:51:52.979100805 +0000 UTC m=+35.245262380" watchObservedRunningTime="2025-09-09 04:51:53.009548999 +0000 UTC m=+35.275710574" Sep 9 04:51:54.828995 containerd[1537]: time="2025-09-09T04:51:54.828932234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j8z42,Uid:c6c8756f-3aea-476b-8d23-25f7676c20c7,Namespace:kube-system,Attempt:0,}" Sep 9 04:51:54.946945 systemd-networkd[1443]: cali6a78b068e4f: Link UP Sep 9 04:51:54.947910 systemd-networkd[1443]: cali6a78b068e4f: Gained carrier Sep 9 04:51:54.961689 containerd[1537]: 2025-09-09 04:51:54.853 [INFO][4242] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:51:54.961689 containerd[1537]: 2025-09-09 04:51:54.867 [INFO][4242] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--j8z42-eth0 coredns-668d6bf9bc- kube-system c6c8756f-3aea-476b-8d23-25f7676c20c7 781 0 2025-09-09 04:51:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-j8z42 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6a78b068e4f [{dns UDP 53 0 } 
{dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" Namespace="kube-system" Pod="coredns-668d6bf9bc-j8z42" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j8z42-" Sep 9 04:51:54.961689 containerd[1537]: 2025-09-09 04:51:54.867 [INFO][4242] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" Namespace="kube-system" Pod="coredns-668d6bf9bc-j8z42" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j8z42-eth0" Sep 9 04:51:54.961689 containerd[1537]: 2025-09-09 04:51:54.897 [INFO][4256] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" HandleID="k8s-pod-network.19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" Workload="localhost-k8s-coredns--668d6bf9bc--j8z42-eth0" Sep 9 04:51:54.961886 containerd[1537]: 2025-09-09 04:51:54.898 [INFO][4256] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" HandleID="k8s-pod-network.19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" Workload="localhost-k8s-coredns--668d6bf9bc--j8z42-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003ac1c0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-j8z42", "timestamp":"2025-09-09 04:51:54.897843012 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:54.961886 containerd[1537]: 2025-09-09 04:51:54.898 [INFO][4256] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 9 04:51:54.961886 containerd[1537]: 2025-09-09 04:51:54.898 [INFO][4256] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:51:54.961886 containerd[1537]: 2025-09-09 04:51:54.898 [INFO][4256] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:54.961886 containerd[1537]: 2025-09-09 04:51:54.910 [INFO][4256] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" host="localhost" Sep 9 04:51:54.961886 containerd[1537]: 2025-09-09 04:51:54.919 [INFO][4256] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:54.961886 containerd[1537]: 2025-09-09 04:51:54.924 [INFO][4256] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:54.961886 containerd[1537]: 2025-09-09 04:51:54.927 [INFO][4256] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:54.961886 containerd[1537]: 2025-09-09 04:51:54.930 [INFO][4256] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:54.961886 containerd[1537]: 2025-09-09 04:51:54.930 [INFO][4256] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" host="localhost" Sep 9 04:51:54.962091 containerd[1537]: 2025-09-09 04:51:54.932 [INFO][4256] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2 Sep 9 04:51:54.962091 containerd[1537]: 2025-09-09 04:51:54.935 [INFO][4256] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" host="localhost" Sep 9 04:51:54.962091 containerd[1537]: 2025-09-09 04:51:54.942 [INFO][4256] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" host="localhost" Sep 9 04:51:54.962091 containerd[1537]: 2025-09-09 04:51:54.942 [INFO][4256] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" host="localhost" Sep 9 04:51:54.962091 containerd[1537]: 2025-09-09 04:51:54.942 [INFO][4256] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:51:54.962091 containerd[1537]: 2025-09-09 04:51:54.942 [INFO][4256] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" HandleID="k8s-pod-network.19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" Workload="localhost-k8s-coredns--668d6bf9bc--j8z42-eth0" Sep 9 04:51:54.962236 containerd[1537]: 2025-09-09 04:51:54.944 [INFO][4242] cni-plugin/k8s.go 418: Populated endpoint ContainerID="19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" Namespace="kube-system" Pod="coredns-668d6bf9bc-j8z42" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j8z42-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--j8z42-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"c6c8756f-3aea-476b-8d23-25f7676c20c7", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, 
Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-j8z42", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6a78b068e4f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:54.962311 containerd[1537]: 2025-09-09 04:51:54.944 [INFO][4242] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" Namespace="kube-system" Pod="coredns-668d6bf9bc-j8z42" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j8z42-eth0" Sep 9 04:51:54.962311 containerd[1537]: 2025-09-09 04:51:54.944 [INFO][4242] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6a78b068e4f ContainerID="19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" Namespace="kube-system" Pod="coredns-668d6bf9bc-j8z42" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j8z42-eth0" Sep 9 04:51:54.962311 containerd[1537]: 2025-09-09 04:51:54.948 [INFO][4242] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" 
Namespace="kube-system" Pod="coredns-668d6bf9bc-j8z42" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j8z42-eth0" Sep 9 04:51:54.962376 containerd[1537]: 2025-09-09 04:51:54.948 [INFO][4242] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" Namespace="kube-system" Pod="coredns-668d6bf9bc-j8z42" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j8z42-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--j8z42-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"c6c8756f-3aea-476b-8d23-25f7676c20c7", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2", Pod:"coredns-668d6bf9bc-j8z42", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6a78b068e4f", MAC:"52:8a:dc:92:0f:70", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:54.962376 containerd[1537]: 2025-09-09 04:51:54.958 [INFO][4242] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" Namespace="kube-system" Pod="coredns-668d6bf9bc-j8z42" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--j8z42-eth0" Sep 9 04:51:54.981286 containerd[1537]: time="2025-09-09T04:51:54.981246332Z" level=info msg="connecting to shim 19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2" address="unix:///run/containerd/s/8485fb3b541b98acb37957f4292625aa9c96e58c831816068be8b06da6ffdcf0" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:55.005346 systemd[1]: Started cri-containerd-19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2.scope - libcontainer container 19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2. 
Sep 9 04:51:55.018293 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:55.040798 containerd[1537]: time="2025-09-09T04:51:55.040760320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j8z42,Uid:c6c8756f-3aea-476b-8d23-25f7676c20c7,Namespace:kube-system,Attempt:0,} returns sandbox id \"19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2\"" Sep 9 04:51:55.043905 containerd[1537]: time="2025-09-09T04:51:55.043877296Z" level=info msg="CreateContainer within sandbox \"19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 04:51:55.055966 containerd[1537]: time="2025-09-09T04:51:55.055917857Z" level=info msg="Container a9f1db4ece4b0c32e27a4ec8e841fc94bd430297a2d858c983737617d8567565: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:55.060401 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1168905640.mount: Deactivated successfully. 
Sep 9 04:51:55.069013 containerd[1537]: time="2025-09-09T04:51:55.068660842Z" level=info msg="CreateContainer within sandbox \"19bc6cf018e943f423e4c8eb2bde1101b2373629e3129d1e24e408eb7ac36ec2\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a9f1db4ece4b0c32e27a4ec8e841fc94bd430297a2d858c983737617d8567565\"" Sep 9 04:51:55.069940 containerd[1537]: time="2025-09-09T04:51:55.069904224Z" level=info msg="StartContainer for \"a9f1db4ece4b0c32e27a4ec8e841fc94bd430297a2d858c983737617d8567565\"" Sep 9 04:51:55.071022 containerd[1537]: time="2025-09-09T04:51:55.070991463Z" level=info msg="connecting to shim a9f1db4ece4b0c32e27a4ec8e841fc94bd430297a2d858c983737617d8567565" address="unix:///run/containerd/s/8485fb3b541b98acb37957f4292625aa9c96e58c831816068be8b06da6ffdcf0" protocol=ttrpc version=3 Sep 9 04:51:55.101317 systemd[1]: Started cri-containerd-a9f1db4ece4b0c32e27a4ec8e841fc94bd430297a2d858c983737617d8567565.scope - libcontainer container a9f1db4ece4b0c32e27a4ec8e841fc94bd430297a2d858c983737617d8567565. 
Sep 9 04:51:55.130081 containerd[1537]: time="2025-09-09T04:51:55.130021539Z" level=info msg="StartContainer for \"a9f1db4ece4b0c32e27a4ec8e841fc94bd430297a2d858c983737617d8567565\" returns successfully" Sep 9 04:51:55.826737 containerd[1537]: time="2025-09-09T04:51:55.826693707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c8977fc6c-t4qjf,Uid:b37d8bb0-5597-4873-af34-4b17c8a8ba2d,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:51:55.826869 containerd[1537]: time="2025-09-09T04:51:55.826747435Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75b67558cc-6p5fq,Uid:21127a09-5bac-4c83-afd6-43950e57c418,Namespace:calico-system,Attempt:0,}" Sep 9 04:51:55.934244 systemd-networkd[1443]: calife93e19878e: Link UP Sep 9 04:51:55.934667 systemd-networkd[1443]: calife93e19878e: Gained carrier Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.849 [INFO][4376] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.865 [INFO][4376] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0 calico-apiserver-c8977fc6c- calico-apiserver b37d8bb0-5597-4873-af34-4b17c8a8ba2d 788 0 2025-09-09 04:51:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:c8977fc6c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-c8977fc6c-t4qjf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calife93e19878e [] [] }} ContainerID="88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-t4qjf" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-" Sep 
9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.867 [INFO][4376] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-t4qjf" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0" Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.893 [INFO][4406] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" HandleID="k8s-pod-network.88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" Workload="localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0" Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.894 [INFO][4406] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" HandleID="k8s-pod-network.88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" Workload="localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3160), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-c8977fc6c-t4qjf", "timestamp":"2025-09-09 04:51:55.893871215 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.894 [INFO][4406] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.894 [INFO][4406] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.894 [INFO][4406] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.903 [INFO][4406] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" host="localhost" Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.907 [INFO][4406] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.913 [INFO][4406] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.914 [INFO][4406] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.916 [INFO][4406] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.916 [INFO][4406] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" host="localhost" Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.918 [INFO][4406] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711 Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.921 [INFO][4406] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" host="localhost" Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.926 [INFO][4406] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" host="localhost" Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.926 [INFO][4406] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" host="localhost" Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.926 [INFO][4406] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:51:55.944321 containerd[1537]: 2025-09-09 04:51:55.926 [INFO][4406] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" HandleID="k8s-pod-network.88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" Workload="localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0" Sep 9 04:51:55.944968 containerd[1537]: 2025-09-09 04:51:55.929 [INFO][4376] cni-plugin/k8s.go 418: Populated endpoint ContainerID="88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-t4qjf" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0", GenerateName:"calico-apiserver-c8977fc6c-", Namespace:"calico-apiserver", SelfLink:"", UID:"b37d8bb0-5597-4873-af34-4b17c8a8ba2d", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c8977fc6c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-c8977fc6c-t4qjf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calife93e19878e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:55.944968 containerd[1537]: 2025-09-09 04:51:55.930 [INFO][4376] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-t4qjf" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0" Sep 9 04:51:55.944968 containerd[1537]: 2025-09-09 04:51:55.930 [INFO][4376] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calife93e19878e ContainerID="88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-t4qjf" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0" Sep 9 04:51:55.944968 containerd[1537]: 2025-09-09 04:51:55.934 [INFO][4376] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-t4qjf" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0" Sep 9 04:51:55.944968 containerd[1537]: 2025-09-09 04:51:55.934 [INFO][4376] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-t4qjf" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0", GenerateName:"calico-apiserver-c8977fc6c-", Namespace:"calico-apiserver", SelfLink:"", UID:"b37d8bb0-5597-4873-af34-4b17c8a8ba2d", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c8977fc6c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711", Pod:"calico-apiserver-c8977fc6c-t4qjf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calife93e19878e", MAC:"2a:cb:75:de:f1:0e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:55.944968 containerd[1537]: 2025-09-09 04:51:55.942 [INFO][4376] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-t4qjf" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--t4qjf-eth0" Sep 9 04:51:55.966801 containerd[1537]: time="2025-09-09T04:51:55.966761480Z" level=info msg="connecting to shim 88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711" address="unix:///run/containerd/s/c5d9dbc9327fe04b4a651dc29ad5cb6f35452f3bb58235a1ea496066061cbddc" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:55.989515 kubelet[2675]: I0909 04:51:55.988779 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-j8z42" podStartSLOduration=32.988763739 podStartE2EDuration="32.988763739s" podCreationTimestamp="2025-09-09 04:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:51:55.987656057 +0000 UTC m=+38.253817632" watchObservedRunningTime="2025-09-09 04:51:55.988763739 +0000 UTC m=+38.254925314" Sep 9 04:51:55.994314 systemd[1]: Started cri-containerd-88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711.scope - libcontainer container 88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711. 
Sep 9 04:51:56.017807 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:56.041021 systemd-networkd[1443]: cali2d42f1f51da: Link UP Sep 9 04:51:56.041176 systemd-networkd[1443]: cali2d42f1f51da: Gained carrier Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:55.852 [INFO][4386] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:55.867 [INFO][4386] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0 calico-kube-controllers-75b67558cc- calico-system 21127a09-5bac-4c83-afd6-43950e57c418 787 0 2025-09-09 04:51:36 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:75b67558cc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-75b67558cc-6p5fq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali2d42f1f51da [] [] }} ContainerID="b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" Namespace="calico-system" Pod="calico-kube-controllers-75b67558cc-6p5fq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-" Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:55.867 [INFO][4386] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" Namespace="calico-system" Pod="calico-kube-controllers-75b67558cc-6p5fq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0" Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:55.896 [INFO][4410] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" HandleID="k8s-pod-network.b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" Workload="localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0" Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:55.896 [INFO][4410] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" HandleID="k8s-pod-network.b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" Workload="localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c270), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-75b67558cc-6p5fq", "timestamp":"2025-09-09 04:51:55.896183474 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:55.896 [INFO][4410] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:55.926 [INFO][4410] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:55.927 [INFO][4410] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:56.005 [INFO][4410] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" host="localhost" Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:56.011 [INFO][4410] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:56.017 [INFO][4410] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:56.018 [INFO][4410] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:56.021 [INFO][4410] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:56.021 [INFO][4410] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" host="localhost" Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:56.023 [INFO][4410] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90 Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:56.027 [INFO][4410] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" host="localhost" Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:56.033 [INFO][4410] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" host="localhost" Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:56.033 [INFO][4410] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" host="localhost" Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:56.033 [INFO][4410] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:51:56.053277 containerd[1537]: 2025-09-09 04:51:56.033 [INFO][4410] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" HandleID="k8s-pod-network.b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" Workload="localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0" Sep 9 04:51:56.053771 containerd[1537]: 2025-09-09 04:51:56.039 [INFO][4386] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" Namespace="calico-system" Pod="calico-kube-controllers-75b67558cc-6p5fq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0", GenerateName:"calico-kube-controllers-75b67558cc-", Namespace:"calico-system", SelfLink:"", UID:"21127a09-5bac-4c83-afd6-43950e57c418", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"75b67558cc", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-75b67558cc-6p5fq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2d42f1f51da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:56.053771 containerd[1537]: 2025-09-09 04:51:56.039 [INFO][4386] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" Namespace="calico-system" Pod="calico-kube-controllers-75b67558cc-6p5fq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0" Sep 9 04:51:56.053771 containerd[1537]: 2025-09-09 04:51:56.039 [INFO][4386] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2d42f1f51da ContainerID="b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" Namespace="calico-system" Pod="calico-kube-controllers-75b67558cc-6p5fq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0" Sep 9 04:51:56.053771 containerd[1537]: 2025-09-09 04:51:56.041 [INFO][4386] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" Namespace="calico-system" Pod="calico-kube-controllers-75b67558cc-6p5fq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0" Sep 9 04:51:56.053771 containerd[1537]: 2025-09-09 
04:51:56.042 [INFO][4386] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" Namespace="calico-system" Pod="calico-kube-controllers-75b67558cc-6p5fq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0", GenerateName:"calico-kube-controllers-75b67558cc-", Namespace:"calico-system", SelfLink:"", UID:"21127a09-5bac-4c83-afd6-43950e57c418", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"75b67558cc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90", Pod:"calico-kube-controllers-75b67558cc-6p5fq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2d42f1f51da", MAC:"86:e6:af:49:cd:69", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:56.053771 containerd[1537]: 2025-09-09 
04:51:56.051 [INFO][4386] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" Namespace="calico-system" Pod="calico-kube-controllers-75b67558cc-6p5fq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75b67558cc--6p5fq-eth0" Sep 9 04:51:56.074305 containerd[1537]: time="2025-09-09T04:51:56.074220908Z" level=info msg="connecting to shim b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90" address="unix:///run/containerd/s/d4a8017705fd1e0ee6d8acbdaf20bd7f5be9498fea644bcfed763dd213bc72d8" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:56.075306 containerd[1537]: time="2025-09-09T04:51:56.075279739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c8977fc6c-t4qjf,Uid:b37d8bb0-5597-4873-af34-4b17c8a8ba2d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711\"" Sep 9 04:51:56.077125 containerd[1537]: time="2025-09-09T04:51:56.076962298Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 04:51:56.113377 systemd[1]: Started cri-containerd-b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90.scope - libcontainer container b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90. 
Sep 9 04:51:56.124434 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:56.144322 containerd[1537]: time="2025-09-09T04:51:56.144287478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75b67558cc-6p5fq,Uid:21127a09-5bac-4c83-afd6-43950e57c418,Namespace:calico-system,Attempt:0,} returns sandbox id \"b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90\"" Sep 9 04:51:56.626319 systemd-networkd[1443]: cali6a78b068e4f: Gained IPv6LL Sep 9 04:51:56.828183 containerd[1537]: time="2025-09-09T04:51:56.828119180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c8977fc6c-ltdk2,Uid:8ba30964-2505-4075-b0a9-71a192936193,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:51:56.994568 systemd-networkd[1443]: cali694ec913280: Link UP Sep 9 04:51:56.994786 systemd-networkd[1443]: cali694ec913280: Gained carrier Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.864 [INFO][4555] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.892 [INFO][4555] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0 calico-apiserver-c8977fc6c- calico-apiserver 8ba30964-2505-4075-b0a9-71a192936193 791 0 2025-09-09 04:51:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:c8977fc6c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-c8977fc6c-ltdk2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali694ec913280 [] [] }} ContainerID="a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" Namespace="calico-apiserver" 
Pod="calico-apiserver-c8977fc6c-ltdk2" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-" Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.892 [INFO][4555] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-ltdk2" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0" Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.927 [INFO][4570] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" HandleID="k8s-pod-network.a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" Workload="localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0" Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.927 [INFO][4570] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" HandleID="k8s-pod-network.a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" Workload="localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dca00), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-c8977fc6c-ltdk2", "timestamp":"2025-09-09 04:51:56.927256487 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.927 [INFO][4570] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.927 [INFO][4570] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.927 [INFO][4570] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.950 [INFO][4570] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" host="localhost" Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.959 [INFO][4570] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.964 [INFO][4570] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.966 [INFO][4570] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.968 [INFO][4570] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.968 [INFO][4570] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" host="localhost" Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.971 [INFO][4570] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8 Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.977 [INFO][4570] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" host="localhost" Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.988 [INFO][4570] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" host="localhost" Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.988 [INFO][4570] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" host="localhost" Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.988 [INFO][4570] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:51:57.010414 containerd[1537]: 2025-09-09 04:51:56.988 [INFO][4570] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" HandleID="k8s-pod-network.a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" Workload="localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0" Sep 9 04:51:57.011120 containerd[1537]: 2025-09-09 04:51:56.992 [INFO][4555] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-ltdk2" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0", GenerateName:"calico-apiserver-c8977fc6c-", Namespace:"calico-apiserver", SelfLink:"", UID:"8ba30964-2505-4075-b0a9-71a192936193", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c8977fc6c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-c8977fc6c-ltdk2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali694ec913280", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:57.011120 containerd[1537]: 2025-09-09 04:51:56.992 [INFO][4555] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-ltdk2" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0" Sep 9 04:51:57.011120 containerd[1537]: 2025-09-09 04:51:56.992 [INFO][4555] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali694ec913280 ContainerID="a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-ltdk2" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0" Sep 9 04:51:57.011120 containerd[1537]: 2025-09-09 04:51:56.994 [INFO][4555] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-ltdk2" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0" Sep 9 04:51:57.011120 containerd[1537]: 2025-09-09 04:51:56.995 [INFO][4555] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-ltdk2" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0", GenerateName:"calico-apiserver-c8977fc6c-", Namespace:"calico-apiserver", SelfLink:"", UID:"8ba30964-2505-4075-b0a9-71a192936193", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c8977fc6c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8", Pod:"calico-apiserver-c8977fc6c-ltdk2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali694ec913280", MAC:"f6:0e:e8:84:ac:4d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:57.011120 containerd[1537]: 2025-09-09 04:51:57.005 [INFO][4555] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" Namespace="calico-apiserver" Pod="calico-apiserver-c8977fc6c-ltdk2" WorkloadEndpoint="localhost-k8s-calico--apiserver--c8977fc6c--ltdk2-eth0" Sep 9 04:51:57.037218 containerd[1537]: time="2025-09-09T04:51:57.037172991Z" level=info msg="connecting to shim a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8" address="unix:///run/containerd/s/78b6d93d8d1745b58742084389cfa8c80bdb5bc13d4ed280bd69604f26bd02be" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:57.072351 systemd[1]: Started cri-containerd-a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8.scope - libcontainer container a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8. Sep 9 04:51:57.084447 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:57.118098 systemd[1]: Started sshd@7-10.0.0.32:22-10.0.0.1:57204.service - OpenSSH per-connection server daemon (10.0.0.1:57204). Sep 9 04:51:57.139760 containerd[1537]: time="2025-09-09T04:51:57.139723717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c8977fc6c-ltdk2,Uid:8ba30964-2505-4075-b0a9-71a192936193,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8\"" Sep 9 04:51:57.191744 sshd[4634]: Accepted publickey for core from 10.0.0.1 port 57204 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 04:51:57.194546 sshd-session[4634]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:51:57.200890 systemd-logind[1514]: New session 8 of user core. Sep 9 04:51:57.202237 systemd-networkd[1443]: cali2d42f1f51da: Gained IPv6LL Sep 9 04:51:57.205313 systemd[1]: Started session-8.scope - Session 8 of User core. 
Sep 9 04:51:57.431769 sshd[4638]: Connection closed by 10.0.0.1 port 57204 Sep 9 04:51:57.431669 sshd-session[4634]: pam_unix(sshd:session): session closed for user core Sep 9 04:51:57.435561 systemd[1]: sshd@7-10.0.0.32:22-10.0.0.1:57204.service: Deactivated successfully. Sep 9 04:51:57.439862 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 04:51:57.441371 systemd-logind[1514]: Session 8 logged out. Waiting for processes to exit. Sep 9 04:51:57.442981 systemd-logind[1514]: Removed session 8. Sep 9 04:51:57.522410 systemd-networkd[1443]: calife93e19878e: Gained IPv6LL Sep 9 04:51:57.765456 containerd[1537]: time="2025-09-09T04:51:57.765351181Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:57.766650 containerd[1537]: time="2025-09-09T04:51:57.766618557Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 9 04:51:57.767801 containerd[1537]: time="2025-09-09T04:51:57.767552806Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:57.770057 containerd[1537]: time="2025-09-09T04:51:57.770016227Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:57.770821 containerd[1537]: time="2025-09-09T04:51:57.770791295Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 1.693794151s" 
Sep 9 04:51:57.770940 containerd[1537]: time="2025-09-09T04:51:57.770923633Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 9 04:51:57.772056 containerd[1537]: time="2025-09-09T04:51:57.772034747Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 04:51:57.774067 containerd[1537]: time="2025-09-09T04:51:57.774034304Z" level=info msg="CreateContainer within sandbox \"88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 04:51:57.782619 containerd[1537]: time="2025-09-09T04:51:57.780196158Z" level=info msg="Container 0f6aa855c284208b2c055c21b64754b5bc341293d20e8ec468713255decdf954: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:57.788728 containerd[1537]: time="2025-09-09T04:51:57.788613564Z" level=info msg="CreateContainer within sandbox \"88f49aa33344649063625868629fcc9d35da9df884695d82d433a47c7e561711\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"0f6aa855c284208b2c055c21b64754b5bc341293d20e8ec468713255decdf954\"" Sep 9 04:51:57.789117 containerd[1537]: time="2025-09-09T04:51:57.789091070Z" level=info msg="StartContainer for \"0f6aa855c284208b2c055c21b64754b5bc341293d20e8ec468713255decdf954\"" Sep 9 04:51:57.790383 containerd[1537]: time="2025-09-09T04:51:57.790355245Z" level=info msg="connecting to shim 0f6aa855c284208b2c055c21b64754b5bc341293d20e8ec468713255decdf954" address="unix:///run/containerd/s/c5d9dbc9327fe04b4a651dc29ad5cb6f35452f3bb58235a1ea496066061cbddc" protocol=ttrpc version=3 Sep 9 04:51:57.811321 systemd[1]: Started cri-containerd-0f6aa855c284208b2c055c21b64754b5bc341293d20e8ec468713255decdf954.scope - libcontainer container 0f6aa855c284208b2c055c21b64754b5bc341293d20e8ec468713255decdf954. 
Sep 9 04:51:57.827286 containerd[1537]: time="2025-09-09T04:51:57.827247275Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pz4n4,Uid:093f1e96-6646-4411-bb5b-eecbd26e4d17,Namespace:calico-system,Attempt:0,}" Sep 9 04:51:57.859108 containerd[1537]: time="2025-09-09T04:51:57.859069123Z" level=info msg="StartContainer for \"0f6aa855c284208b2c055c21b64754b5bc341293d20e8ec468713255decdf954\" returns successfully" Sep 9 04:51:57.960275 systemd-networkd[1443]: cali9c544412e05: Link UP Sep 9 04:51:57.960475 systemd-networkd[1443]: cali9c544412e05: Gained carrier Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.858 [INFO][4703] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.871 [INFO][4703] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--pz4n4-eth0 csi-node-driver- calico-system 093f1e96-6646-4411-bb5b-eecbd26e4d17 686 0 2025-09-09 04:51:36 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-pz4n4 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali9c544412e05 [] [] }} ContainerID="1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" Namespace="calico-system" Pod="csi-node-driver-pz4n4" WorkloadEndpoint="localhost-k8s-csi--node--driver--pz4n4-" Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.871 [INFO][4703] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" Namespace="calico-system" Pod="csi-node-driver-pz4n4" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--pz4n4-eth0" Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.896 [INFO][4731] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" HandleID="k8s-pod-network.1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" Workload="localhost-k8s-csi--node--driver--pz4n4-eth0" Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.896 [INFO][4731] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" HandleID="k8s-pod-network.1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" Workload="localhost-k8s-csi--node--driver--pz4n4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000435150), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-pz4n4", "timestamp":"2025-09-09 04:51:57.896359249 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.896 [INFO][4731] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.896 [INFO][4731] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.896 [INFO][4731] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.908 [INFO][4731] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" host="localhost" Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.915 [INFO][4731] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.920 [INFO][4731] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.922 [INFO][4731] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.924 [INFO][4731] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.924 [INFO][4731] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" host="localhost" Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.925 [INFO][4731] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.936 [INFO][4731] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" host="localhost" Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.955 [INFO][4731] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" host="localhost" Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.955 [INFO][4731] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" host="localhost" Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.955 [INFO][4731] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:51:58.076468 containerd[1537]: 2025-09-09 04:51:57.955 [INFO][4731] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" HandleID="k8s-pod-network.1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" Workload="localhost-k8s-csi--node--driver--pz4n4-eth0" Sep 9 04:51:58.077281 containerd[1537]: 2025-09-09 04:51:57.957 [INFO][4703] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" Namespace="calico-system" Pod="csi-node-driver-pz4n4" WorkloadEndpoint="localhost-k8s-csi--node--driver--pz4n4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--pz4n4-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"093f1e96-6646-4411-bb5b-eecbd26e4d17", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-pz4n4", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9c544412e05", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:58.077281 containerd[1537]: 2025-09-09 04:51:57.957 [INFO][4703] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" Namespace="calico-system" Pod="csi-node-driver-pz4n4" WorkloadEndpoint="localhost-k8s-csi--node--driver--pz4n4-eth0" Sep 9 04:51:58.077281 containerd[1537]: 2025-09-09 04:51:57.957 [INFO][4703] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9c544412e05 ContainerID="1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" Namespace="calico-system" Pod="csi-node-driver-pz4n4" WorkloadEndpoint="localhost-k8s-csi--node--driver--pz4n4-eth0" Sep 9 04:51:58.077281 containerd[1537]: 2025-09-09 04:51:57.959 [INFO][4703] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" Namespace="calico-system" Pod="csi-node-driver-pz4n4" WorkloadEndpoint="localhost-k8s-csi--node--driver--pz4n4-eth0" Sep 9 04:51:58.077281 containerd[1537]: 2025-09-09 04:51:57.960 [INFO][4703] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" 
Namespace="calico-system" Pod="csi-node-driver-pz4n4" WorkloadEndpoint="localhost-k8s-csi--node--driver--pz4n4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--pz4n4-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"093f1e96-6646-4411-bb5b-eecbd26e4d17", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b", Pod:"csi-node-driver-pz4n4", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9c544412e05", MAC:"26:a3:89:35:a4:b5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:58.077281 containerd[1537]: 2025-09-09 04:51:58.073 [INFO][4703] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" Namespace="calico-system" Pod="csi-node-driver-pz4n4" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--pz4n4-eth0" Sep 9 04:51:58.342811 containerd[1537]: time="2025-09-09T04:51:58.342606180Z" level=info msg="connecting to shim 1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b" address="unix:///run/containerd/s/e34e426b39f1e2c26a179d761a71f0d1e44b039977796637a460308518a73d0d" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:58.369299 systemd[1]: Started cri-containerd-1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b.scope - libcontainer container 1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b. Sep 9 04:51:58.379061 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:58.391128 containerd[1537]: time="2025-09-09T04:51:58.391087924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pz4n4,Uid:093f1e96-6646-4411-bb5b-eecbd26e4d17,Namespace:calico-system,Attempt:0,} returns sandbox id \"1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b\"" Sep 9 04:51:58.806618 systemd-networkd[1443]: cali694ec913280: Gained IPv6LL Sep 9 04:51:58.827010 containerd[1537]: time="2025-09-09T04:51:58.826680487Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xmwf5,Uid:46872eaa-5736-4fd1-abbb-412aa5718fa0,Namespace:calico-system,Attempt:0,}" Sep 9 04:51:58.968264 kubelet[2675]: I0909 04:51:58.967943 2675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:51:58.984995 kubelet[2675]: I0909 04:51:58.983418 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-c8977fc6c-t4qjf" podStartSLOduration=26.288183693 podStartE2EDuration="27.983401483s" podCreationTimestamp="2025-09-09 04:51:31 +0000 UTC" firstStartedPulling="2025-09-09 04:51:56.07669298 +0000 UTC m=+38.342854555" lastFinishedPulling="2025-09-09 04:51:57.77191081 +0000 UTC m=+40.038072345" 
observedRunningTime="2025-09-09 04:51:58.094832212 +0000 UTC m=+40.360993827" watchObservedRunningTime="2025-09-09 04:51:58.983401483 +0000 UTC m=+41.249563018" Sep 9 04:51:59.145016 systemd-networkd[1443]: calie05d008b5f2: Link UP Sep 9 04:51:59.145223 systemd-networkd[1443]: calie05d008b5f2: Gained carrier Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.019 [INFO][4830] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.052 [INFO][4830] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--xmwf5-eth0 goldmane-54d579b49d- calico-system 46872eaa-5736-4fd1-abbb-412aa5718fa0 789 0 2025-09-09 04:51:35 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-xmwf5 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calie05d008b5f2 [] [] }} ContainerID="6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" Namespace="calico-system" Pod="goldmane-54d579b49d-xmwf5" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xmwf5-" Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.052 [INFO][4830] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" Namespace="calico-system" Pod="goldmane-54d579b49d-xmwf5" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xmwf5-eth0" Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.089 [INFO][4849] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" HandleID="k8s-pod-network.6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" 
Workload="localhost-k8s-goldmane--54d579b49d--xmwf5-eth0" Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.089 [INFO][4849] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" HandleID="k8s-pod-network.6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" Workload="localhost-k8s-goldmane--54d579b49d--xmwf5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003acc10), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-xmwf5", "timestamp":"2025-09-09 04:51:59.089299887 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.089 [INFO][4849] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.089 [INFO][4849] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.090 [INFO][4849] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.103 [INFO][4849] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" host="localhost" Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.111 [INFO][4849] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.116 [INFO][4849] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.120 [INFO][4849] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.123 [INFO][4849] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.123 [INFO][4849] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" host="localhost" Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.125 [INFO][4849] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7 Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.129 [INFO][4849] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" host="localhost" Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.138 [INFO][4849] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" host="localhost" Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.139 [INFO][4849] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" host="localhost" Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.139 [INFO][4849] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:51:59.162834 containerd[1537]: 2025-09-09 04:51:59.139 [INFO][4849] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" HandleID="k8s-pod-network.6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" Workload="localhost-k8s-goldmane--54d579b49d--xmwf5-eth0" Sep 9 04:51:59.164088 containerd[1537]: 2025-09-09 04:51:59.142 [INFO][4830] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" Namespace="calico-system" Pod="goldmane-54d579b49d-xmwf5" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xmwf5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--xmwf5-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"46872eaa-5736-4fd1-abbb-412aa5718fa0", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-xmwf5", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie05d008b5f2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:59.164088 containerd[1537]: 2025-09-09 04:51:59.142 [INFO][4830] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" Namespace="calico-system" Pod="goldmane-54d579b49d-xmwf5" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xmwf5-eth0" Sep 9 04:51:59.164088 containerd[1537]: 2025-09-09 04:51:59.142 [INFO][4830] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie05d008b5f2 ContainerID="6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" Namespace="calico-system" Pod="goldmane-54d579b49d-xmwf5" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xmwf5-eth0" Sep 9 04:51:59.164088 containerd[1537]: 2025-09-09 04:51:59.145 [INFO][4830] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" Namespace="calico-system" Pod="goldmane-54d579b49d-xmwf5" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xmwf5-eth0" Sep 9 04:51:59.164088 containerd[1537]: 2025-09-09 04:51:59.145 [INFO][4830] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" Namespace="calico-system" Pod="goldmane-54d579b49d-xmwf5" 
WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xmwf5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--xmwf5-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"46872eaa-5736-4fd1-abbb-412aa5718fa0", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7", Pod:"goldmane-54d579b49d-xmwf5", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie05d008b5f2", MAC:"4e:6e:2f:ad:04:ac", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:59.164088 containerd[1537]: 2025-09-09 04:51:59.160 [INFO][4830] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" Namespace="calico-system" Pod="goldmane-54d579b49d-xmwf5" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xmwf5-eth0" Sep 9 04:51:59.200817 containerd[1537]: time="2025-09-09T04:51:59.200703997Z" level=info msg="connecting to shim 
6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7" address="unix:///run/containerd/s/ffa1117146814b8692d25c0cf9999ba9a30db46b9838f82a6a3a24d39f6f88c5" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:59.240536 systemd[1]: Started cri-containerd-6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7.scope - libcontainer container 6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7. Sep 9 04:51:59.276744 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:59.331640 containerd[1537]: time="2025-09-09T04:51:59.331536586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xmwf5,Uid:46872eaa-5736-4fd1-abbb-412aa5718fa0,Namespace:calico-system,Attempt:0,} returns sandbox id \"6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7\"" Sep 9 04:51:59.698367 systemd-networkd[1443]: cali9c544412e05: Gained IPv6LL Sep 9 04:51:59.753619 systemd-networkd[1443]: vxlan.calico: Link UP Sep 9 04:51:59.753629 systemd-networkd[1443]: vxlan.calico: Gained carrier Sep 9 04:51:59.828157 containerd[1537]: time="2025-09-09T04:51:59.828102936Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nsjpv,Uid:f1c0f5dc-c087-4532-a9e5-5f9167f6f542,Namespace:kube-system,Attempt:0,}" Sep 9 04:51:59.838434 containerd[1537]: time="2025-09-09T04:51:59.838329522Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:59.839860 containerd[1537]: time="2025-09-09T04:51:59.839811358Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 9 04:51:59.840408 containerd[1537]: time="2025-09-09T04:51:59.840376592Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:59.847193 containerd[1537]: time="2025-09-09T04:51:59.847118360Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:59.848147 containerd[1537]: time="2025-09-09T04:51:59.847960751Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 2.075800786s" Sep 9 04:51:59.848147 containerd[1537]: time="2025-09-09T04:51:59.847995195Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 9 04:51:59.851956 containerd[1537]: time="2025-09-09T04:51:59.851922913Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 04:51:59.867911 containerd[1537]: time="2025-09-09T04:51:59.867869612Z" level=info msg="CreateContainer within sandbox \"b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 04:51:59.877209 containerd[1537]: time="2025-09-09T04:51:59.877160476Z" level=info msg="Container bd1ef6bbea74b51031bca66a6a59fa5e0b8c23a9c6a385f6399a797e5749e84b: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:59.906334 containerd[1537]: time="2025-09-09T04:51:59.906281151Z" level=info msg="CreateContainer within sandbox \"b6f587d1c6b1cbf5a2508411f829b4fae4c299fd6148569f290c0f617d186a90\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id 
\"bd1ef6bbea74b51031bca66a6a59fa5e0b8c23a9c6a385f6399a797e5749e84b\"" Sep 9 04:51:59.907402 containerd[1537]: time="2025-09-09T04:51:59.907355972Z" level=info msg="StartContainer for \"bd1ef6bbea74b51031bca66a6a59fa5e0b8c23a9c6a385f6399a797e5749e84b\"" Sep 9 04:51:59.908460 containerd[1537]: time="2025-09-09T04:51:59.908417072Z" level=info msg="connecting to shim bd1ef6bbea74b51031bca66a6a59fa5e0b8c23a9c6a385f6399a797e5749e84b" address="unix:///run/containerd/s/d4a8017705fd1e0ee6d8acbdaf20bd7f5be9498fea644bcfed763dd213bc72d8" protocol=ttrpc version=3 Sep 9 04:51:59.941315 systemd[1]: Started cri-containerd-bd1ef6bbea74b51031bca66a6a59fa5e0b8c23a9c6a385f6399a797e5749e84b.scope - libcontainer container bd1ef6bbea74b51031bca66a6a59fa5e0b8c23a9c6a385f6399a797e5749e84b. Sep 9 04:52:00.023243 containerd[1537]: time="2025-09-09T04:52:00.022808748Z" level=info msg="StartContainer for \"bd1ef6bbea74b51031bca66a6a59fa5e0b8c23a9c6a385f6399a797e5749e84b\" returns successfully" Sep 9 04:52:00.024125 systemd-networkd[1443]: cali34b15c3d50c: Link UP Sep 9 04:52:00.024955 systemd-networkd[1443]: cali34b15c3d50c: Gained carrier Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.902 [INFO][4977] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0 coredns-668d6bf9bc- kube-system f1c0f5dc-c087-4532-a9e5-5f9167f6f542 785 0 2025-09-09 04:51:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-nsjpv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali34b15c3d50c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" Namespace="kube-system" Pod="coredns-668d6bf9bc-nsjpv" 
WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nsjpv-" Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.902 [INFO][4977] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" Namespace="kube-system" Pod="coredns-668d6bf9bc-nsjpv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0" Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.961 [INFO][5009] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" HandleID="k8s-pod-network.ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" Workload="localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0" Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.961 [INFO][5009] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" HandleID="k8s-pod-network.ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" Workload="localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000136320), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-nsjpv", "timestamp":"2025-09-09 04:51:59.961588154 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.962 [INFO][5009] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.962 [INFO][5009] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.963 [INFO][5009] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.981 [INFO][5009] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" host="localhost" Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.985 [INFO][5009] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.989 [INFO][5009] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.992 [INFO][5009] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.994 [INFO][5009] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.994 [INFO][5009] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" host="localhost" Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:51:59.998 [INFO][5009] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:52:00.004 [INFO][5009] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" host="localhost" Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:52:00.014 [INFO][5009] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" host="localhost" Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:52:00.015 [INFO][5009] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" host="localhost" Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:52:00.015 [INFO][5009] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:52:00.043289 containerd[1537]: 2025-09-09 04:52:00.015 [INFO][5009] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" HandleID="k8s-pod-network.ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" Workload="localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0" Sep 9 04:52:00.043789 containerd[1537]: 2025-09-09 04:52:00.021 [INFO][4977] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" Namespace="kube-system" Pod="coredns-668d6bf9bc-nsjpv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f1c0f5dc-c087-4532-a9e5-5f9167f6f542", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-nsjpv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali34b15c3d50c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:52:00.043789 containerd[1537]: 2025-09-09 04:52:00.021 [INFO][4977] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" Namespace="kube-system" Pod="coredns-668d6bf9bc-nsjpv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0" Sep 9 04:52:00.043789 containerd[1537]: 2025-09-09 04:52:00.021 [INFO][4977] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali34b15c3d50c ContainerID="ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" Namespace="kube-system" Pod="coredns-668d6bf9bc-nsjpv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0" Sep 9 04:52:00.043789 containerd[1537]: 2025-09-09 04:52:00.023 [INFO][4977] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" Namespace="kube-system" Pod="coredns-668d6bf9bc-nsjpv" 
WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0" Sep 9 04:52:00.043789 containerd[1537]: 2025-09-09 04:52:00.025 [INFO][4977] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" Namespace="kube-system" Pod="coredns-668d6bf9bc-nsjpv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f1c0f5dc-c087-4532-a9e5-5f9167f6f542", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 51, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a", Pod:"coredns-668d6bf9bc-nsjpv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali34b15c3d50c", MAC:"72:ff:03:24:70:aa", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:52:00.043789 containerd[1537]: 2025-09-09 04:52:00.040 [INFO][4977] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" Namespace="kube-system" Pod="coredns-668d6bf9bc-nsjpv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--nsjpv-eth0" Sep 9 04:52:00.072122 containerd[1537]: time="2025-09-09T04:52:00.072073803Z" level=info msg="connecting to shim ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a" address="unix:///run/containerd/s/e4b4c8a5f510413ddabb38cd27bd83d9a5c3e7e20bdaffe8be5d2bf42035ee5d" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:52:00.095352 systemd[1]: Started cri-containerd-ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a.scope - libcontainer container ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a. 
Sep 9 04:52:00.104289 containerd[1537]: time="2025-09-09T04:52:00.104210655Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:52:00.104389 containerd[1537]: time="2025-09-09T04:52:00.104258101Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 04:52:00.107731 containerd[1537]: time="2025-09-09T04:52:00.107583729Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 255.622612ms" Sep 9 04:52:00.107731 containerd[1537]: time="2025-09-09T04:52:00.107616293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 9 04:52:00.110556 containerd[1537]: time="2025-09-09T04:52:00.110463979Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 04:52:00.112966 containerd[1537]: time="2025-09-09T04:52:00.112926416Z" level=info msg="CreateContainer within sandbox \"a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 04:52:00.115595 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:52:00.122320 containerd[1537]: time="2025-09-09T04:52:00.122283699Z" level=info msg="Container d7c4db18c2e55f4720488584f6954b1fc213a988638e39e01b34839169c64869: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:52:00.138918 containerd[1537]: time="2025-09-09T04:52:00.138855830Z" level=info msg="CreateContainer within 
sandbox \"a44b3fb2253e6623103ed656516494357c352487bc2041bd93cbb19ed9bf6ce8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d7c4db18c2e55f4720488584f6954b1fc213a988638e39e01b34839169c64869\"" Sep 9 04:52:00.141032 containerd[1537]: time="2025-09-09T04:52:00.140987984Z" level=info msg="StartContainer for \"d7c4db18c2e55f4720488584f6954b1fc213a988638e39e01b34839169c64869\"" Sep 9 04:52:00.142839 containerd[1537]: time="2025-09-09T04:52:00.142791816Z" level=info msg="connecting to shim d7c4db18c2e55f4720488584f6954b1fc213a988638e39e01b34839169c64869" address="unix:///run/containerd/s/78b6d93d8d1745b58742084389cfa8c80bdb5bc13d4ed280bd69604f26bd02be" protocol=ttrpc version=3 Sep 9 04:52:00.167446 containerd[1537]: time="2025-09-09T04:52:00.167390419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nsjpv,Uid:f1c0f5dc-c087-4532-a9e5-5f9167f6f542,Namespace:kube-system,Attempt:0,} returns sandbox id \"ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a\"" Sep 9 04:52:00.176362 containerd[1537]: time="2025-09-09T04:52:00.176326568Z" level=info msg="CreateContainer within sandbox \"ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 04:52:00.180335 systemd[1]: Started cri-containerd-d7c4db18c2e55f4720488584f6954b1fc213a988638e39e01b34839169c64869.scope - libcontainer container d7c4db18c2e55f4720488584f6954b1fc213a988638e39e01b34839169c64869. 
Sep 9 04:52:00.187684 containerd[1537]: time="2025-09-09T04:52:00.187649344Z" level=info msg="Container 1930c59bdbaa0d0af63a5c0409a55d1a88a07f99b8111eb84dfe39cc20e6d7fd: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:52:00.194063 containerd[1537]: time="2025-09-09T04:52:00.194013882Z" level=info msg="CreateContainer within sandbox \"ecfba1b3f84f7f1c81513277592973165219a2c1fd4b7b3b33c2243bfb246a5a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1930c59bdbaa0d0af63a5c0409a55d1a88a07f99b8111eb84dfe39cc20e6d7fd\"" Sep 9 04:52:00.194606 containerd[1537]: time="2025-09-09T04:52:00.194580395Z" level=info msg="StartContainer for \"1930c59bdbaa0d0af63a5c0409a55d1a88a07f99b8111eb84dfe39cc20e6d7fd\"" Sep 9 04:52:00.195455 containerd[1537]: time="2025-09-09T04:52:00.195424624Z" level=info msg="connecting to shim 1930c59bdbaa0d0af63a5c0409a55d1a88a07f99b8111eb84dfe39cc20e6d7fd" address="unix:///run/containerd/s/e4b4c8a5f510413ddabb38cd27bd83d9a5c3e7e20bdaffe8be5d2bf42035ee5d" protocol=ttrpc version=3 Sep 9 04:52:00.217320 systemd[1]: Started cri-containerd-1930c59bdbaa0d0af63a5c0409a55d1a88a07f99b8111eb84dfe39cc20e6d7fd.scope - libcontainer container 1930c59bdbaa0d0af63a5c0409a55d1a88a07f99b8111eb84dfe39cc20e6d7fd. 
Sep 9 04:52:00.228479 containerd[1537]: time="2025-09-09T04:52:00.228442109Z" level=info msg="StartContainer for \"d7c4db18c2e55f4720488584f6954b1fc213a988638e39e01b34839169c64869\" returns successfully" Sep 9 04:52:00.263791 containerd[1537]: time="2025-09-09T04:52:00.263518659Z" level=info msg="StartContainer for \"1930c59bdbaa0d0af63a5c0409a55d1a88a07f99b8111eb84dfe39cc20e6d7fd\" returns successfully" Sep 9 04:52:00.853043 systemd-networkd[1443]: calie05d008b5f2: Gained IPv6LL Sep 9 04:52:01.000182 containerd[1537]: time="2025-09-09T04:52:00.999913026Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:52:01.000400 containerd[1537]: time="2025-09-09T04:52:01.000365884Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 9 04:52:01.001503 containerd[1537]: time="2025-09-09T04:52:01.001462344Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:52:01.003576 containerd[1537]: time="2025-09-09T04:52:01.003534925Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:52:01.004392 containerd[1537]: time="2025-09-09T04:52:01.004371510Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 893.868325ms" Sep 9 04:52:01.004484 containerd[1537]: time="2025-09-09T04:52:01.004467282Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 9 04:52:01.006613 containerd[1537]: time="2025-09-09T04:52:01.006570146Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 04:52:01.008506 containerd[1537]: time="2025-09-09T04:52:01.008469985Z" level=info msg="CreateContainer within sandbox \"1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 04:52:01.042162 containerd[1537]: time="2025-09-09T04:52:01.041568664Z" level=info msg="Container f0836b597e0a31d9628f6924e9d7bcc0f8a4855fb392b55c245fd21e6a58c2f2: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:52:01.053358 kubelet[2675]: I0909 04:52:01.053039 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-nsjpv" podStartSLOduration=38.053020984 podStartE2EDuration="38.053020984s" podCreationTimestamp="2025-09-09 04:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:52:01.052460553 +0000 UTC m=+43.318622168" watchObservedRunningTime="2025-09-09 04:52:01.053020984 +0000 UTC m=+43.319182559" Sep 9 04:52:01.060451 containerd[1537]: time="2025-09-09T04:52:01.060392910Z" level=info msg="CreateContainer within sandbox \"1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"f0836b597e0a31d9628f6924e9d7bcc0f8a4855fb392b55c245fd21e6a58c2f2\"" Sep 9 04:52:01.061353 containerd[1537]: time="2025-09-09T04:52:01.061294823Z" level=info msg="StartContainer for \"f0836b597e0a31d9628f6924e9d7bcc0f8a4855fb392b55c245fd21e6a58c2f2\"" Sep 9 04:52:01.062856 containerd[1537]: time="2025-09-09T04:52:01.062824736Z" level=info msg="connecting to shim 
f0836b597e0a31d9628f6924e9d7bcc0f8a4855fb392b55c245fd21e6a58c2f2" address="unix:///run/containerd/s/e34e426b39f1e2c26a179d761a71f0d1e44b039977796637a460308518a73d0d" protocol=ttrpc version=3 Sep 9 04:52:01.088025 kubelet[2675]: I0909 04:52:01.087190 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-c8977fc6c-ltdk2" podStartSLOduration=27.118855615 podStartE2EDuration="30.087170435s" podCreationTimestamp="2025-09-09 04:51:31 +0000 UTC" firstStartedPulling="2025-09-09 04:51:57.140854553 +0000 UTC m=+39.407016128" lastFinishedPulling="2025-09-09 04:52:00.109169373 +0000 UTC m=+42.375330948" observedRunningTime="2025-09-09 04:52:01.06962299 +0000 UTC m=+43.335784605" watchObservedRunningTime="2025-09-09 04:52:01.087170435 +0000 UTC m=+43.353332010" Sep 9 04:52:01.098343 systemd[1]: Started cri-containerd-f0836b597e0a31d9628f6924e9d7bcc0f8a4855fb392b55c245fd21e6a58c2f2.scope - libcontainer container f0836b597e0a31d9628f6924e9d7bcc0f8a4855fb392b55c245fd21e6a58c2f2. 
Sep 9 04:52:01.108538 kubelet[2675]: I0909 04:52:01.108356 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-75b67558cc-6p5fq" podStartSLOduration=21.402435699 podStartE2EDuration="25.108339976s" podCreationTimestamp="2025-09-09 04:51:36 +0000 UTC" firstStartedPulling="2025-09-09 04:51:56.145203408 +0000 UTC m=+38.411364983" lastFinishedPulling="2025-09-09 04:51:59.851107685 +0000 UTC m=+42.117269260" observedRunningTime="2025-09-09 04:52:01.107606644 +0000 UTC m=+43.373768219" watchObservedRunningTime="2025-09-09 04:52:01.108339976 +0000 UTC m=+43.374501511" Sep 9 04:52:01.160634 containerd[1537]: time="2025-09-09T04:52:01.160576580Z" level=info msg="StartContainer for \"f0836b597e0a31d9628f6924e9d7bcc0f8a4855fb392b55c245fd21e6a58c2f2\" returns successfully" Sep 9 04:52:01.298405 systemd-networkd[1443]: vxlan.calico: Gained IPv6LL Sep 9 04:52:01.874428 systemd-networkd[1443]: cali34b15c3d50c: Gained IPv6LL Sep 9 04:52:02.047524 kubelet[2675]: I0909 04:52:02.047313 2675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:52:02.047524 kubelet[2675]: I0909 04:52:02.047333 2675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:52:02.452213 systemd[1]: Started sshd@8-10.0.0.32:22-10.0.0.1:46830.service - OpenSSH per-connection server daemon (10.0.0.1:46830). Sep 9 04:52:02.514740 sshd[5293]: Accepted publickey for core from 10.0.0.1 port 46830 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 04:52:02.516707 sshd-session[5293]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:52:02.521876 systemd-logind[1514]: New session 9 of user core. Sep 9 04:52:02.531330 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 9 04:52:02.738730 sshd[5296]: Connection closed by 10.0.0.1 port 46830 Sep 9 04:52:02.739431 sshd-session[5293]: pam_unix(sshd:session): session closed for user core Sep 9 04:52:02.745041 systemd[1]: sshd@8-10.0.0.32:22-10.0.0.1:46830.service: Deactivated successfully. Sep 9 04:52:02.748592 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 04:52:02.752034 systemd-logind[1514]: Session 9 logged out. Waiting for processes to exit. Sep 9 04:52:02.753917 systemd-logind[1514]: Removed session 9. Sep 9 04:52:03.528495 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2656052350.mount: Deactivated successfully. Sep 9 04:52:04.017130 containerd[1537]: time="2025-09-09T04:52:04.017055064Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:52:04.017923 containerd[1537]: time="2025-09-09T04:52:04.017887962Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 9 04:52:04.019158 containerd[1537]: time="2025-09-09T04:52:04.018896961Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:52:04.021842 containerd[1537]: time="2025-09-09T04:52:04.021763819Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:52:04.024258 containerd[1537]: time="2025-09-09T04:52:04.024213908Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.017597196s" Sep 9 04:52:04.024258 containerd[1537]: time="2025-09-09T04:52:04.024247632Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 9 04:52:04.026880 containerd[1537]: time="2025-09-09T04:52:04.026200263Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 04:52:04.027020 containerd[1537]: time="2025-09-09T04:52:04.026984475Z" level=info msg="CreateContainer within sandbox \"6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 04:52:04.040839 containerd[1537]: time="2025-09-09T04:52:04.039986169Z" level=info msg="Container a386e5ccc5cf7a39f993584b5c3500eee20771f923e1482d3500cdcd06c5f968: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:52:04.050969 containerd[1537]: time="2025-09-09T04:52:04.050911178Z" level=info msg="CreateContainer within sandbox \"6447518bd2655dea0d0a03ead66086e64dd8fab9ea796948c2c24488aff38ba7\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"a386e5ccc5cf7a39f993584b5c3500eee20771f923e1482d3500cdcd06c5f968\"" Sep 9 04:52:04.051631 containerd[1537]: time="2025-09-09T04:52:04.051590018Z" level=info msg="StartContainer for \"a386e5ccc5cf7a39f993584b5c3500eee20771f923e1482d3500cdcd06c5f968\"" Sep 9 04:52:04.053605 containerd[1537]: time="2025-09-09T04:52:04.053558851Z" level=info msg="connecting to shim a386e5ccc5cf7a39f993584b5c3500eee20771f923e1482d3500cdcd06c5f968" address="unix:///run/containerd/s/ffa1117146814b8692d25c0cf9999ba9a30db46b9838f82a6a3a24d39f6f88c5" protocol=ttrpc version=3 Sep 9 04:52:04.076328 systemd[1]: Started cri-containerd-a386e5ccc5cf7a39f993584b5c3500eee20771f923e1482d3500cdcd06c5f968.scope - libcontainer container 
a386e5ccc5cf7a39f993584b5c3500eee20771f923e1482d3500cdcd06c5f968. Sep 9 04:52:04.116298 containerd[1537]: time="2025-09-09T04:52:04.116178959Z" level=info msg="StartContainer for \"a386e5ccc5cf7a39f993584b5c3500eee20771f923e1482d3500cdcd06c5f968\" returns successfully" Sep 9 04:52:04.956340 containerd[1537]: time="2025-09-09T04:52:04.956294646Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:52:04.957131 containerd[1537]: time="2025-09-09T04:52:04.956953884Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 9 04:52:04.958316 containerd[1537]: time="2025-09-09T04:52:04.958287042Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:52:04.960415 containerd[1537]: time="2025-09-09T04:52:04.960382129Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:52:04.962074 containerd[1537]: time="2025-09-09T04:52:04.962045925Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 935.792416ms" Sep 9 04:52:04.962127 containerd[1537]: time="2025-09-09T04:52:04.962081809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference 
\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 9 04:52:04.965331 containerd[1537]: time="2025-09-09T04:52:04.965300989Z" level=info msg="CreateContainer within sandbox \"1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 04:52:04.973183 containerd[1537]: time="2025-09-09T04:52:04.971118716Z" level=info msg="Container fbbbcabeb08546f8823066f964437cb36e189b244bebe68f3103104cb16f3cc5: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:52:04.981213 containerd[1537]: time="2025-09-09T04:52:04.981175382Z" level=info msg="CreateContainer within sandbox \"1af69363bbbed7ab68d50ec958cbd6d914dc6d743f967c254e7c57be25126a1b\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"fbbbcabeb08546f8823066f964437cb36e189b244bebe68f3103104cb16f3cc5\"" Sep 9 04:52:04.981795 containerd[1537]: time="2025-09-09T04:52:04.981729568Z" level=info msg="StartContainer for \"fbbbcabeb08546f8823066f964437cb36e189b244bebe68f3103104cb16f3cc5\"" Sep 9 04:52:04.983351 containerd[1537]: time="2025-09-09T04:52:04.983320035Z" level=info msg="connecting to shim fbbbcabeb08546f8823066f964437cb36e189b244bebe68f3103104cb16f3cc5" address="unix:///run/containerd/s/e34e426b39f1e2c26a179d761a71f0d1e44b039977796637a460308518a73d0d" protocol=ttrpc version=3 Sep 9 04:52:05.012341 systemd[1]: Started cri-containerd-fbbbcabeb08546f8823066f964437cb36e189b244bebe68f3103104cb16f3cc5.scope - libcontainer container fbbbcabeb08546f8823066f964437cb36e189b244bebe68f3103104cb16f3cc5. 
Sep 9 04:52:05.100825 containerd[1537]: time="2025-09-09T04:52:05.100774790Z" level=info msg="StartContainer for \"fbbbcabeb08546f8823066f964437cb36e189b244bebe68f3103104cb16f3cc5\" returns successfully" Sep 9 04:52:05.124537 kubelet[2675]: I0909 04:52:05.124454 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-xmwf5" podStartSLOduration=25.434588554 podStartE2EDuration="30.124437169s" podCreationTimestamp="2025-09-09 04:51:35 +0000 UTC" firstStartedPulling="2025-09-09 04:51:59.335470624 +0000 UTC m=+41.601632199" lastFinishedPulling="2025-09-09 04:52:04.025319239 +0000 UTC m=+46.291480814" observedRunningTime="2025-09-09 04:52:05.124394284 +0000 UTC m=+47.390555899" watchObservedRunningTime="2025-09-09 04:52:05.124437169 +0000 UTC m=+47.390598784" Sep 9 04:52:05.206709 containerd[1537]: time="2025-09-09T04:52:05.206596678Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a386e5ccc5cf7a39f993584b5c3500eee20771f923e1482d3500cdcd06c5f968\" id:\"71b36787ea325d99224a2d3fc77e258ed6e5ca0b618f7ca4005c4d603efd69f8\" pid:5410 exit_status:1 exited_at:{seconds:1757393525 nanos:205900438}" Sep 9 04:52:05.929156 kubelet[2675]: I0909 04:52:05.929063 2675 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 04:52:05.936873 kubelet[2675]: I0909 04:52:05.936846 2675 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 04:52:06.121037 kubelet[2675]: I0909 04:52:06.120982 2675 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-pz4n4" podStartSLOduration=23.550416851 podStartE2EDuration="30.12096358s" podCreationTimestamp="2025-09-09 04:51:36 +0000 UTC" firstStartedPulling="2025-09-09 04:51:58.392426625 +0000 UTC m=+40.658588200" 
lastFinishedPulling="2025-09-09 04:52:04.962973394 +0000 UTC m=+47.229134929" observedRunningTime="2025-09-09 04:52:06.120575616 +0000 UTC m=+48.386737231" watchObservedRunningTime="2025-09-09 04:52:06.12096358 +0000 UTC m=+48.387125155" Sep 9 04:52:06.189157 containerd[1537]: time="2025-09-09T04:52:06.188974349Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a386e5ccc5cf7a39f993584b5c3500eee20771f923e1482d3500cdcd06c5f968\" id:\"0276eece927f1770cb13f394c3ec82a8b9f77eea308ffb4fd44421dffe240e90\" pid:5440 exit_status:1 exited_at:{seconds:1757393526 nanos:188641431}" Sep 9 04:52:07.756520 systemd[1]: Started sshd@9-10.0.0.32:22-10.0.0.1:46832.service - OpenSSH per-connection server daemon (10.0.0.1:46832). Sep 9 04:52:07.815929 sshd[5453]: Accepted publickey for core from 10.0.0.1 port 46832 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs Sep 9 04:52:07.817356 sshd-session[5453]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:52:07.821222 systemd-logind[1514]: New session 10 of user core. Sep 9 04:52:07.828344 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 9 04:52:08.037501 sshd[5456]: Connection closed by 10.0.0.1 port 46832 Sep 9 04:52:08.038049 sshd-session[5453]: pam_unix(sshd:session): session closed for user core Sep 9 04:52:08.047481 systemd[1]: sshd@9-10.0.0.32:22-10.0.0.1:46832.service: Deactivated successfully. Sep 9 04:52:08.050641 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 04:52:08.051473 systemd-logind[1514]: Session 10 logged out. Waiting for processes to exit. Sep 9 04:52:08.053504 systemd[1]: Started sshd@10-10.0.0.32:22-10.0.0.1:46834.service - OpenSSH per-connection server daemon (10.0.0.1:46834). Sep 9 04:52:08.055730 systemd-logind[1514]: Removed session 10. 
Sep 9 04:52:08.115613 sshd[5473]: Accepted publickey for core from 10.0.0.1 port 46834 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:52:08.117122 sshd-session[5473]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:08.123266 systemd-logind[1514]: New session 11 of user core.
Sep 9 04:52:08.132517 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 9 04:52:08.335385 sshd[5476]: Connection closed by 10.0.0.1 port 46834
Sep 9 04:52:08.336536 sshd-session[5473]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:08.349920 systemd[1]: sshd@10-10.0.0.32:22-10.0.0.1:46834.service: Deactivated successfully.
Sep 9 04:52:08.353374 systemd[1]: session-11.scope: Deactivated successfully.
Sep 9 04:52:08.354752 systemd-logind[1514]: Session 11 logged out. Waiting for processes to exit.
Sep 9 04:52:08.358774 systemd[1]: Started sshd@11-10.0.0.32:22-10.0.0.1:46844.service - OpenSSH per-connection server daemon (10.0.0.1:46844).
Sep 9 04:52:08.361797 systemd-logind[1514]: Removed session 11.
Sep 9 04:52:08.410822 sshd[5492]: Accepted publickey for core from 10.0.0.1 port 46844 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:52:08.413401 sshd-session[5492]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:08.417741 systemd-logind[1514]: New session 12 of user core.
Sep 9 04:52:08.430268 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 9 04:52:08.599443 sshd[5495]: Connection closed by 10.0.0.1 port 46844
Sep 9 04:52:08.599951 sshd-session[5492]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:08.603742 systemd[1]: sshd@11-10.0.0.32:22-10.0.0.1:46844.service: Deactivated successfully.
Sep 9 04:52:08.605973 systemd[1]: session-12.scope: Deactivated successfully.
Sep 9 04:52:08.606742 systemd-logind[1514]: Session 12 logged out. Waiting for processes to exit.
Sep 9 04:52:08.607655 systemd-logind[1514]: Removed session 12.
Sep 9 04:52:09.487983 kubelet[2675]: I0909 04:52:09.487735 2675 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 04:52:09.529882 containerd[1537]: time="2025-09-09T04:52:09.529840377Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bd1ef6bbea74b51031bca66a6a59fa5e0b8c23a9c6a385f6399a797e5749e84b\" id:\"3ed89240a208643e7502408016b94e3d52836075665eeb12893d61aefa7924f3\" pid:5519 exited_at:{seconds:1757393529 nanos:528965922}"
Sep 9 04:52:09.578673 containerd[1537]: time="2025-09-09T04:52:09.578633890Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bd1ef6bbea74b51031bca66a6a59fa5e0b8c23a9c6a385f6399a797e5749e84b\" id:\"1945b325daf16bb5504cc2bf78b5b27ceb2f4802f21c39beb7a3ccdb800763ab\" pid:5542 exited_at:{seconds:1757393529 nanos:577894610}"
Sep 9 04:52:13.615543 systemd[1]: Started sshd@12-10.0.0.32:22-10.0.0.1:33502.service - OpenSSH per-connection server daemon (10.0.0.1:33502).
Sep 9 04:52:13.674774 sshd[5561]: Accepted publickey for core from 10.0.0.1 port 33502 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:52:13.676273 sshd-session[5561]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:13.680028 systemd-logind[1514]: New session 13 of user core.
Sep 9 04:52:13.686308 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 9 04:52:13.847593 sshd[5564]: Connection closed by 10.0.0.1 port 33502
Sep 9 04:52:13.848020 sshd-session[5561]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:13.862332 systemd[1]: sshd@12-10.0.0.32:22-10.0.0.1:33502.service: Deactivated successfully.
Sep 9 04:52:13.864024 systemd[1]: session-13.scope: Deactivated successfully.
Sep 9 04:52:13.864825 systemd-logind[1514]: Session 13 logged out. Waiting for processes to exit.
Sep 9 04:52:13.867915 systemd[1]: Started sshd@13-10.0.0.32:22-10.0.0.1:33506.service - OpenSSH per-connection server daemon (10.0.0.1:33506).
Sep 9 04:52:13.869212 systemd-logind[1514]: Removed session 13.
Sep 9 04:52:13.930888 sshd[5577]: Accepted publickey for core from 10.0.0.1 port 33506 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:52:13.932210 sshd-session[5577]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:13.936595 systemd-logind[1514]: New session 14 of user core.
Sep 9 04:52:13.945283 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 9 04:52:14.152260 sshd[5580]: Connection closed by 10.0.0.1 port 33506
Sep 9 04:52:14.151824 sshd-session[5577]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:14.167779 systemd[1]: sshd@13-10.0.0.32:22-10.0.0.1:33506.service: Deactivated successfully.
Sep 9 04:52:14.169547 systemd[1]: session-14.scope: Deactivated successfully.
Sep 9 04:52:14.172172 systemd-logind[1514]: Session 14 logged out. Waiting for processes to exit.
Sep 9 04:52:14.173868 systemd-logind[1514]: Removed session 14.
Sep 9 04:52:14.175632 systemd[1]: Started sshd@14-10.0.0.32:22-10.0.0.1:33508.service - OpenSSH per-connection server daemon (10.0.0.1:33508).
Sep 9 04:52:14.231008 sshd[5592]: Accepted publickey for core from 10.0.0.1 port 33508 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:52:14.232323 sshd-session[5592]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:14.236881 systemd-logind[1514]: New session 15 of user core.
Sep 9 04:52:14.251319 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 9 04:52:14.883727 sshd[5595]: Connection closed by 10.0.0.1 port 33508
Sep 9 04:52:14.884385 sshd-session[5592]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:14.895787 systemd[1]: sshd@14-10.0.0.32:22-10.0.0.1:33508.service: Deactivated successfully.
Sep 9 04:52:14.899814 systemd[1]: session-15.scope: Deactivated successfully.
Sep 9 04:52:14.900838 systemd-logind[1514]: Session 15 logged out. Waiting for processes to exit.
Sep 9 04:52:14.904777 systemd[1]: Started sshd@15-10.0.0.32:22-10.0.0.1:33520.service - OpenSSH per-connection server daemon (10.0.0.1:33520).
Sep 9 04:52:14.906095 systemd-logind[1514]: Removed session 15.
Sep 9 04:52:14.970854 sshd[5614]: Accepted publickey for core from 10.0.0.1 port 33520 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:52:14.972299 sshd-session[5614]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:14.977178 systemd-logind[1514]: New session 16 of user core.
Sep 9 04:52:14.987332 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 9 04:52:15.266896 sshd[5617]: Connection closed by 10.0.0.1 port 33520
Sep 9 04:52:15.267774 sshd-session[5614]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:15.276802 systemd[1]: sshd@15-10.0.0.32:22-10.0.0.1:33520.service: Deactivated successfully.
Sep 9 04:52:15.279415 systemd[1]: session-16.scope: Deactivated successfully.
Sep 9 04:52:15.280188 systemd-logind[1514]: Session 16 logged out. Waiting for processes to exit.
Sep 9 04:52:15.283502 systemd[1]: Started sshd@16-10.0.0.32:22-10.0.0.1:33532.service - OpenSSH per-connection server daemon (10.0.0.1:33532).
Sep 9 04:52:15.284074 systemd-logind[1514]: Removed session 16.
Sep 9 04:52:15.350641 sshd[5629]: Accepted publickey for core from 10.0.0.1 port 33532 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:52:15.352023 sshd-session[5629]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:15.357268 systemd-logind[1514]: New session 17 of user core.
Sep 9 04:52:15.362375 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 9 04:52:15.502445 sshd[5632]: Connection closed by 10.0.0.1 port 33532
Sep 9 04:52:15.503220 sshd-session[5629]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:15.506460 systemd-logind[1514]: Session 17 logged out. Waiting for processes to exit.
Sep 9 04:52:15.506613 systemd[1]: sshd@16-10.0.0.32:22-10.0.0.1:33532.service: Deactivated successfully.
Sep 9 04:52:15.508593 systemd[1]: session-17.scope: Deactivated successfully.
Sep 9 04:52:15.510835 systemd-logind[1514]: Removed session 17.
Sep 9 04:52:20.027930 containerd[1537]: time="2025-09-09T04:52:20.027849838Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da39557d4f0fdb8c571d6ae045641f3da76b4469821c49c874539c17932f697e\" id:\"2b7e89c21913be900a73e9bb2c59d7988df97228b50f091d7b1c68326a6e6fab\" pid:5660 exited_at:{seconds:1757393540 nanos:27489684}"
Sep 9 04:52:20.522664 systemd[1]: Started sshd@17-10.0.0.32:22-10.0.0.1:36938.service - OpenSSH per-connection server daemon (10.0.0.1:36938).
Sep 9 04:52:20.598183 sshd[5681]: Accepted publickey for core from 10.0.0.1 port 36938 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:52:20.599098 sshd-session[5681]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:20.602776 systemd-logind[1514]: New session 18 of user core.
Sep 9 04:52:20.612322 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 04:52:20.774669 sshd[5684]: Connection closed by 10.0.0.1 port 36938
Sep 9 04:52:20.774133 sshd-session[5681]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:20.778278 systemd[1]: sshd@17-10.0.0.32:22-10.0.0.1:36938.service: Deactivated successfully.
Sep 9 04:52:20.779915 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 04:52:20.780565 systemd-logind[1514]: Session 18 logged out. Waiting for processes to exit.
Sep 9 04:52:20.781449 systemd-logind[1514]: Removed session 18.
Sep 9 04:52:25.798877 systemd[1]: Started sshd@18-10.0.0.32:22-10.0.0.1:36950.service - OpenSSH per-connection server daemon (10.0.0.1:36950).
Sep 9 04:52:25.853804 sshd[5701]: Accepted publickey for core from 10.0.0.1 port 36950 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:52:25.854808 sshd-session[5701]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:25.860007 systemd-logind[1514]: New session 19 of user core.
Sep 9 04:52:25.870367 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 04:52:25.997051 sshd[5704]: Connection closed by 10.0.0.1 port 36950
Sep 9 04:52:25.997393 sshd-session[5701]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:26.001425 systemd[1]: sshd@18-10.0.0.32:22-10.0.0.1:36950.service: Deactivated successfully.
Sep 9 04:52:26.004225 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 04:52:26.004931 systemd-logind[1514]: Session 19 logged out. Waiting for processes to exit.
Sep 9 04:52:26.006183 systemd-logind[1514]: Removed session 19.
Sep 9 04:52:31.011345 systemd[1]: Started sshd@19-10.0.0.32:22-10.0.0.1:52964.service - OpenSSH per-connection server daemon (10.0.0.1:52964).
Sep 9 04:52:31.075033 sshd[5717]: Accepted publickey for core from 10.0.0.1 port 52964 ssh2: RSA SHA256:y2XmME+qZ8Vxpxr1aV4RZrrEEzQsCrVNEgcY8K5ZHGs
Sep 9 04:52:31.076611 sshd-session[5717]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:31.086350 systemd-logind[1514]: New session 20 of user core.
Sep 9 04:52:31.096310 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 04:52:31.263775 sshd[5720]: Connection closed by 10.0.0.1 port 52964
Sep 9 04:52:31.264360 sshd-session[5717]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:31.268061 systemd[1]: sshd@19-10.0.0.32:22-10.0.0.1:52964.service: Deactivated successfully.
Sep 9 04:52:31.269971 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 04:52:31.270721 systemd-logind[1514]: Session 20 logged out. Waiting for processes to exit.
Sep 9 04:52:31.272099 systemd-logind[1514]: Removed session 20.
Sep 9 04:52:31.840810 containerd[1537]: time="2025-09-09T04:52:31.840761971Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bd1ef6bbea74b51031bca66a6a59fa5e0b8c23a9c6a385f6399a797e5749e84b\" id:\"87672cfd143d4a04c777db18fa84be3788f3acecda6b05d0d223b491afb96a02\" pid:5745 exited_at:{seconds:1757393551 nanos:840321798}"