Sep 9 04:52:42.777666 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 9 04:52:42.777687 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 9 03:38:34 -00 2025
Sep 9 04:52:42.777697 kernel: KASLR enabled
Sep 9 04:52:42.777703 kernel: efi: EFI v2.7 by EDK II
Sep 9 04:52:42.777709 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb228018 ACPI 2.0=0xdb9b8018 RNG=0xdb9b8a18 MEMRESERVE=0xdb221f18
Sep 9 04:52:42.777715 kernel: random: crng init done
Sep 9 04:52:42.777721 kernel: Kernel is locked down from EFI Secure Boot; see man kernel_lockdown.7
Sep 9 04:52:42.777727 kernel: secureboot: Secure boot enabled
Sep 9 04:52:42.777733 kernel: ACPI: Early table checksum verification disabled
Sep 9 04:52:42.777740 kernel: ACPI: RSDP 0x00000000DB9B8018 000024 (v02 BOCHS )
Sep 9 04:52:42.777747 kernel: ACPI: XSDT 0x00000000DB9B8F18 000064 (v01 BOCHS BXPC 00000001 01000013)
Sep 9 04:52:42.777753 kernel: ACPI: FACP 0x00000000DB9B8B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:52:42.777766 kernel: ACPI: DSDT 0x00000000DB904018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:52:42.777773 kernel: ACPI: APIC 0x00000000DB9B8C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:52:42.777780 kernel: ACPI: PPTT 0x00000000DB9B8098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:52:42.777789 kernel: ACPI: GTDT 0x00000000DB9B8818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:52:42.777795 kernel: ACPI: MCFG 0x00000000DB9B8A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:52:42.777801 kernel: ACPI: SPCR 0x00000000DB9B8918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:52:42.777808 kernel: ACPI: DBG2 0x00000000DB9B8998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:52:42.777814 kernel: ACPI: IORT 0x00000000DB9B8198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:52:42.777820 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600
Sep 9 04:52:42.777826 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 9 04:52:42.777832 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 04:52:42.777838 kernel: NODE_DATA(0) allocated [mem 0xdc737a00-0xdc73efff]
Sep 9 04:52:42.777844 kernel: Zone ranges:
Sep 9 04:52:42.777852 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 04:52:42.777858 kernel: DMA32 empty
Sep 9 04:52:42.777864 kernel: Normal empty
Sep 9 04:52:42.777870 kernel: Device empty
Sep 9 04:52:42.777876 kernel: Movable zone start for each node
Sep 9 04:52:42.777882 kernel: Early memory node ranges
Sep 9 04:52:42.777888 kernel: node 0: [mem 0x0000000040000000-0x00000000dbb4ffff]
Sep 9 04:52:42.777894 kernel: node 0: [mem 0x00000000dbb50000-0x00000000dbe7ffff]
Sep 9 04:52:42.777900 kernel: node 0: [mem 0x00000000dbe80000-0x00000000dbe9ffff]
Sep 9 04:52:42.777907 kernel: node 0: [mem 0x00000000dbea0000-0x00000000dbedffff]
Sep 9 04:52:42.777913 kernel: node 0: [mem 0x00000000dbee0000-0x00000000dbf1ffff]
Sep 9 04:52:42.777919 kernel: node 0: [mem 0x00000000dbf20000-0x00000000dbf6ffff]
Sep 9 04:52:42.777927 kernel: node 0: [mem 0x00000000dbf70000-0x00000000dcbfffff]
Sep 9 04:52:42.777933 kernel: node 0: [mem 0x00000000dcc00000-0x00000000dcfdffff]
Sep 9 04:52:42.777939 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff]
Sep 9 04:52:42.777948 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 04:52:42.777954 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges
Sep 9 04:52:42.777961 kernel: cma: Reserved 16 MiB at 0x00000000d7a00000 on node -1
Sep 9 04:52:42.777967 kernel: psci: probing for conduit method from ACPI.
Sep 9 04:52:42.777975 kernel: psci: PSCIv1.1 detected in firmware.
Sep 9 04:52:42.777981 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 9 04:52:42.777988 kernel: psci: Trusted OS migration not required
Sep 9 04:52:42.777994 kernel: psci: SMC Calling Convention v1.1
Sep 9 04:52:42.778001 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 9 04:52:42.778007 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 9 04:52:42.778014 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 9 04:52:42.778020 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Sep 9 04:52:42.778027 kernel: Detected PIPT I-cache on CPU0
Sep 9 04:52:42.778034 kernel: CPU features: detected: GIC system register CPU interface
Sep 9 04:52:42.778041 kernel: CPU features: detected: Spectre-v4
Sep 9 04:52:42.778047 kernel: CPU features: detected: Spectre-BHB
Sep 9 04:52:42.778054 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 9 04:52:42.778060 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 9 04:52:42.778066 kernel: CPU features: detected: ARM erratum 1418040
Sep 9 04:52:42.778073 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 9 04:52:42.778079 kernel: alternatives: applying boot alternatives
Sep 9 04:52:42.778087 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90
Sep 9 04:52:42.778093 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 04:52:42.778100 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 9 04:52:42.778108 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 04:52:42.778114 kernel: Fallback order for Node 0: 0
Sep 9 04:52:42.778120 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072
Sep 9 04:52:42.778127 kernel: Policy zone: DMA
Sep 9 04:52:42.778133 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 04:52:42.778140 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB
Sep 9 04:52:42.778146 kernel: software IO TLB: area num 4.
Sep 9 04:52:42.778152 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB
Sep 9 04:52:42.778158 kernel: software IO TLB: mapped [mem 0x00000000db504000-0x00000000db904000] (4MB)
Sep 9 04:52:42.778165 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 9 04:52:42.778171 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 04:52:42.778178 kernel: rcu: RCU event tracing is enabled.
Sep 9 04:52:42.778186 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 9 04:52:42.778192 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 04:52:42.778199 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 04:52:42.778205 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 04:52:42.778212 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 9 04:52:42.778218 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 04:52:42.778225 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 04:52:42.778232 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 9 04:52:42.778238 kernel: GICv3: 256 SPIs implemented
Sep 9 04:52:42.778244 kernel: GICv3: 0 Extended SPIs implemented
Sep 9 04:52:42.778251 kernel: Root IRQ handler: gic_handle_irq
Sep 9 04:52:42.778258 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 9 04:52:42.778265 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Sep 9 04:52:42.778271 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 9 04:52:42.778277 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 9 04:52:42.778284 kernel: ITS@0x0000000008080000: allocated 8192 Devices @40110000 (indirect, esz 8, psz 64K, shr 1)
Sep 9 04:52:42.778290 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @40120000 (flat, esz 8, psz 64K, shr 1)
Sep 9 04:52:42.778297 kernel: GICv3: using LPI property table @0x0000000040130000
Sep 9 04:52:42.778303 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040140000
Sep 9 04:52:42.778310 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 04:52:42.778316 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:52:42.778334 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 9 04:52:42.778341 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 9 04:52:42.778349 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 9 04:52:42.778356 kernel: arm-pv: using stolen time PV
Sep 9 04:52:42.778363 kernel: Console: colour dummy device 80x25
Sep 9 04:52:42.778369 kernel: ACPI: Core revision 20240827
Sep 9 04:52:42.778376 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 9 04:52:42.778383 kernel: pid_max: default: 32768 minimum: 301
Sep 9 04:52:42.778390 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 04:52:42.778402 kernel: landlock: Up and running.
Sep 9 04:52:42.778408 kernel: SELinux: Initializing.
Sep 9 04:52:42.778415 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 04:52:42.778423 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 04:52:42.778430 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 04:52:42.778437 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 04:52:42.778444 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 04:52:42.778451 kernel: Remapping and enabling EFI services.
Sep 9 04:52:42.778458 kernel: smp: Bringing up secondary CPUs ...
Sep 9 04:52:42.778464 kernel: Detected PIPT I-cache on CPU1
Sep 9 04:52:42.778471 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 9 04:52:42.778478 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040150000
Sep 9 04:52:42.778491 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:52:42.778498 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 9 04:52:42.778506 kernel: Detected PIPT I-cache on CPU2
Sep 9 04:52:42.778514 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Sep 9 04:52:42.778521 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040160000
Sep 9 04:52:42.778528 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:52:42.778535 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Sep 9 04:52:42.778543 kernel: Detected PIPT I-cache on CPU3
Sep 9 04:52:42.778552 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Sep 9 04:52:42.778559 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040170000
Sep 9 04:52:42.778569 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:52:42.778576 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Sep 9 04:52:42.778583 kernel: smp: Brought up 1 node, 4 CPUs
Sep 9 04:52:42.778590 kernel: SMP: Total of 4 processors activated.
Sep 9 04:52:42.778598 kernel: CPU: All CPU(s) started at EL1
Sep 9 04:52:42.778605 kernel: CPU features: detected: 32-bit EL0 Support
Sep 9 04:52:42.778613 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 9 04:52:42.778621 kernel: CPU features: detected: Common not Private translations
Sep 9 04:52:42.778629 kernel: CPU features: detected: CRC32 instructions
Sep 9 04:52:42.778636 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 9 04:52:42.778643 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 9 04:52:42.778650 kernel: CPU features: detected: LSE atomic instructions
Sep 9 04:52:42.778657 kernel: CPU features: detected: Privileged Access Never
Sep 9 04:52:42.778664 kernel: CPU features: detected: RAS Extension Support
Sep 9 04:52:42.778671 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 9 04:52:42.778678 kernel: alternatives: applying system-wide alternatives
Sep 9 04:52:42.778687 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Sep 9 04:52:42.778695 kernel: Memory: 2422372K/2572288K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38976K init, 1038K bss, 127580K reserved, 16384K cma-reserved)
Sep 9 04:52:42.778702 kernel: devtmpfs: initialized
Sep 9 04:52:42.778710 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 04:52:42.778717 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 9 04:52:42.778725 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 9 04:52:42.778732 kernel: 0 pages in range for non-PLT usage
Sep 9 04:52:42.778739 kernel: 508560 pages in range for PLT usage
Sep 9 04:52:42.778747 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 04:52:42.778755 kernel: SMBIOS 3.0.0 present.
Sep 9 04:52:42.778768 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022
Sep 9 04:52:42.778776 kernel: DMI: Memory slots populated: 1/1
Sep 9 04:52:42.778782 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 04:52:42.778789 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 9 04:52:42.778797 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 9 04:52:42.778804 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 9 04:52:42.778813 kernel: audit: initializing netlink subsys (disabled)
Sep 9 04:52:42.778820 kernel: audit: type=2000 audit(0.024:1): state=initialized audit_enabled=0 res=1
Sep 9 04:52:42.778829 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 04:52:42.778836 kernel: cpuidle: using governor menu
Sep 9 04:52:42.778844 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 9 04:52:42.778851 kernel: ASID allocator initialised with 32768 entries
Sep 9 04:52:42.778858 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 04:52:42.778865 kernel: Serial: AMBA PL011 UART driver
Sep 9 04:52:42.778872 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 04:52:42.778879 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 04:52:42.778886 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 9 04:52:42.778894 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 9 04:52:42.778902 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 04:52:42.778909 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 04:52:42.778916 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 9 04:52:42.778925 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 9 04:52:42.778949 kernel: ACPI: Added _OSI(Module Device)
Sep 9 04:52:42.778956 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 04:52:42.778964 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 04:52:42.778971 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 04:52:42.778979 kernel: ACPI: Interpreter enabled
Sep 9 04:52:42.778986 kernel: ACPI: Using GIC for interrupt routing
Sep 9 04:52:42.778993 kernel: ACPI: MCFG table detected, 1 entries
Sep 9 04:52:42.779000 kernel: ACPI: CPU0 has been hot-added
Sep 9 04:52:42.779007 kernel: ACPI: CPU1 has been hot-added
Sep 9 04:52:42.779013 kernel: ACPI: CPU2 has been hot-added
Sep 9 04:52:42.779020 kernel: ACPI: CPU3 has been hot-added
Sep 9 04:52:42.779027 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 9 04:52:42.779034 kernel: printk: legacy console [ttyAMA0] enabled
Sep 9 04:52:42.779043 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 9 04:52:42.779191 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 04:52:42.779269 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 9 04:52:42.779393 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 9 04:52:42.779454 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 9 04:52:42.779511 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 9 04:52:42.779520 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 9 04:52:42.779531 kernel: PCI host bridge to bus 0000:00
Sep 9 04:52:42.779596 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 9 04:52:42.779648 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 9 04:52:42.779701 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 9 04:52:42.779752 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 9 04:52:42.779845 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Sep 9 04:52:42.779921 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 9 04:52:42.780004 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f]
Sep 9 04:52:42.780071 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]
Sep 9 04:52:42.780137 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 9 04:52:42.780201 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Sep 9 04:52:42.780265 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned
Sep 9 04:52:42.780335 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned
Sep 9 04:52:42.780388 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 9 04:52:42.780443 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 9 04:52:42.780495 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 9 04:52:42.780504 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 9 04:52:42.780511 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 9 04:52:42.780518 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 9 04:52:42.780525 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 9 04:52:42.780532 kernel: iommu: Default domain type: Translated
Sep 9 04:52:42.780539 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 9 04:52:42.780548 kernel: efivars: Registered efivars operations
Sep 9 04:52:42.780555 kernel: vgaarb: loaded
Sep 9 04:52:42.780562 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 9 04:52:42.780569 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 04:52:42.780576 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 04:52:42.780583 kernel: pnp: PnP ACPI init
Sep 9 04:52:42.780653 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Sep 9 04:52:42.780663 kernel: pnp: PnP ACPI: found 1 devices
Sep 9 04:52:42.780672 kernel: NET: Registered PF_INET protocol family
Sep 9 04:52:42.780679 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 9 04:52:42.780687 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 9 04:52:42.780694 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 04:52:42.780701 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 9 04:52:42.780708 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 9 04:52:42.780715 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 9 04:52:42.780723 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 04:52:42.780730 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 04:52:42.780738 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 04:52:42.780745 kernel: PCI: CLS 0 bytes, default 64
Sep 9 04:52:42.780753 kernel: kvm [1]: HYP mode not available
Sep 9 04:52:42.780766 kernel: Initialise system trusted keyrings
Sep 9 04:52:42.780774 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 9 04:52:42.780781 kernel: Key type asymmetric registered
Sep 9 04:52:42.780788 kernel: Asymmetric key parser 'x509' registered
Sep 9 04:52:42.780795 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 9 04:52:42.780805 kernel: io scheduler mq-deadline registered
Sep 9 04:52:42.780813 kernel: io scheduler kyber registered
Sep 9 04:52:42.780821 kernel: io scheduler bfq registered
Sep 9 04:52:42.780828 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Sep 9 04:52:42.780836 kernel: ACPI: button: Power Button [PWRB]
Sep 9 04:52:42.780843 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Sep 9 04:52:42.780908 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007)
Sep 9 04:52:42.780918 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 9 04:52:42.780925 kernel: thunder_xcv, ver 1.0
Sep 9 04:52:42.780932 kernel: thunder_bgx, ver 1.0
Sep 9 04:52:42.780940 kernel: nicpf, ver 1.0
Sep 9 04:52:42.780947 kernel: nicvf, ver 1.0
Sep 9 04:52:42.781015 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 9 04:52:42.781073 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-09T04:52:42 UTC (1757393562)
Sep 9 04:52:42.781082 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 9 04:52:42.781090 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 9 04:52:42.781097 kernel: watchdog: NMI not fully supported
Sep 9 04:52:42.781104 kernel: watchdog: Hard watchdog permanently disabled
Sep 9 04:52:42.781113 kernel: NET: Registered PF_INET6 protocol family
Sep 9 04:52:42.781120 kernel: Segment Routing with IPv6
Sep 9 04:52:42.781127 kernel: In-situ OAM (IOAM) with IPv6
Sep 9 04:52:42.781134 kernel: NET: Registered PF_PACKET protocol family
Sep 9 04:52:42.781141 kernel: Key type dns_resolver registered
Sep 9 04:52:42.781148 kernel: registered taskstats version 1
Sep 9 04:52:42.781155 kernel: Loading compiled-in X.509 certificates
Sep 9 04:52:42.781163 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 44d1e8b5c5ffbaa3cedd99c03d41580671fabec5'
Sep 9 04:52:42.781170 kernel: Demotion targets for Node 0: null
Sep 9 04:52:42.781178 kernel: Key type .fscrypt registered
Sep 9 04:52:42.781185 kernel: Key type fscrypt-provisioning registered
Sep 9 04:52:42.781192 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 9 04:52:42.781200 kernel: ima: Allocated hash algorithm: sha1
Sep 9 04:52:42.781207 kernel: ima: No architecture policies found
Sep 9 04:52:42.781214 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 9 04:52:42.781221 kernel: clk: Disabling unused clocks
Sep 9 04:52:42.781228 kernel: PM: genpd: Disabling unused power domains
Sep 9 04:52:42.781235 kernel: Warning: unable to open an initial console.
Sep 9 04:52:42.781243 kernel: Freeing unused kernel memory: 38976K
Sep 9 04:52:42.781250 kernel: Run /init as init process
Sep 9 04:52:42.781257 kernel: with arguments:
Sep 9 04:52:42.781264 kernel: /init
Sep 9 04:52:42.781271 kernel: with environment:
Sep 9 04:52:42.781277 kernel: HOME=/
Sep 9 04:52:42.781284 kernel: TERM=linux
Sep 9 04:52:42.781292 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 9 04:52:42.781300 systemd[1]: Successfully made /usr/ read-only.
Sep 9 04:52:42.781312 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 04:52:42.781329 systemd[1]: Detected virtualization kvm.
Sep 9 04:52:42.781337 systemd[1]: Detected architecture arm64.
Sep 9 04:52:42.781344 systemd[1]: Running in initrd.
Sep 9 04:52:42.781351 systemd[1]: No hostname configured, using default hostname.
Sep 9 04:52:42.781359 systemd[1]: Hostname set to .
Sep 9 04:52:42.781367 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 04:52:42.781376 systemd[1]: Queued start job for default target initrd.target.
Sep 9 04:52:42.781383 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:52:42.781391 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:52:42.781399 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 04:52:42.781407 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 04:52:42.781414 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 04:52:42.781423 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 04:52:42.781432 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 04:52:42.781440 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 04:52:42.781448 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:52:42.781455 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:52:42.781463 systemd[1]: Reached target paths.target - Path Units.
Sep 9 04:52:42.781471 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 04:52:42.781478 systemd[1]: Reached target swap.target - Swaps.
Sep 9 04:52:42.781486 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 04:52:42.781494 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 04:52:42.781510 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 04:52:42.781522 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 04:52:42.781530 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 04:52:42.781537 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:52:42.781545 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:52:42.781552 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:52:42.781560 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 04:52:42.781567 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 04:52:42.781576 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 04:52:42.781584 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 9 04:52:42.781592 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 9 04:52:42.781599 systemd[1]: Starting systemd-fsck-usr.service...
Sep 9 04:52:42.781607 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 04:52:42.781615 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 04:52:42.781623 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:52:42.781630 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 9 04:52:42.781640 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:52:42.781647 systemd[1]: Finished systemd-fsck-usr.service.
Sep 9 04:52:42.781655 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 04:52:42.781677 systemd-journald[243]: Collecting audit messages is disabled.
Sep 9 04:52:42.781698 systemd-journald[243]: Journal started
Sep 9 04:52:42.781715 systemd-journald[243]: Runtime Journal (/run/log/journal/16a3c34739514c7db9bbbe09e9feeac7) is 6M, max 48.5M, 42.4M free.
Sep 9 04:52:42.786422 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 9 04:52:42.771950 systemd-modules-load[245]: Inserted module 'overlay'
Sep 9 04:52:42.788220 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:52:42.788240 kernel: Bridge firewalling registered
Sep 9 04:52:42.789845 systemd-modules-load[245]: Inserted module 'br_netfilter'
Sep 9 04:52:42.791670 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 04:52:42.793142 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:52:42.796369 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:52:42.801650 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 9 04:52:42.803590 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 04:52:42.805827 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 04:52:42.825470 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 04:52:42.829116 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:52:42.831830 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:52:42.835120 systemd-tmpfiles[272]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 9 04:52:42.837171 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 04:52:42.838886 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:52:42.842393 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 9 04:52:42.844978 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 04:52:42.871690 dracut-cmdline[291]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90
Sep 9 04:52:42.886411 systemd-resolved[292]: Positive Trust Anchors:
Sep 9 04:52:42.886440 systemd-resolved[292]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 04:52:42.886472 systemd-resolved[292]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 04:52:42.891363 systemd-resolved[292]: Defaulting to hostname 'linux'.
Sep 9 04:52:42.892411 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 04:52:42.895873 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:52:42.940362 kernel: SCSI subsystem initialized
Sep 9 04:52:42.945337 kernel: Loading iSCSI transport class v2.0-870.
Sep 9 04:52:42.953368 kernel: iscsi: registered transport (tcp)
Sep 9 04:52:42.965424 kernel: iscsi: registered transport (qla4xxx)
Sep 9 04:52:42.965471 kernel: QLogic iSCSI HBA Driver
Sep 9 04:52:42.983430 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 04:52:42.999894 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:52:43.003866 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 04:52:43.046795 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 9 04:52:43.049284 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 9 04:52:43.112374 kernel: raid6: neonx8 gen() 15812 MB/s
Sep 9 04:52:43.129351 kernel: raid6: neonx4 gen() 15823 MB/s
Sep 9 04:52:43.146383 kernel: raid6: neonx2 gen() 13190 MB/s
Sep 9 04:52:43.163376 kernel: raid6: neonx1 gen() 10469 MB/s
Sep 9 04:52:43.180349 kernel: raid6: int64x8 gen() 6900 MB/s
Sep 9 04:52:43.197362 kernel: raid6: int64x4 gen() 7341 MB/s
Sep 9 04:52:43.214352 kernel: raid6: int64x2 gen() 6104 MB/s
Sep 9 04:52:43.231351 kernel: raid6: int64x1 gen() 5047 MB/s
Sep 9 04:52:43.231372 kernel: raid6: using algorithm neonx4 gen() 15823 MB/s
Sep 9 04:52:43.248353 kernel: raid6: .... xor() 12336 MB/s, rmw enabled
Sep 9 04:52:43.248369 kernel: raid6: using neon recovery algorithm
Sep 9 04:52:43.253526 kernel: xor: measuring software checksum speed
Sep 9 04:52:43.253556 kernel: 8regs : 21556 MB/sec
Sep 9 04:52:43.254636 kernel: 32regs : 21693 MB/sec
Sep 9 04:52:43.254651 kernel: arm64_neon : 28070 MB/sec
Sep 9 04:52:43.254661 kernel: xor: using function: arm64_neon (28070 MB/sec)
Sep 9 04:52:43.306354 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 9 04:52:43.313088 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 04:52:43.315912 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:52:43.343753 systemd-udevd[502]: Using default interface naming scheme 'v255'.
Sep 9 04:52:43.347803 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:52:43.350912 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 9 04:52:43.381813 dracut-pre-trigger[511]: rd.md=0: removing MD RAID activation
Sep 9 04:52:43.404167 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 04:52:43.406666 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 04:52:43.462259 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:52:43.467182 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 9 04:52:43.518108 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues
Sep 9 04:52:43.518251 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Sep 9 04:52:43.522668 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 9 04:52:43.522714 kernel: GPT:9289727 != 19775487
Sep 9 04:52:43.523733 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 9 04:52:43.524377 kernel: GPT:9289727 != 19775487
Sep 9 04:52:43.524408 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 9 04:52:43.530382 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 04:52:43.537223 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 04:52:43.538535 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:52:43.540913 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:52:43.543721 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:52:43.570738 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Sep 9 04:52:43.572309 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:52:43.576136 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 9 04:52:43.584668 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Sep 9 04:52:43.597979 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Sep 9 04:52:43.599368 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Sep 9 04:52:43.609875 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 9 04:52:43.611191 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 04:52:43.613537 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 04:52:43.615697 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 04:52:43.618476 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 9 04:52:43.620443 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 9 04:52:43.639742 disk-uuid[592]: Primary Header is updated.
Sep 9 04:52:43.639742 disk-uuid[592]: Secondary Entries is updated.
Sep 9 04:52:43.639742 disk-uuid[592]: Secondary Header is updated.
Sep 9 04:52:43.644353 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 04:52:43.645419 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 04:52:44.654145 disk-uuid[597]: The operation has completed successfully.
Sep 9 04:52:44.655286 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 04:52:44.680872 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 9 04:52:44.680967 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 9 04:52:44.710598 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 9 04:52:44.728146 sh[611]: Success
Sep 9 04:52:44.740924 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 9 04:52:44.740964 kernel: device-mapper: uevent: version 1.0.3
Sep 9 04:52:44.740979 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 9 04:52:44.747346 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Sep 9 04:52:44.776543 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 9 04:52:44.779400 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 9 04:52:44.800689 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 9 04:52:44.806341 kernel: BTRFS: device fsid 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (623)
Sep 9 04:52:44.808519 kernel: BTRFS info (device dm-0): first mount of filesystem 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364
Sep 9 04:52:44.808556 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 9 04:52:44.811843 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 9 04:52:44.811875 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 9 04:52:44.812880 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 9 04:52:44.814140 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 04:52:44.815543 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 9 04:52:44.816248 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 9 04:52:44.817835 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 9 04:52:44.844112 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (654)
Sep 9 04:52:44.844150 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:52:44.844160 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 04:52:44.848351 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 04:52:44.848387 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 04:52:44.852376 kernel: BTRFS info (device vda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:52:44.853828 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 9 04:52:44.856040 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 9 04:52:44.919275 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 04:52:44.923423 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 04:52:44.958001 systemd-networkd[796]: lo: Link UP
Sep 9 04:52:44.958926 systemd-networkd[796]: lo: Gained carrier
Sep 9 04:52:44.960489 systemd-networkd[796]: Enumeration completed
Sep 9 04:52:44.961354 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 04:52:44.961659 systemd-networkd[796]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:52:44.961663 systemd-networkd[796]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 04:52:44.964514 ignition[700]: Ignition 2.22.0
Sep 9 04:52:44.962636 systemd[1]: Reached target network.target - Network.
Sep 9 04:52:44.964521 ignition[700]: Stage: fetch-offline
Sep 9 04:52:44.964389 systemd-networkd[796]: eth0: Link UP
Sep 9 04:52:44.964561 ignition[700]: no configs at "/usr/lib/ignition/base.d"
Sep 9 04:52:44.964525 systemd-networkd[796]: eth0: Gained carrier
Sep 9 04:52:44.964569 ignition[700]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:52:44.964534 systemd-networkd[796]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:52:44.964659 ignition[700]: parsed url from cmdline: ""
Sep 9 04:52:44.964662 ignition[700]: no config URL provided
Sep 9 04:52:44.964666 ignition[700]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 04:52:44.964672 ignition[700]: no config at "/usr/lib/ignition/user.ign"
Sep 9 04:52:44.964689 ignition[700]: op(1): [started] loading QEMU firmware config module
Sep 9 04:52:44.964701 ignition[700]: op(1): executing: "modprobe" "qemu_fw_cfg"
Sep 9 04:52:44.972723 ignition[700]: op(1): [finished] loading QEMU firmware config module
Sep 9 04:52:44.986396 systemd-networkd[796]: eth0: DHCPv4 address 10.0.0.40/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 9 04:52:45.024389 ignition[700]: parsing config with SHA512: 2dd0a01aafe5ed17b841157c2e6edc003bb1f054527f115c0f766009b63e14bf49e9f6336a4d7086ce5108f03abb4a64937b780d9a6ae0dc2f34992dd726fdde
Sep 9 04:52:45.031057 unknown[700]: fetched base config from "system"
Sep 9 04:52:45.031069 unknown[700]: fetched user config from "qemu"
Sep 9 04:52:45.031546 ignition[700]: fetch-offline: fetch-offline passed
Sep 9 04:52:45.031610 ignition[700]: Ignition finished successfully
Sep 9 04:52:45.034205 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 04:52:45.035646 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Sep 9 04:52:45.036430 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 9 04:52:45.074933 ignition[808]: Ignition 2.22.0
Sep 9 04:52:45.074950 ignition[808]: Stage: kargs
Sep 9 04:52:45.075075 ignition[808]: no configs at "/usr/lib/ignition/base.d"
Sep 9 04:52:45.075083 ignition[808]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:52:45.075818 ignition[808]: kargs: kargs passed
Sep 9 04:52:45.078942 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 9 04:52:45.075862 ignition[808]: Ignition finished successfully
Sep 9 04:52:45.081055 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 9 04:52:45.109798 ignition[816]: Ignition 2.22.0
Sep 9 04:52:45.109815 ignition[816]: Stage: disks
Sep 9 04:52:45.109954 ignition[816]: no configs at "/usr/lib/ignition/base.d"
Sep 9 04:52:45.109962 ignition[816]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:52:45.110706 ignition[816]: disks: disks passed
Sep 9 04:52:45.112961 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 9 04:52:45.110746 ignition[816]: Ignition finished successfully
Sep 9 04:52:45.114450 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 9 04:52:45.115736 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 9 04:52:45.117676 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 04:52:45.119277 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 04:52:45.121278 systemd[1]: Reached target basic.target - Basic System.
Sep 9 04:52:45.124053 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 9 04:52:45.147552 systemd-fsck[826]: ROOT: clean, 15/553520 files, 52789/553472 blocks
Sep 9 04:52:45.153896 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 9 04:52:45.156455 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 9 04:52:45.212348 kernel: EXT4-fs (vda9): mounted filesystem 88574756-967d-44b3-be66-46689c8baf27 r/w with ordered data mode. Quota mode: none.
Sep 9 04:52:45.212696 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 9 04:52:45.213956 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 9 04:52:45.216785 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 04:52:45.218407 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 9 04:52:45.219401 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 9 04:52:45.219441 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 9 04:52:45.219464 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 04:52:45.238032 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 9 04:52:45.240212 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 9 04:52:45.249383 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (834)
Sep 9 04:52:45.251333 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:52:45.251370 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 04:52:45.253830 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 04:52:45.254248 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 04:52:45.255852 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 04:52:45.283443 initrd-setup-root[859]: cut: /sysroot/etc/passwd: No such file or directory
Sep 9 04:52:45.288348 initrd-setup-root[866]: cut: /sysroot/etc/group: No such file or directory
Sep 9 04:52:45.291651 initrd-setup-root[873]: cut: /sysroot/etc/shadow: No such file or directory
Sep 9 04:52:45.294932 initrd-setup-root[880]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 9 04:52:45.365440 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 9 04:52:45.367447 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 9 04:52:45.369040 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 9 04:52:45.395371 kernel: BTRFS info (device vda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:52:45.409106 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 9 04:52:45.422737 ignition[948]: INFO : Ignition 2.22.0
Sep 9 04:52:45.422737 ignition[948]: INFO : Stage: mount
Sep 9 04:52:45.425340 ignition[948]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 04:52:45.425340 ignition[948]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:52:45.425340 ignition[948]: INFO : mount: mount passed
Sep 9 04:52:45.425340 ignition[948]: INFO : Ignition finished successfully
Sep 9 04:52:45.426129 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 9 04:52:45.428185 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 9 04:52:45.806729 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 9 04:52:45.808230 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 04:52:45.836337 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (961)
Sep 9 04:52:45.837935 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:52:45.837949 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 04:52:45.840344 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 04:52:45.840368 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 04:52:45.841439 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 04:52:45.870700 ignition[978]: INFO : Ignition 2.22.0
Sep 9 04:52:45.870700 ignition[978]: INFO : Stage: files
Sep 9 04:52:45.872381 ignition[978]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 04:52:45.872381 ignition[978]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:52:45.872381 ignition[978]: DEBUG : files: compiled without relabeling support, skipping
Sep 9 04:52:45.872381 ignition[978]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 9 04:52:45.872381 ignition[978]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 9 04:52:45.878235 ignition[978]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 9 04:52:45.878235 ignition[978]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 9 04:52:45.878235 ignition[978]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 9 04:52:45.878235 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 9 04:52:45.878235 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1
Sep 9 04:52:45.874579 unknown[978]: wrote ssh authorized keys file for user: core
Sep 9 04:52:45.913591 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 9 04:52:46.412490 systemd-networkd[796]: eth0: Gained IPv6LL
Sep 9 04:52:46.415947 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 9 04:52:46.415947 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 9 04:52:46.420140 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 9 04:52:46.420140 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 04:52:46.420140 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 04:52:46.420140 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 04:52:46.420140 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 04:52:46.420140 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 04:52:46.420140 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 04:52:46.437036 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 04:52:46.437036 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 04:52:46.437036 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 04:52:46.437036 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 04:52:46.437036 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 04:52:46.437036 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1
Sep 9 04:52:46.798517 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 9 04:52:47.102080 ignition[978]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 04:52:47.102080 ignition[978]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 9 04:52:47.105381 ignition[978]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 04:52:47.108168 ignition[978]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 04:52:47.108168 ignition[978]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 9 04:52:47.108168 ignition[978]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 9 04:52:47.108168 ignition[978]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 9 04:52:47.115319 ignition[978]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 9 04:52:47.115319 ignition[978]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 9 04:52:47.115319 ignition[978]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Sep 9 04:52:47.121976 ignition[978]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 9 04:52:47.124527 ignition[978]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 9 04:52:47.127719 ignition[978]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 9 04:52:47.127719 ignition[978]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Sep 9 04:52:47.127719 ignition[978]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Sep 9 04:52:47.127719 ignition[978]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 04:52:47.127719 ignition[978]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 04:52:47.127719 ignition[978]: INFO : files: files passed
Sep 9 04:52:47.127719 ignition[978]: INFO : Ignition finished successfully
Sep 9 04:52:47.127999 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 9 04:52:47.131256 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 9 04:52:47.133388 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 04:52:47.152267 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 9 04:52:47.153431 initrd-setup-root-after-ignition[1008]: grep: /sysroot/oem/oem-release: No such file or directory
Sep 9 04:52:47.154444 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 9 04:52:47.156307 initrd-setup-root-after-ignition[1010]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 04:52:47.156307 initrd-setup-root-after-ignition[1010]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 04:52:47.158968 initrd-setup-root-after-ignition[1014]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 04:52:47.158542 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 04:52:47.160272 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 9 04:52:47.163007 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 9 04:52:47.202125 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 9 04:52:47.202230 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 9 04:52:47.204567 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 9 04:52:47.206377 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 9 04:52:47.208249 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 9 04:52:47.208979 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 9 04:52:47.231215 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 04:52:47.233606 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 9 04:52:47.251203 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:52:47.252510 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 04:52:47.254701 systemd[1]: Stopped target timers.target - Timer Units.
Sep 9 04:52:47.256568 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 9 04:52:47.256686 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 04:52:47.259375 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 9 04:52:47.261417 systemd[1]: Stopped target basic.target - Basic System.
Sep 9 04:52:47.263163 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 9 04:52:47.264914 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 04:52:47.266926 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 9 04:52:47.268991 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 04:52:47.271013 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 9 04:52:47.273030 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 04:52:47.275140 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 9 04:52:47.277301 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 9 04:52:47.279153 systemd[1]: Stopped target swap.target - Swaps.
Sep 9 04:52:47.280815 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 9 04:52:47.280940 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 04:52:47.283337 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:52:47.285476 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:52:47.287494 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 04:52:47.288495 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:52:47.289882 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 04:52:47.289993 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 04:52:47.293039 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 04:52:47.293159 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 04:52:47.295273 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 04:52:47.297045 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 04:52:47.298024 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:52:47.299422 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 04:52:47.301384 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 04:52:47.303104 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 04:52:47.303191 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 04:52:47.305030 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 04:52:47.305112 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 04:52:47.307540 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 04:52:47.307656 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 04:52:47.309678 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 04:52:47.309811 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 04:52:47.312246 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 04:52:47.314132 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 04:52:47.314271 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:52:47.326893 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 04:52:47.327780 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 04:52:47.327914 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:52:47.329996 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 04:52:47.330102 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 04:52:47.336257 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 04:52:47.336378 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 04:52:47.343288 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 04:52:47.345927 ignition[1035]: INFO : Ignition 2.22.0
Sep 9 04:52:47.345927 ignition[1035]: INFO : Stage: umount
Sep 9 04:52:47.347780 ignition[1035]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 04:52:47.347780 ignition[1035]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:52:47.347780 ignition[1035]: INFO : umount: umount passed
Sep 9 04:52:47.347780 ignition[1035]: INFO : Ignition finished successfully
Sep 9 04:52:47.350841 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 04:52:47.350931 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 04:52:47.352233 systemd[1]: Stopped target network.target - Network.
Sep 9 04:52:47.353826 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 04:52:47.353883 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 04:52:47.355704 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 04:52:47.355762 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 04:52:47.357585 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 04:52:47.357638 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 04:52:47.359487 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 04:52:47.359530 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 04:52:47.361418 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 04:52:47.363179 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 04:52:47.372296 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 04:52:47.372462 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 04:52:47.375946 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 04:52:47.376188 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 04:52:47.376282 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 04:52:47.380168 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 04:52:47.380730 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 04:52:47.383120 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 04:52:47.383157 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:52:47.386226 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 04:52:47.387267 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 04:52:47.387335 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 04:52:47.389734 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 04:52:47.389791 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:52:47.393714 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 04:52:47.393768 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:52:47.397692 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 04:52:47.397758 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:52:47.405533 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:52:47.408992 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 04:52:47.409054 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 04:52:47.410148 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 04:52:47.410233 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 04:52:47.412571 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 04:52:47.412652 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 04:52:47.418929 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 04:52:47.425531 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:52:47.427058 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 04:52:47.427092 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:52:47.429599 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 04:52:47.429634 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:52:47.431428 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 04:52:47.431478 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 04:52:47.434282 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 04:52:47.434341 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 04:52:47.437382 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 04:52:47.437434 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 04:52:47.441419 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 04:52:47.442542 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 04:52:47.442606 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:52:47.446000 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 04:52:47.446045 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:52:47.449825 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 04:52:47.449886 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:52:47.454492 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 9 04:52:47.454541 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 9 04:52:47.454572 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 04:52:47.454823 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 04:52:47.454925 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 04:52:47.457355 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 04:52:47.457437 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 04:52:47.460139 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 04:52:47.462638 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 04:52:47.481123 systemd[1]: Switching root.
Sep 9 04:52:47.510669 systemd-journald[243]: Journal stopped
Sep 9 04:52:48.232266 systemd-journald[243]: Received SIGTERM from PID 1 (systemd).
Sep 9 04:52:48.232315 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 04:52:48.232348 kernel: SELinux: policy capability open_perms=1
Sep 9 04:52:48.232359 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 04:52:48.232369 kernel: SELinux: policy capability always_check_network=0
Sep 9 04:52:48.232382 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 04:52:48.232393 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 04:52:48.232406 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 04:52:48.232415 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 04:52:48.232424 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 04:52:48.232433 kernel: audit: type=1403 audit(1757393567.678:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 04:52:48.232447 systemd[1]: Successfully loaded SELinux policy in 53.045ms.
Sep 9 04:52:48.232468 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.560ms.
Sep 9 04:52:48.232480 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 04:52:48.232491 systemd[1]: Detected virtualization kvm.
Sep 9 04:52:48.232501 systemd[1]: Detected architecture arm64.
Sep 9 04:52:48.232515 systemd[1]: Detected first boot.
Sep 9 04:52:48.232525 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 04:52:48.232536 zram_generator::config[1080]: No configuration found.
Sep 9 04:52:48.232548 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 04:52:48.232557 systemd[1]: Populated /etc with preset unit settings.
Sep 9 04:52:48.232567 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 04:52:48.232577 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 04:52:48.232599 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 04:52:48.232609 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 04:52:48.232619 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 04:52:48.232639 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 04:52:48.232649 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 04:52:48.232661 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 04:52:48.232675 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 04:52:48.232685 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 04:52:48.232696 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 04:52:48.232705 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 04:52:48.232715 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:52:48.232725 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:52:48.232735 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 04:52:48.232752 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 04:52:48.232765 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 04:52:48.232775 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 04:52:48.232785 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 9 04:52:48.232795 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:52:48.232805 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:52:48.232814 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 04:52:48.232824 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 04:52:48.232835 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 04:52:48.232845 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 04:52:48.232855 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 04:52:48.232865 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 04:52:48.232874 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 04:52:48.232885 systemd[1]: Reached target swap.target - Swaps.
Sep 9 04:52:48.232894 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 04:52:48.232904 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 04:52:48.232913 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 04:52:48.232925 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:52:48.232935 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:52:48.232946 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:52:48.232957 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 04:52:48.232966 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 04:52:48.232976 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 04:52:48.232986 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 04:52:48.232995 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 04:52:48.233005 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 04:52:48.233016 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 04:52:48.233026 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 04:52:48.233036 systemd[1]: Reached target machines.target - Containers.
Sep 9 04:52:48.233045 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 04:52:48.233055 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:52:48.233065 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 04:52:48.233075 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 04:52:48.233084 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:52:48.233094 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 04:52:48.233105 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:52:48.233114 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 04:52:48.233124 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:52:48.233134 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 04:52:48.233144 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 04:52:48.233153 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 04:52:48.233163 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 04:52:48.233172 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 04:52:48.233184 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:52:48.233196 kernel: fuse: init (API version 7.41)
Sep 9 04:52:48.233205 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 04:52:48.233215 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 04:52:48.233225 kernel: loop: module loaded
Sep 9 04:52:48.233233 kernel: ACPI: bus type drm_connector registered
Sep 9 04:52:48.233242 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 04:52:48.233253 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 04:52:48.233263 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 04:52:48.233274 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 04:52:48.233301 systemd-journald[1152]: Collecting audit messages is disabled.
Sep 9 04:52:48.233386 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 04:52:48.233399 systemd[1]: Stopped verity-setup.service.
Sep 9 04:52:48.233413 systemd-journald[1152]: Journal started
Sep 9 04:52:48.233432 systemd-journald[1152]: Runtime Journal (/run/log/journal/16a3c34739514c7db9bbbe09e9feeac7) is 6M, max 48.5M, 42.4M free.
Sep 9 04:52:48.033378 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 04:52:48.056296 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 9 04:52:48.056707 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 04:52:48.238006 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 04:52:48.238671 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 04:52:48.239797 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 04:52:48.241031 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 04:52:48.242156 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 04:52:48.243467 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 04:52:48.244656 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 04:52:48.247343 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 04:52:48.248919 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:52:48.251657 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 04:52:48.251867 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 04:52:48.253519 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:52:48.253685 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:52:48.255301 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 04:52:48.255527 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 04:52:48.257067 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:52:48.257223 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:52:48.258912 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 04:52:48.259066 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 04:52:48.260579 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:52:48.260750 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:52:48.262184 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:52:48.263685 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:52:48.265451 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 04:52:48.267108 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 04:52:48.278961 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 04:52:48.281516 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 04:52:48.283648 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 04:52:48.285011 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 04:52:48.285048 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 04:52:48.286974 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 04:52:48.296122 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 04:52:48.297450 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:52:48.298671 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 04:52:48.300818 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 04:52:48.302209 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 04:52:48.305498 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 04:52:48.306752 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 04:52:48.308058 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 04:52:48.311671 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 04:52:48.312260 systemd-journald[1152]: Time spent on flushing to /var/log/journal/16a3c34739514c7db9bbbe09e9feeac7 is 26.774ms for 884 entries.
Sep 9 04:52:48.312260 systemd-journald[1152]: System Journal (/var/log/journal/16a3c34739514c7db9bbbe09e9feeac7) is 8M, max 195.6M, 187.6M free.
Sep 9 04:52:48.357467 systemd-journald[1152]: Received client request to flush runtime journal.
Sep 9 04:52:48.357517 kernel: loop0: detected capacity change from 0 to 119368
Sep 9 04:52:48.315512 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 04:52:48.320784 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:52:48.323117 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 04:52:48.327305 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 04:52:48.331925 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 04:52:48.335358 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 04:52:48.339618 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 04:52:48.342397 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:52:48.358592 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 04:52:48.362454 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 04:52:48.364758 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 04:52:48.368385 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 04:52:48.371162 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 04:52:48.393349 kernel: loop1: detected capacity change from 0 to 100632
Sep 9 04:52:48.400241 systemd-tmpfiles[1213]: ACLs are not supported, ignoring.
Sep 9 04:52:48.400259 systemd-tmpfiles[1213]: ACLs are not supported, ignoring.
Sep 9 04:52:48.404262 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:52:48.418351 kernel: loop2: detected capacity change from 0 to 207008
Sep 9 04:52:48.452397 kernel: loop3: detected capacity change from 0 to 119368
Sep 9 04:52:48.462351 kernel: loop4: detected capacity change from 0 to 100632
Sep 9 04:52:48.465345 kernel: loop5: detected capacity change from 0 to 207008
Sep 9 04:52:48.468805 (sd-merge)[1219]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Sep 9 04:52:48.469191 (sd-merge)[1219]: Merged extensions into '/usr'.
Sep 9 04:52:48.472846 systemd[1]: Reload requested from client PID 1196 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 04:52:48.472864 systemd[1]: Reloading...
Sep 9 04:52:48.517872 zram_generator::config[1243]: No configuration found.
Sep 9 04:52:48.547543 ldconfig[1191]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 9 04:52:48.665831 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 04:52:48.666206 systemd[1]: Reloading finished in 192 ms.
Sep 9 04:52:48.694730 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 9 04:52:48.696196 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 04:52:48.713446 systemd[1]: Starting ensure-sysext.service...
Sep 9 04:52:48.715185 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 04:52:48.723932 systemd[1]: Reload requested from client PID 1279 ('systemctl') (unit ensure-sysext.service)...
Sep 9 04:52:48.723948 systemd[1]: Reloading...
Sep 9 04:52:48.727979 systemd-tmpfiles[1280]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 04:52:48.728007 systemd-tmpfiles[1280]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 04:52:48.728225 systemd-tmpfiles[1280]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 04:52:48.728433 systemd-tmpfiles[1280]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 04:52:48.729048 systemd-tmpfiles[1280]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 04:52:48.729250 systemd-tmpfiles[1280]: ACLs are not supported, ignoring.
Sep 9 04:52:48.729297 systemd-tmpfiles[1280]: ACLs are not supported, ignoring.
Sep 9 04:52:48.732013 systemd-tmpfiles[1280]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 04:52:48.732029 systemd-tmpfiles[1280]: Skipping /boot
Sep 9 04:52:48.737885 systemd-tmpfiles[1280]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 04:52:48.737901 systemd-tmpfiles[1280]: Skipping /boot
Sep 9 04:52:48.770347 zram_generator::config[1305]: No configuration found.
Sep 9 04:52:48.896254 systemd[1]: Reloading finished in 172 ms.
Sep 9 04:52:48.921396 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 04:52:48.926793 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:52:48.938230 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 04:52:48.940436 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 04:52:48.956495 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 04:52:48.961674 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 04:52:48.964264 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:52:48.967236 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 04:52:48.973171 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 04:52:48.974789 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 9 04:52:48.980720 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:52:48.981773 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:52:48.985509 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:52:48.989330 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:52:48.990446 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:52:48.990557 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:52:48.992561 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 9 04:52:48.994662 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:52:48.994847 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:52:48.996827 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:52:48.998364 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:52:48.999961 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:52:49.000083 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:52:49.003082 augenrules[1375]: No rules
Sep 9 04:52:49.004387 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 9 04:52:49.005708 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 04:52:49.005917 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 04:52:49.010227 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 9 04:52:49.015034 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:52:49.016492 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:52:49.018590 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:52:49.020952 systemd-udevd[1354]: Using default interface naming scheme 'v255'.
Sep 9 04:52:49.029551 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:52:49.030792 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:52:49.030895 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:52:49.030972 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 9 04:52:49.031857 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 9 04:52:49.035074 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 9 04:52:49.036823 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:52:49.036957 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:52:49.038431 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:52:49.040085 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:52:49.045349 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:52:49.046913 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:52:49.047078 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:52:49.069035 systemd[1]: Finished ensure-sysext.service.
Sep 9 04:52:49.076398 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 04:52:49.078507 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:52:49.081624 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:52:49.086507 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 04:52:49.100448 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:52:49.103218 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:52:49.105489 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:52:49.105542 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:52:49.107982 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 04:52:49.111457 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 9 04:52:49.112269 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 9 04:52:49.112731 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:52:49.112929 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:52:49.115522 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 04:52:49.115668 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 04:52:49.122967 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:52:49.123147 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:52:49.130060 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:52:49.130402 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:52:49.132452 augenrules[1426]: /sbin/augenrules: No change
Sep 9 04:52:49.138232 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 9 04:52:49.138387 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 04:52:49.138439 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 04:52:49.149830 augenrules[1458]: No rules
Sep 9 04:52:49.150549 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 04:52:49.151393 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 04:52:49.154490 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 9 04:52:49.159758 systemd-resolved[1353]: Positive Trust Anchors:
Sep 9 04:52:49.160010 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 9 04:52:49.160578 systemd-resolved[1353]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 04:52:49.160682 systemd-resolved[1353]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 04:52:49.169358 systemd-resolved[1353]: Defaulting to hostname 'linux'.
Sep 9 04:52:49.171052 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 04:52:49.173555 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:52:49.182351 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 9 04:52:49.209889 systemd-networkd[1434]: lo: Link UP
Sep 9 04:52:49.209897 systemd-networkd[1434]: lo: Gained carrier
Sep 9 04:52:49.210696 systemd-networkd[1434]: Enumeration completed
Sep 9 04:52:49.210797 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 04:52:49.211456 systemd-networkd[1434]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:52:49.211466 systemd-networkd[1434]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 04:52:49.212072 systemd[1]: Reached target network.target - Network.
Sep 9 04:52:49.212400 systemd-networkd[1434]: eth0: Link UP
Sep 9 04:52:49.212524 systemd-networkd[1434]: eth0: Gained carrier
Sep 9 04:52:49.212539 systemd-networkd[1434]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:52:49.214755 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 9 04:52:49.217768 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 9 04:52:49.219211 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 9 04:52:49.221528 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 04:52:49.222732 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 9 04:52:49.224046 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 9 04:52:49.227416 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 9 04:52:49.232371 systemd-networkd[1434]: eth0: DHCPv4 address 10.0.0.40/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 9 04:52:49.233224 systemd-timesyncd[1438]: Network configuration changed, trying to establish connection.
Sep 9 04:52:49.234874 systemd-timesyncd[1438]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Sep 9 04:52:49.234930 systemd-timesyncd[1438]: Initial clock synchronization to Tue 2025-09-09 04:52:49.126430 UTC.
Sep 9 04:52:49.237220 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 9 04:52:49.237252 systemd[1]: Reached target paths.target - Path Units.
Sep 9 04:52:49.238516 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 04:52:49.239995 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 9 04:52:49.241590 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 9 04:52:49.244382 systemd[1]: Reached target timers.target - Timer Units. Sep 9 04:52:49.245872 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 9 04:52:49.248160 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 9 04:52:49.252141 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 9 04:52:49.254574 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 9 04:52:49.257380 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 9 04:52:49.260672 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 9 04:52:49.262161 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 9 04:52:49.264056 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 9 04:52:49.270965 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 04:52:49.274189 systemd[1]: Reached target basic.target - Basic System. Sep 9 04:52:49.275256 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 9 04:52:49.275435 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 9 04:52:49.276951 systemd[1]: Starting containerd.service - containerd container runtime... Sep 9 04:52:49.279163 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 9 04:52:49.281202 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 9 04:52:49.293227 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 9 04:52:49.295161 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... 
Sep 9 04:52:49.296063 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 9 04:52:49.297124 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 9 04:52:49.299723 jq[1493]: false Sep 9 04:52:49.300515 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 9 04:52:49.304571 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 9 04:52:49.307151 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 9 04:52:49.307335 extend-filesystems[1494]: Found /dev/vda6 Sep 9 04:52:49.312053 extend-filesystems[1494]: Found /dev/vda9 Sep 9 04:52:49.313436 extend-filesystems[1494]: Checking size of /dev/vda9 Sep 9 04:52:49.312229 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 9 04:52:49.315133 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 04:52:49.317629 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 9 04:52:49.318131 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 9 04:52:49.318716 systemd[1]: Starting update-engine.service - Update Engine... Sep 9 04:52:49.322467 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 9 04:52:49.327529 extend-filesystems[1494]: Resized partition /dev/vda9 Sep 9 04:52:49.330375 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. 
Sep 9 04:52:49.332068 extend-filesystems[1522]: resize2fs 1.47.3 (8-Jul-2025) Sep 9 04:52:49.333609 jq[1517]: true Sep 9 04:52:49.333463 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 9 04:52:49.335262 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 9 04:52:49.335463 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 9 04:52:49.335838 systemd[1]: motdgen.service: Deactivated successfully. Sep 9 04:52:49.335989 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 9 04:52:49.340354 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 9 04:52:49.344065 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 9 04:52:49.344229 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 9 04:52:49.359026 (ntainerd)[1526]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 9 04:52:49.370665 tar[1524]: linux-arm64/LICENSE Sep 9 04:52:49.370665 tar[1524]: linux-arm64/helm Sep 9 04:52:49.382541 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 9 04:52:49.384656 dbus-daemon[1491]: [system] SELinux support is enabled Sep 9 04:52:49.384898 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 9 04:52:49.389249 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 9 04:52:49.395454 jq[1525]: true Sep 9 04:52:49.389271 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Sep 9 04:52:49.391183 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 04:52:49.391199 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 04:52:49.393658 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 04:52:49.398482 extend-filesystems[1522]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 9 04:52:49.398482 extend-filesystems[1522]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 9 04:52:49.398482 extend-filesystems[1522]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 9 04:52:49.406474 extend-filesystems[1494]: Resized filesystem in /dev/vda9 Sep 9 04:52:49.401250 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 04:52:49.407953 update_engine[1515]: I20250909 04:52:49.400159 1515 main.cc:92] Flatcar Update Engine starting Sep 9 04:52:49.403370 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 9 04:52:49.409937 systemd-logind[1508]: Watching system buttons on /dev/input/event0 (Power Button) Sep 9 04:52:49.412260 systemd-logind[1508]: New seat seat0. Sep 9 04:52:49.413168 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 04:52:49.414566 systemd[1]: Started update-engine.service - Update Engine. Sep 9 04:52:49.417790 update_engine[1515]: I20250909 04:52:49.417728 1515 update_check_scheduler.cc:74] Next update check in 6m48s Sep 9 04:52:49.419413 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 9 04:52:49.460637 bash[1560]: Updated "/home/core/.ssh/authorized_keys" Sep 9 04:52:49.463243 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 04:52:49.465858 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
Sep 9 04:52:49.468722 locksmithd[1547]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 04:52:49.540668 containerd[1526]: time="2025-09-09T04:52:49Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 9 04:52:49.540668 containerd[1526]: time="2025-09-09T04:52:49.541016480Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 9 04:52:49.550630 containerd[1526]: time="2025-09-09T04:52:49.550590600Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.44µs" Sep 9 04:52:49.550630 containerd[1526]: time="2025-09-09T04:52:49.550622760Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 9 04:52:49.550732 containerd[1526]: time="2025-09-09T04:52:49.550639920Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 9 04:52:49.550816 containerd[1526]: time="2025-09-09T04:52:49.550795400Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 9 04:52:49.550837 containerd[1526]: time="2025-09-09T04:52:49.550818360Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 9 04:52:49.550855 containerd[1526]: time="2025-09-09T04:52:49.550841760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 04:52:49.550903 containerd[1526]: time="2025-09-09T04:52:49.550887320Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 04:52:49.550922 containerd[1526]: time="2025-09-09T04:52:49.550903480Z" level=info 
msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 04:52:49.551124 containerd[1526]: time="2025-09-09T04:52:49.551094240Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 04:52:49.551124 containerd[1526]: time="2025-09-09T04:52:49.551115600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 04:52:49.551160 containerd[1526]: time="2025-09-09T04:52:49.551126680Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 04:52:49.551160 containerd[1526]: time="2025-09-09T04:52:49.551135760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 9 04:52:49.551212 containerd[1526]: time="2025-09-09T04:52:49.551198400Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 9 04:52:49.551416 containerd[1526]: time="2025-09-09T04:52:49.551400760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 04:52:49.551447 containerd[1526]: time="2025-09-09T04:52:49.551434040Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 04:52:49.551477 containerd[1526]: time="2025-09-09T04:52:49.551447240Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 9 04:52:49.551495 containerd[1526]: time="2025-09-09T04:52:49.551474400Z" level=info msg="loading plugin" 
id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 9 04:52:49.551684 containerd[1526]: time="2025-09-09T04:52:49.551669080Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 9 04:52:49.551762 containerd[1526]: time="2025-09-09T04:52:49.551744960Z" level=info msg="metadata content store policy set" policy=shared Sep 9 04:52:49.554599 containerd[1526]: time="2025-09-09T04:52:49.554574120Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 9 04:52:49.554658 containerd[1526]: time="2025-09-09T04:52:49.554625480Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 9 04:52:49.554658 containerd[1526]: time="2025-09-09T04:52:49.554638680Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 9 04:52:49.554658 containerd[1526]: time="2025-09-09T04:52:49.554649040Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 9 04:52:49.554742 containerd[1526]: time="2025-09-09T04:52:49.554660240Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 9 04:52:49.554742 containerd[1526]: time="2025-09-09T04:52:49.554706640Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 9 04:52:49.554742 containerd[1526]: time="2025-09-09T04:52:49.554720240Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 9 04:52:49.554742 containerd[1526]: time="2025-09-09T04:52:49.554731480Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 9 04:52:49.554818 containerd[1526]: time="2025-09-09T04:52:49.554751400Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service 
type=io.containerd.service.v1 Sep 9 04:52:49.554818 containerd[1526]: time="2025-09-09T04:52:49.554762120Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 9 04:52:49.554818 containerd[1526]: time="2025-09-09T04:52:49.554771880Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 9 04:52:49.554818 containerd[1526]: time="2025-09-09T04:52:49.554783960Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 9 04:52:49.554905 containerd[1526]: time="2025-09-09T04:52:49.554885320Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 9 04:52:49.554934 containerd[1526]: time="2025-09-09T04:52:49.554911280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 9 04:52:49.554934 containerd[1526]: time="2025-09-09T04:52:49.554925960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 9 04:52:49.554966 containerd[1526]: time="2025-09-09T04:52:49.554936520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 04:52:49.554966 containerd[1526]: time="2025-09-09T04:52:49.554945880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 04:52:49.554966 containerd[1526]: time="2025-09-09T04:52:49.554954960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 04:52:49.554966 containerd[1526]: time="2025-09-09T04:52:49.554964840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 04:52:49.555032 containerd[1526]: time="2025-09-09T04:52:49.554983400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 04:52:49.555032 containerd[1526]: 
time="2025-09-09T04:52:49.554994440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 04:52:49.555032 containerd[1526]: time="2025-09-09T04:52:49.555004480Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 04:52:49.555032 containerd[1526]: time="2025-09-09T04:52:49.555014080Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 04:52:49.555217 containerd[1526]: time="2025-09-09T04:52:49.555199760Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 04:52:49.555249 containerd[1526]: time="2025-09-09T04:52:49.555219200Z" level=info msg="Start snapshots syncer" Sep 9 04:52:49.555249 containerd[1526]: time="2025-09-09T04:52:49.555245920Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 04:52:49.555496 containerd[1526]: time="2025-09-09T04:52:49.555462160Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 04:52:49.555590 containerd[1526]: time="2025-09-09T04:52:49.555510800Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 04:52:49.555590 containerd[1526]: time="2025-09-09T04:52:49.555573720Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 9 04:52:49.555708 containerd[1526]: time="2025-09-09T04:52:49.555687680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 9 04:52:49.555839 containerd[1526]: time="2025-09-09T04:52:49.555718000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 9 04:52:49.555839 containerd[1526]: time="2025-09-09T04:52:49.555756040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 9 04:52:49.555839 containerd[1526]: time="2025-09-09T04:52:49.555769080Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 9 04:52:49.555839 containerd[1526]: time="2025-09-09T04:52:49.555780400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 9 04:52:49.555839 containerd[1526]: time="2025-09-09T04:52:49.555791000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 9 04:52:49.555839 containerd[1526]: time="2025-09-09T04:52:49.555802920Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 9 04:52:49.555839 containerd[1526]: time="2025-09-09T04:52:49.555829400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 9 04:52:49.555839 containerd[1526]: time="2025-09-09T04:52:49.555839760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 9 04:52:49.555974 containerd[1526]: time="2025-09-09T04:52:49.555850200Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 9 04:52:49.555974 containerd[1526]: time="2025-09-09T04:52:49.555888240Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 04:52:49.555974 containerd[1526]: time="2025-09-09T04:52:49.555901600Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 04:52:49.555974 containerd[1526]: time="2025-09-09T04:52:49.555909680Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 04:52:49.555974 containerd[1526]: time="2025-09-09T04:52:49.555918440Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 04:52:49.555974 containerd[1526]: time="2025-09-09T04:52:49.555925760Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 04:52:49.555974 containerd[1526]: time="2025-09-09T04:52:49.555935040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 04:52:49.555974 containerd[1526]: time="2025-09-09T04:52:49.555945040Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 04:52:49.556102 containerd[1526]: time="2025-09-09T04:52:49.556017360Z" level=info msg="runtime interface created" Sep 9 04:52:49.556102 containerd[1526]: time="2025-09-09T04:52:49.556023040Z" level=info msg="created NRI interface" Sep 9 04:52:49.556102 containerd[1526]: time="2025-09-09T04:52:49.556030800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 04:52:49.556102 containerd[1526]: time="2025-09-09T04:52:49.556041120Z" level=info msg="Connect containerd service" Sep 9 04:52:49.556102 containerd[1526]: time="2025-09-09T04:52:49.556065720Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 04:52:49.556911 containerd[1526]: 
time="2025-09-09T04:52:49.556884120Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 04:52:49.621971 containerd[1526]: time="2025-09-09T04:52:49.621925080Z" level=info msg="Start subscribing containerd event" Sep 9 04:52:49.622214 containerd[1526]: time="2025-09-09T04:52:49.622119960Z" level=info msg="Start recovering state" Sep 9 04:52:49.622266 containerd[1526]: time="2025-09-09T04:52:49.622243320Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 04:52:49.622500 containerd[1526]: time="2025-09-09T04:52:49.622482240Z" level=info msg="Start event monitor" Sep 9 04:52:49.622818 containerd[1526]: time="2025-09-09T04:52:49.622705640Z" level=info msg="Start cni network conf syncer for default" Sep 9 04:52:49.622818 containerd[1526]: time="2025-09-09T04:52:49.622497920Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 9 04:52:49.622818 containerd[1526]: time="2025-09-09T04:52:49.622760280Z" level=info msg="Start streaming server" Sep 9 04:52:49.622818 containerd[1526]: time="2025-09-09T04:52:49.622790240Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 9 04:52:49.622818 containerd[1526]: time="2025-09-09T04:52:49.622798200Z" level=info msg="runtime interface starting up..." Sep 9 04:52:49.622818 containerd[1526]: time="2025-09-09T04:52:49.622804000Z" level=info msg="starting plugins..." Sep 9 04:52:49.622818 containerd[1526]: time="2025-09-09T04:52:49.622823280Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 04:52:49.622980 containerd[1526]: time="2025-09-09T04:52:49.622932880Z" level=info msg="containerd successfully booted in 0.082977s" Sep 9 04:52:49.623103 systemd[1]: Started containerd.service - containerd container runtime. 
Sep 9 04:52:49.693866 tar[1524]: linux-arm64/README.md Sep 9 04:52:49.711448 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 04:52:50.700471 systemd-networkd[1434]: eth0: Gained IPv6LL Sep 9 04:52:50.702847 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 04:52:50.706997 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 04:52:50.709568 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 9 04:52:50.711941 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 04:52:50.723670 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 04:52:50.737987 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 9 04:52:50.738488 sshd_keygen[1520]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 04:52:50.739546 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 9 04:52:50.741070 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 04:52:50.748809 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 9 04:52:50.759392 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 9 04:52:50.762008 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 04:52:50.782669 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 04:52:50.782901 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 04:52:50.785907 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 04:52:50.809655 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 04:52:50.812426 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 04:52:50.814490 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Sep 9 04:52:50.815977 systemd[1]: Reached target getty.target - Login Prompts. 
Sep 9 04:52:51.256925 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 04:52:51.258555 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 9 04:52:51.260447 (kubelet)[1630]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 04:52:51.260612 systemd[1]: Startup finished in 1.990s (kernel) + 5.077s (initrd) + 3.635s (userspace) = 10.703s. Sep 9 04:52:51.588273 kubelet[1630]: E0909 04:52:51.588152 1630 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 04:52:51.590798 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 04:52:51.590956 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 04:52:51.591241 systemd[1]: kubelet.service: Consumed 741ms CPU time, 256.4M memory peak. Sep 9 04:52:54.956489 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 9 04:52:54.957464 systemd[1]: Started sshd@0-10.0.0.40:22-10.0.0.1:36604.service - OpenSSH per-connection server daemon (10.0.0.1:36604). Sep 9 04:52:55.038502 sshd[1644]: Accepted publickey for core from 10.0.0.1 port 36604 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:52:55.040274 sshd-session[1644]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:52:55.051289 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 04:52:55.051433 systemd-logind[1508]: New session 1 of user core. Sep 9 04:52:55.052200 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... 
Sep 9 04:52:55.078396 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 04:52:55.080629 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 04:52:55.095280 (systemd)[1649]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 04:52:55.097297 systemd-logind[1508]: New session c1 of user core. Sep 9 04:52:55.192251 systemd[1649]: Queued start job for default target default.target. Sep 9 04:52:55.200164 systemd[1649]: Created slice app.slice - User Application Slice. Sep 9 04:52:55.200190 systemd[1649]: Reached target paths.target - Paths. Sep 9 04:52:55.200230 systemd[1649]: Reached target timers.target - Timers. Sep 9 04:52:55.201528 systemd[1649]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 04:52:55.210253 systemd[1649]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 04:52:55.210299 systemd[1649]: Reached target sockets.target - Sockets. Sep 9 04:52:55.210363 systemd[1649]: Reached target basic.target - Basic System. Sep 9 04:52:55.210391 systemd[1649]: Reached target default.target - Main User Target. Sep 9 04:52:55.210417 systemd[1649]: Startup finished in 108ms. Sep 9 04:52:55.210529 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 04:52:55.211769 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 04:52:55.271031 systemd[1]: Started sshd@1-10.0.0.40:22-10.0.0.1:36620.service - OpenSSH per-connection server daemon (10.0.0.1:36620). Sep 9 04:52:55.321834 sshd[1660]: Accepted publickey for core from 10.0.0.1 port 36620 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:52:55.322867 sshd-session[1660]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:52:55.326635 systemd-logind[1508]: New session 2 of user core. Sep 9 04:52:55.343474 systemd[1]: Started session-2.scope - Session 2 of User core. 
Sep 9 04:52:55.392851 sshd[1663]: Connection closed by 10.0.0.1 port 36620 Sep 9 04:52:55.393113 sshd-session[1660]: pam_unix(sshd:session): session closed for user core Sep 9 04:52:55.407071 systemd[1]: sshd@1-10.0.0.40:22-10.0.0.1:36620.service: Deactivated successfully. Sep 9 04:52:55.409435 systemd[1]: session-2.scope: Deactivated successfully. Sep 9 04:52:55.410111 systemd-logind[1508]: Session 2 logged out. Waiting for processes to exit. Sep 9 04:52:55.412051 systemd[1]: Started sshd@2-10.0.0.40:22-10.0.0.1:36622.service - OpenSSH per-connection server daemon (10.0.0.1:36622). Sep 9 04:52:55.412667 systemd-logind[1508]: Removed session 2. Sep 9 04:52:55.457748 sshd[1669]: Accepted publickey for core from 10.0.0.1 port 36622 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:52:55.458821 sshd-session[1669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:52:55.462984 systemd-logind[1508]: New session 3 of user core. Sep 9 04:52:55.473478 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 9 04:52:55.521965 sshd[1672]: Connection closed by 10.0.0.1 port 36622 Sep 9 04:52:55.522371 sshd-session[1669]: pam_unix(sshd:session): session closed for user core Sep 9 04:52:55.532192 systemd[1]: sshd@2-10.0.0.40:22-10.0.0.1:36622.service: Deactivated successfully. Sep 9 04:52:55.533531 systemd[1]: session-3.scope: Deactivated successfully. Sep 9 04:52:55.534874 systemd-logind[1508]: Session 3 logged out. Waiting for processes to exit. Sep 9 04:52:55.536054 systemd[1]: Started sshd@3-10.0.0.40:22-10.0.0.1:36628.service - OpenSSH per-connection server daemon (10.0.0.1:36628). Sep 9 04:52:55.537199 systemd-logind[1508]: Removed session 3. 
Sep 9 04:52:55.588679 sshd[1678]: Accepted publickey for core from 10.0.0.1 port 36628 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:52:55.589737 sshd-session[1678]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:55.593908 systemd-logind[1508]: New session 4 of user core.
Sep 9 04:52:55.603447 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 9 04:52:55.654227 sshd[1681]: Connection closed by 10.0.0.1 port 36628
Sep 9 04:52:55.654535 sshd-session[1678]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:55.664103 systemd[1]: sshd@3-10.0.0.40:22-10.0.0.1:36628.service: Deactivated successfully.
Sep 9 04:52:55.665520 systemd[1]: session-4.scope: Deactivated successfully.
Sep 9 04:52:55.666907 systemd-logind[1508]: Session 4 logged out. Waiting for processes to exit.
Sep 9 04:52:55.668825 systemd[1]: Started sshd@4-10.0.0.40:22-10.0.0.1:36644.service - OpenSSH per-connection server daemon (10.0.0.1:36644).
Sep 9 04:52:55.669418 systemd-logind[1508]: Removed session 4.
Sep 9 04:52:55.725702 sshd[1687]: Accepted publickey for core from 10.0.0.1 port 36644 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:52:55.726887 sshd-session[1687]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:55.731023 systemd-logind[1508]: New session 5 of user core.
Sep 9 04:52:55.742456 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 9 04:52:55.798648 sudo[1691]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 9 04:52:55.798918 sudo[1691]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:52:55.811159 sudo[1691]: pam_unix(sudo:session): session closed for user root
Sep 9 04:52:55.812496 sshd[1690]: Connection closed by 10.0.0.1 port 36644
Sep 9 04:52:55.812975 sshd-session[1687]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:55.829306 systemd[1]: sshd@4-10.0.0.40:22-10.0.0.1:36644.service: Deactivated successfully.
Sep 9 04:52:55.831575 systemd[1]: session-5.scope: Deactivated successfully.
Sep 9 04:52:55.832250 systemd-logind[1508]: Session 5 logged out. Waiting for processes to exit.
Sep 9 04:52:55.834430 systemd[1]: Started sshd@5-10.0.0.40:22-10.0.0.1:36652.service - OpenSSH per-connection server daemon (10.0.0.1:36652).
Sep 9 04:52:55.834973 systemd-logind[1508]: Removed session 5.
Sep 9 04:52:55.890109 sshd[1697]: Accepted publickey for core from 10.0.0.1 port 36652 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:52:55.891290 sshd-session[1697]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:55.895747 systemd-logind[1508]: New session 6 of user core.
Sep 9 04:52:55.902488 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 9 04:52:55.952830 sudo[1702]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 9 04:52:55.953385 sudo[1702]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:52:56.024355 sudo[1702]: pam_unix(sudo:session): session closed for user root
Sep 9 04:52:56.029478 sudo[1701]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 9 04:52:56.029729 sudo[1701]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:52:56.037635 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 04:52:56.075545 augenrules[1724]: No rules
Sep 9 04:52:56.076940 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 04:52:56.077140 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 04:52:56.078459 sudo[1701]: pam_unix(sudo:session): session closed for user root
Sep 9 04:52:56.079932 sshd[1700]: Connection closed by 10.0.0.1 port 36652
Sep 9 04:52:56.080278 sshd-session[1697]: pam_unix(sshd:session): session closed for user core
Sep 9 04:52:56.089179 systemd[1]: sshd@5-10.0.0.40:22-10.0.0.1:36652.service: Deactivated successfully.
Sep 9 04:52:56.090560 systemd[1]: session-6.scope: Deactivated successfully.
Sep 9 04:52:56.091916 systemd-logind[1508]: Session 6 logged out. Waiting for processes to exit.
Sep 9 04:52:56.093742 systemd[1]: Started sshd@6-10.0.0.40:22-10.0.0.1:36654.service - OpenSSH per-connection server daemon (10.0.0.1:36654).
Sep 9 04:52:56.094628 systemd-logind[1508]: Removed session 6.
Sep 9 04:52:56.157288 sshd[1733]: Accepted publickey for core from 10.0.0.1 port 36654 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:52:56.158482 sshd-session[1733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:52:56.162411 systemd-logind[1508]: New session 7 of user core.
Sep 9 04:52:56.170469 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 9 04:52:56.220097 sudo[1737]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 9 04:52:56.220381 sudo[1737]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:52:56.483866 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 9 04:52:56.493666 (dockerd)[1758]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 9 04:52:56.684154 dockerd[1758]: time="2025-09-09T04:52:56.684092617Z" level=info msg="Starting up"
Sep 9 04:52:56.684894 dockerd[1758]: time="2025-09-09T04:52:56.684873680Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 9 04:52:56.694545 dockerd[1758]: time="2025-09-09T04:52:56.694512320Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 9 04:52:56.727769 dockerd[1758]: time="2025-09-09T04:52:56.727716409Z" level=info msg="Loading containers: start."
Sep 9 04:52:56.735344 kernel: Initializing XFRM netlink socket
Sep 9 04:52:56.914429 systemd-networkd[1434]: docker0: Link UP
Sep 9 04:52:56.917581 dockerd[1758]: time="2025-09-09T04:52:56.917535715Z" level=info msg="Loading containers: done."
Sep 9 04:52:56.930625 dockerd[1758]: time="2025-09-09T04:52:56.930580996Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 9 04:52:56.930752 dockerd[1758]: time="2025-09-09T04:52:56.930652598Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 9 04:52:56.930752 dockerd[1758]: time="2025-09-09T04:52:56.930729329Z" level=info msg="Initializing buildkit"
Sep 9 04:52:56.949780 dockerd[1758]: time="2025-09-09T04:52:56.949732644Z" level=info msg="Completed buildkit initialization"
Sep 9 04:52:56.956920 dockerd[1758]: time="2025-09-09T04:52:56.956887438Z" level=info msg="Daemon has completed initialization"
Sep 9 04:52:56.957169 dockerd[1758]: time="2025-09-09T04:52:56.956970411Z" level=info msg="API listen on /run/docker.sock"
Sep 9 04:52:56.957138 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 9 04:52:57.736227 containerd[1526]: time="2025-09-09T04:52:57.736188841Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\""
Sep 9 04:52:58.488592 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2929875262.mount: Deactivated successfully.
Sep 9 04:52:59.690214 containerd[1526]: time="2025-09-09T04:52:59.689339391Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:52:59.700438 containerd[1526]: time="2025-09-09T04:52:59.700403223Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=26328359"
Sep 9 04:52:59.712370 containerd[1526]: time="2025-09-09T04:52:59.712318852Z" level=info msg="ImageCreate event name:\"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:52:59.734517 containerd[1526]: time="2025-09-09T04:52:59.734467949Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:52:59.735675 containerd[1526]: time="2025-09-09T04:52:59.735639000Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"26325157\" in 1.999409922s"
Sep 9 04:52:59.735737 containerd[1526]: time="2025-09-09T04:52:59.735677522Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\""
Sep 9 04:52:59.736347 containerd[1526]: time="2025-09-09T04:52:59.736297706Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\""
Sep 9 04:53:00.799413 containerd[1526]: time="2025-09-09T04:53:00.799366662Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:00.800691 containerd[1526]: time="2025-09-09T04:53:00.800435838Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=22528554"
Sep 9 04:53:00.801331 containerd[1526]: time="2025-09-09T04:53:00.801296479Z" level=info msg="ImageCreate event name:\"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:00.804254 containerd[1526]: time="2025-09-09T04:53:00.804217512Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:00.805660 containerd[1526]: time="2025-09-09T04:53:00.805631495Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"24065666\" in 1.06930809s"
Sep 9 04:53:00.805756 containerd[1526]: time="2025-09-09T04:53:00.805741342Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\""
Sep 9 04:53:00.806253 containerd[1526]: time="2025-09-09T04:53:00.806227523Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\""
Sep 9 04:53:01.841481 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 9 04:53:01.842804 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:53:02.014659 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:53:02.018715 (kubelet)[2046]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:53:02.191405 containerd[1526]: time="2025-09-09T04:53:02.191190090Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:02.192398 containerd[1526]: time="2025-09-09T04:53:02.192355900Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=17483529"
Sep 9 04:53:02.193078 containerd[1526]: time="2025-09-09T04:53:02.193035561Z" level=info msg="ImageCreate event name:\"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:02.196035 containerd[1526]: time="2025-09-09T04:53:02.195995622Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:02.199512 containerd[1526]: time="2025-09-09T04:53:02.199479769Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"19020659\" in 1.393094842s"
Sep 9 04:53:02.199724 containerd[1526]: time="2025-09-09T04:53:02.199627126Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\""
Sep 9 04:53:02.200305 containerd[1526]: time="2025-09-09T04:53:02.200287359Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\""
Sep 9 04:53:02.214265 kubelet[2046]: E0909 04:53:02.214202 2046 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:53:02.217310 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:53:02.217576 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:53:02.217950 systemd[1]: kubelet.service: Consumed 151ms CPU time, 107.9M memory peak.
Sep 9 04:53:03.207088 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount35438635.mount: Deactivated successfully.
Sep 9 04:53:03.581015 containerd[1526]: time="2025-09-09T04:53:03.580928084Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:03.582024 containerd[1526]: time="2025-09-09T04:53:03.581981284Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=27376726"
Sep 9 04:53:03.582837 containerd[1526]: time="2025-09-09T04:53:03.582791784Z" level=info msg="ImageCreate event name:\"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:03.584767 containerd[1526]: time="2025-09-09T04:53:03.584721127Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:03.585387 containerd[1526]: time="2025-09-09T04:53:03.585353413Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"27375743\" in 1.384952721s"
Sep 9 04:53:03.585436 containerd[1526]: time="2025-09-09T04:53:03.585388210Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\""
Sep 9 04:53:03.585797 containerd[1526]: time="2025-09-09T04:53:03.585771094Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 9 04:53:04.128340 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1275212431.mount: Deactivated successfully.
Sep 9 04:53:05.290630 containerd[1526]: time="2025-09-09T04:53:05.290559999Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:05.291572 containerd[1526]: time="2025-09-09T04:53:05.291335499Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951624"
Sep 9 04:53:05.293095 containerd[1526]: time="2025-09-09T04:53:05.293068486Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:05.295597 containerd[1526]: time="2025-09-09T04:53:05.295566072Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:05.297264 containerd[1526]: time="2025-09-09T04:53:05.297236493Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.711434428s"
Sep 9 04:53:05.297385 containerd[1526]: time="2025-09-09T04:53:05.297367413Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 9 04:53:05.298021 containerd[1526]: time="2025-09-09T04:53:05.297894408Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 9 04:53:05.722308 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3401225682.mount: Deactivated successfully.
Sep 9 04:53:05.725883 containerd[1526]: time="2025-09-09T04:53:05.725830566Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 04:53:05.726503 containerd[1526]: time="2025-09-09T04:53:05.726469117Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705"
Sep 9 04:53:05.727204 containerd[1526]: time="2025-09-09T04:53:05.727170393Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 04:53:05.729073 containerd[1526]: time="2025-09-09T04:53:05.729040768Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 04:53:05.729694 containerd[1526]: time="2025-09-09T04:53:05.729665583Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 431.747299ms"
Sep 9 04:53:05.729729 containerd[1526]: time="2025-09-09T04:53:05.729693013Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 9 04:53:05.730186 containerd[1526]: time="2025-09-09T04:53:05.730149018Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 9 04:53:06.200550 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount667629040.mount: Deactivated successfully.
Sep 9 04:53:08.259820 containerd[1526]: time="2025-09-09T04:53:08.259765293Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:08.260904 containerd[1526]: time="2025-09-09T04:53:08.260873414Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67943167"
Sep 9 04:53:08.261687 containerd[1526]: time="2025-09-09T04:53:08.261631165Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:08.394124 containerd[1526]: time="2025-09-09T04:53:08.393468201Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:08.394378 containerd[1526]: time="2025-09-09T04:53:08.394352237Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.664159254s"
Sep 9 04:53:08.394460 containerd[1526]: time="2025-09-09T04:53:08.394445882Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\""
Sep 9 04:53:12.320137 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 9 04:53:12.321504 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:53:12.462239 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:53:12.465805 (kubelet)[2205]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:53:12.497649 kubelet[2205]: E0909 04:53:12.497610 2205 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:53:12.500054 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:53:12.500172 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:53:12.502396 systemd[1]: kubelet.service: Consumed 130ms CPU time, 105.4M memory peak.
Sep 9 04:53:13.316495 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:53:13.316650 systemd[1]: kubelet.service: Consumed 130ms CPU time, 105.4M memory peak.
Sep 9 04:53:13.318471 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:53:13.337313 systemd[1]: Reload requested from client PID 2219 ('systemctl') (unit session-7.scope)...
Sep 9 04:53:13.337332 systemd[1]: Reloading...
Sep 9 04:53:13.408362 zram_generator::config[2261]: No configuration found.
Sep 9 04:53:13.563765 systemd[1]: Reloading finished in 226 ms.
Sep 9 04:53:13.606589 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:53:13.608636 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:53:13.611136 systemd[1]: kubelet.service: Deactivated successfully.
Sep 9 04:53:13.611344 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:53:13.611379 systemd[1]: kubelet.service: Consumed 90ms CPU time, 95.2M memory peak.
Sep 9 04:53:13.613071 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:53:13.748938 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:53:13.752256 (kubelet)[2310]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 04:53:13.786981 kubelet[2310]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:53:13.786981 kubelet[2310]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 9 04:53:13.788353 kubelet[2310]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:53:13.788353 kubelet[2310]: I0909 04:53:13.787363 2310 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 04:53:15.084721 kubelet[2310]: I0909 04:53:15.084670 2310 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 9 04:53:15.084721 kubelet[2310]: I0909 04:53:15.084708 2310 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 04:53:15.085067 kubelet[2310]: I0909 04:53:15.084965 2310 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 9 04:53:15.107502 kubelet[2310]: E0909 04:53:15.107464 2310 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.40:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.40:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:53:15.109504 kubelet[2310]: I0909 04:53:15.109479 2310 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 04:53:15.118343 kubelet[2310]: I0909 04:53:15.115656 2310 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 04:53:15.118771 kubelet[2310]: I0909 04:53:15.118751 2310 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 04:53:15.119504 kubelet[2310]: I0909 04:53:15.119456 2310 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 04:53:15.119810 kubelet[2310]: I0909 04:53:15.119592 2310 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 04:53:15.120004 kubelet[2310]: I0909 04:53:15.119992 2310 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 04:53:15.120056 kubelet[2310]: I0909 04:53:15.120048 2310 container_manager_linux.go:304] "Creating device plugin manager"
Sep 9 04:53:15.120343 kubelet[2310]: I0909 04:53:15.120300 2310 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:53:15.122777 kubelet[2310]: I0909 04:53:15.122752 2310 kubelet.go:446] "Attempting to sync node with API server"
Sep 9 04:53:15.122879 kubelet[2310]: I0909 04:53:15.122865 2310 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 04:53:15.123008 kubelet[2310]: I0909 04:53:15.122993 2310 kubelet.go:352] "Adding apiserver pod source"
Sep 9 04:53:15.123080 kubelet[2310]: I0909 04:53:15.123070 2310 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 04:53:15.125826 kubelet[2310]: I0909 04:53:15.125808 2310 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 04:53:15.126644 kubelet[2310]: I0909 04:53:15.126477 2310 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 04:53:15.126644 kubelet[2310]: W0909 04:53:15.126597 2310 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 04:53:15.127642 kubelet[2310]: W0909 04:53:15.127572 2310 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.40:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.40:6443: connect: connection refused
Sep 9 04:53:15.127722 kubelet[2310]: E0909 04:53:15.127650 2310 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.40:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.40:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:53:15.127776 kubelet[2310]: I0909 04:53:15.127752 2310 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 9 04:53:15.127861 kubelet[2310]: I0909 04:53:15.127804 2310 server.go:1287] "Started kubelet"
Sep 9 04:53:15.127908 kubelet[2310]: W0909 04:53:15.127603 2310 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.40:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.40:6443: connect: connection refused
Sep 9 04:53:15.128699 kubelet[2310]: E0909 04:53:15.127997 2310 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.40:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.40:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:53:15.128699 kubelet[2310]: I0909 04:53:15.128041 2310 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 04:53:15.129038 kubelet[2310]: I0909 04:53:15.129016 2310 server.go:479] "Adding debug handlers to kubelet server"
Sep 9 04:53:15.129804 kubelet[2310]: I0909 04:53:15.129749 2310 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 04:53:15.130036 kubelet[2310]: I0909 04:53:15.130013 2310 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 04:53:15.130961 kubelet[2310]: E0909 04:53:15.130650 2310 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.40:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.40:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18638429e8235d74 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-09 04:53:15.127782772 +0000 UTC m=+1.372542283,LastTimestamp:2025-09-09 04:53:15.127782772 +0000 UTC m=+1.372542283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 9 04:53:15.131670 kubelet[2310]: I0909 04:53:15.131648 2310 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 04:53:15.132373 kubelet[2310]: I0909 04:53:15.132355 2310 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 04:53:15.133108 kubelet[2310]: E0909 04:53:15.133067 2310 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 04:53:15.133341 kubelet[2310]: E0909 04:53:15.133299 2310 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.40:6443: connect: connection refused" interval="200ms"
Sep 9 04:53:15.133443 kubelet[2310]: I0909 04:53:15.133424 2310 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 9 04:53:15.133477 kubelet[2310]: I0909 04:53:15.133406 2310 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 9 04:53:15.133605 kubelet[2310]: W0909 04:53:15.133558 2310 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.40:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.40:6443: connect: connection refused
Sep 9 04:53:15.133641 kubelet[2310]: E0909 04:53:15.133614 2310 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.40:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.40:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:53:15.133641 kubelet[2310]: I0909 04:53:15.133585 2310 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 04:53:15.133883 kubelet[2310]: I0909 04:53:15.133866 2310 factory.go:221] Registration of the systemd container factory successfully
Sep 9 04:53:15.134029 kubelet[2310]: E0909 04:53:15.134013 2310 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 04:53:15.134210 kubelet[2310]: I0909 04:53:15.134155 2310 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 04:53:15.135019 kubelet[2310]: I0909 04:53:15.135000 2310 factory.go:221] Registration of the containerd container factory successfully
Sep 9 04:53:15.146929 kubelet[2310]: I0909 04:53:15.146879 2310 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 04:53:15.148058 kubelet[2310]: I0909 04:53:15.148037 2310 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 04:53:15.148120 kubelet[2310]: I0909 04:53:15.148064 2310 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 9 04:53:15.148120 kubelet[2310]: I0909 04:53:15.148092 2310 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 9 04:53:15.148120 kubelet[2310]: I0909 04:53:15.148101 2310 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 9 04:53:15.148195 kubelet[2310]: E0909 04:53:15.148146 2310 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 04:53:15.148841 kubelet[2310]: W0909 04:53:15.148522 2310 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.40:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.40:6443: connect: connection refused
Sep 9 04:53:15.148841 kubelet[2310]: E0909 04:53:15.148558 2310 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.40:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.40:6443: connect: connection refused" logger="UnhandledError"
Sep 9 04:53:15.148841 kubelet[2310]: I0909 04:53:15.148628 2310 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 9 04:53:15.148841 kubelet[2310]: I0909 04:53:15.148637 2310 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 9 04:53:15.148841 kubelet[2310]: I0909 04:53:15.148652 2310 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:53:15.233503 kubelet[2310]: E0909 04:53:15.233465 2310 kubelet_node_status.go:466] "Error getting the current node
from lister" err="node \"localhost\" not found" Sep 9 04:53:15.248783 kubelet[2310]: E0909 04:53:15.248749 2310 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 9 04:53:15.300878 kubelet[2310]: I0909 04:53:15.300848 2310 policy_none.go:49] "None policy: Start" Sep 9 04:53:15.301146 kubelet[2310]: I0909 04:53:15.301017 2310 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 04:53:15.301146 kubelet[2310]: I0909 04:53:15.301038 2310 state_mem.go:35] "Initializing new in-memory state store" Sep 9 04:53:15.306845 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 9 04:53:15.322948 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 9 04:53:15.325910 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 9 04:53:15.333998 kubelet[2310]: E0909 04:53:15.333945 2310 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 04:53:15.334315 kubelet[2310]: E0909 04:53:15.334287 2310 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.40:6443: connect: connection refused" interval="400ms" Sep 9 04:53:15.348394 kubelet[2310]: I0909 04:53:15.348298 2310 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 04:53:15.348793 kubelet[2310]: I0909 04:53:15.348518 2310 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 04:53:15.348793 kubelet[2310]: I0909 04:53:15.348531 2310 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 04:53:15.348793 kubelet[2310]: I0909 04:53:15.348731 2310 plugin_manager.go:118] 
"Starting Kubelet Plugin Manager" Sep 9 04:53:15.350957 kubelet[2310]: E0909 04:53:15.350932 2310 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 9 04:53:15.351035 kubelet[2310]: E0909 04:53:15.350971 2310 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 9 04:53:15.450675 kubelet[2310]: I0909 04:53:15.450310 2310 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 04:53:15.451509 kubelet[2310]: E0909 04:53:15.451472 2310 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.40:6443/api/v1/nodes\": dial tcp 10.0.0.40:6443: connect: connection refused" node="localhost" Sep 9 04:53:15.457569 systemd[1]: Created slice kubepods-burstable-pod49965dd4df1a622bb76f8930886ba15a.slice - libcontainer container kubepods-burstable-pod49965dd4df1a622bb76f8930886ba15a.slice. Sep 9 04:53:15.487524 kubelet[2310]: E0909 04:53:15.487500 2310 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 04:53:15.490435 systemd[1]: Created slice kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice - libcontainer container kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice. Sep 9 04:53:15.492573 kubelet[2310]: E0909 04:53:15.492445 2310 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 04:53:15.494766 systemd[1]: Created slice kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice - libcontainer container kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice. 
Sep 9 04:53:15.496396 kubelet[2310]: E0909 04:53:15.496376 2310 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:53:15.536964 kubelet[2310]: I0909 04:53:15.536914 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:53:15.536964 kubelet[2310]: I0909 04:53:15.536949 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:53:15.537076 kubelet[2310]: I0909 04:53:15.536969 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost"
Sep 9 04:53:15.537076 kubelet[2310]: I0909 04:53:15.536983 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/49965dd4df1a622bb76f8930886ba15a-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"49965dd4df1a622bb76f8930886ba15a\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:53:15.537076 kubelet[2310]: I0909 04:53:15.536996 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/49965dd4df1a622bb76f8930886ba15a-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"49965dd4df1a622bb76f8930886ba15a\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:53:15.537076 kubelet[2310]: I0909 04:53:15.537018 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:53:15.537076 kubelet[2310]: I0909 04:53:15.537034 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:53:15.537169 kubelet[2310]: I0909 04:53:15.537049 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/49965dd4df1a622bb76f8930886ba15a-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"49965dd4df1a622bb76f8930886ba15a\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:53:15.537169 kubelet[2310]: I0909 04:53:15.537063 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:53:15.652939 kubelet[2310]: I0909 04:53:15.652859 2310 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 9 04:53:15.653296 kubelet[2310]: E0909 04:53:15.653262 2310 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.40:6443/api/v1/nodes\": dial tcp 10.0.0.40:6443: connect: connection refused" node="localhost"
Sep 9 04:53:15.734750 kubelet[2310]: E0909 04:53:15.734700 2310 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.40:6443: connect: connection refused" interval="800ms"
Sep 9 04:53:15.788772 containerd[1526]: time="2025-09-09T04:53:15.788727857Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:49965dd4df1a622bb76f8930886ba15a,Namespace:kube-system,Attempt:0,}"
Sep 9 04:53:15.794257 containerd[1526]: time="2025-09-09T04:53:15.794219735Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,}"
Sep 9 04:53:15.797473 containerd[1526]: time="2025-09-09T04:53:15.797350109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,}"
Sep 9 04:53:15.816580 containerd[1526]: time="2025-09-09T04:53:15.816539556Z" level=info msg="connecting to shim d1d757079e90ca845c1fe38fb6d4fa1bf789755e6abe81148a4d7ca39e4c7467" address="unix:///run/containerd/s/052bc1199f0518b063a1af0eea93f5acb06b032d0d835082ad58a435d423d28b" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:53:15.829213 containerd[1526]: time="2025-09-09T04:53:15.829172918Z" level=info msg="connecting to shim 2731e8222de237f95096d9ca49f80fdeb3965e9d5604325e68c831fbcc6d143a" address="unix:///run/containerd/s/ac3fea1d83b5bd26119f98fe372066e7e963a1496e640b27256afe18b56d428c" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:53:15.833980 containerd[1526]: time="2025-09-09T04:53:15.833943303Z" level=info msg="connecting to shim 4f84697fdf8b0251df9feeff99668e5d68f208dd6f0651687219e8c81539ccd2" address="unix:///run/containerd/s/fb5246c358b34c2c23ad8479e0b645066076f378eccd7beb4c034ada3732da2f" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:53:15.844479 systemd[1]: Started cri-containerd-d1d757079e90ca845c1fe38fb6d4fa1bf789755e6abe81148a4d7ca39e4c7467.scope - libcontainer container d1d757079e90ca845c1fe38fb6d4fa1bf789755e6abe81148a4d7ca39e4c7467.
Sep 9 04:53:15.862464 systemd[1]: Started cri-containerd-4f84697fdf8b0251df9feeff99668e5d68f208dd6f0651687219e8c81539ccd2.scope - libcontainer container 4f84697fdf8b0251df9feeff99668e5d68f208dd6f0651687219e8c81539ccd2.
Sep 9 04:53:15.865885 systemd[1]: Started cri-containerd-2731e8222de237f95096d9ca49f80fdeb3965e9d5604325e68c831fbcc6d143a.scope - libcontainer container 2731e8222de237f95096d9ca49f80fdeb3965e9d5604325e68c831fbcc6d143a.
Sep 9 04:53:15.898686 containerd[1526]: time="2025-09-09T04:53:15.898642255Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:49965dd4df1a622bb76f8930886ba15a,Namespace:kube-system,Attempt:0,} returns sandbox id \"d1d757079e90ca845c1fe38fb6d4fa1bf789755e6abe81148a4d7ca39e4c7467\""
Sep 9 04:53:15.901866 containerd[1526]: time="2025-09-09T04:53:15.901835918Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,} returns sandbox id \"4f84697fdf8b0251df9feeff99668e5d68f208dd6f0651687219e8c81539ccd2\""
Sep 9 04:53:15.901968 containerd[1526]: time="2025-09-09T04:53:15.901937749Z" level=info msg="CreateContainer within sandbox \"d1d757079e90ca845c1fe38fb6d4fa1bf789755e6abe81148a4d7ca39e4c7467\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Sep 9 04:53:15.904373 containerd[1526]: time="2025-09-09T04:53:15.904135892Z" level=info msg="CreateContainer within sandbox \"4f84697fdf8b0251df9feeff99668e5d68f208dd6f0651687219e8c81539ccd2\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Sep 9 04:53:15.914969 containerd[1526]: time="2025-09-09T04:53:15.914930418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,} returns sandbox id \"2731e8222de237f95096d9ca49f80fdeb3965e9d5604325e68c831fbcc6d143a\""
Sep 9 04:53:15.916932 containerd[1526]: time="2025-09-09T04:53:15.916896232Z" level=info msg="CreateContainer within sandbox \"2731e8222de237f95096d9ca49f80fdeb3965e9d5604325e68c831fbcc6d143a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Sep 9 04:53:15.918145 containerd[1526]: time="2025-09-09T04:53:15.917828864Z" level=info msg="Container 3f6282a21514054b1b3c7c5cfb92a6566e37f310fa575a37a8ac3f625e24c63d: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:53:15.918536 containerd[1526]: time="2025-09-09T04:53:15.918513694Z" level=info msg="Container 1bc1234b94cc55ecd27f9757b8c163045f33ebe185cf6342cbaa0b2832ec1923: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:53:15.925297 containerd[1526]: time="2025-09-09T04:53:15.925238579Z" level=info msg="Container db56c2d2081ea4afe4e9323acc5f346b23523485dbcb981305ba89bc13393235: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:53:15.927672 containerd[1526]: time="2025-09-09T04:53:15.927607679Z" level=info msg="CreateContainer within sandbox \"d1d757079e90ca845c1fe38fb6d4fa1bf789755e6abe81148a4d7ca39e4c7467\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"3f6282a21514054b1b3c7c5cfb92a6566e37f310fa575a37a8ac3f625e24c63d\""
Sep 9 04:53:15.928161 containerd[1526]: time="2025-09-09T04:53:15.928131987Z" level=info msg="StartContainer for \"3f6282a21514054b1b3c7c5cfb92a6566e37f310fa575a37a8ac3f625e24c63d\""
Sep 9 04:53:15.929247 containerd[1526]: time="2025-09-09T04:53:15.929184600Z" level=info msg="connecting to shim 3f6282a21514054b1b3c7c5cfb92a6566e37f310fa575a37a8ac3f625e24c63d" address="unix:///run/containerd/s/052bc1199f0518b063a1af0eea93f5acb06b032d0d835082ad58a435d423d28b" protocol=ttrpc version=3
Sep 9 04:53:15.930699 containerd[1526]: time="2025-09-09T04:53:15.930659650Z" level=info msg="CreateContainer within sandbox \"4f84697fdf8b0251df9feeff99668e5d68f208dd6f0651687219e8c81539ccd2\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"1bc1234b94cc55ecd27f9757b8c163045f33ebe185cf6342cbaa0b2832ec1923\""
Sep 9 04:53:15.931161 containerd[1526]: time="2025-09-09T04:53:15.931130704Z" level=info msg="StartContainer for \"1bc1234b94cc55ecd27f9757b8c163045f33ebe185cf6342cbaa0b2832ec1923\""
Sep 9 04:53:15.932097 containerd[1526]: time="2025-09-09T04:53:15.932068892Z" level=info msg="connecting to shim 1bc1234b94cc55ecd27f9757b8c163045f33ebe185cf6342cbaa0b2832ec1923" address="unix:///run/containerd/s/fb5246c358b34c2c23ad8479e0b645066076f378eccd7beb4c034ada3732da2f" protocol=ttrpc version=3
Sep 9 04:53:15.933931 containerd[1526]: time="2025-09-09T04:53:15.933497045Z" level=info msg="CreateContainer within sandbox \"2731e8222de237f95096d9ca49f80fdeb3965e9d5604325e68c831fbcc6d143a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"db56c2d2081ea4afe4e9323acc5f346b23523485dbcb981305ba89bc13393235\""
Sep 9 04:53:15.933997 containerd[1526]: time="2025-09-09T04:53:15.933974815Z" level=info msg="StartContainer for \"db56c2d2081ea4afe4e9323acc5f346b23523485dbcb981305ba89bc13393235\""
Sep 9 04:53:15.935025 containerd[1526]: time="2025-09-09T04:53:15.934982411Z" level=info msg="connecting to shim db56c2d2081ea4afe4e9323acc5f346b23523485dbcb981305ba89bc13393235" address="unix:///run/containerd/s/ac3fea1d83b5bd26119f98fe372066e7e963a1496e640b27256afe18b56d428c" protocol=ttrpc version=3
Sep 9 04:53:15.949455 systemd[1]: Started cri-containerd-3f6282a21514054b1b3c7c5cfb92a6566e37f310fa575a37a8ac3f625e24c63d.scope - libcontainer container 3f6282a21514054b1b3c7c5cfb92a6566e37f310fa575a37a8ac3f625e24c63d.
Sep 9 04:53:15.966465 systemd[1]: Started cri-containerd-1bc1234b94cc55ecd27f9757b8c163045f33ebe185cf6342cbaa0b2832ec1923.scope - libcontainer container 1bc1234b94cc55ecd27f9757b8c163045f33ebe185cf6342cbaa0b2832ec1923.
Sep 9 04:53:15.967494 systemd[1]: Started cri-containerd-db56c2d2081ea4afe4e9323acc5f346b23523485dbcb981305ba89bc13393235.scope - libcontainer container db56c2d2081ea4afe4e9323acc5f346b23523485dbcb981305ba89bc13393235.
Sep 9 04:53:16.011227 containerd[1526]: time="2025-09-09T04:53:16.011150216Z" level=info msg="StartContainer for \"db56c2d2081ea4afe4e9323acc5f346b23523485dbcb981305ba89bc13393235\" returns successfully"
Sep 9 04:53:16.011227 containerd[1526]: time="2025-09-09T04:53:16.011223265Z" level=info msg="StartContainer for \"3f6282a21514054b1b3c7c5cfb92a6566e37f310fa575a37a8ac3f625e24c63d\" returns successfully"
Sep 9 04:53:16.013218 containerd[1526]: time="2025-09-09T04:53:16.013176363Z" level=info msg="StartContainer for \"1bc1234b94cc55ecd27f9757b8c163045f33ebe185cf6342cbaa0b2832ec1923\" returns successfully"
Sep 9 04:53:16.055632 kubelet[2310]: I0909 04:53:16.055597 2310 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 9 04:53:16.056059 kubelet[2310]: E0909 04:53:16.055965 2310 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.40:6443/api/v1/nodes\": dial tcp 10.0.0.40:6443: connect: connection refused" node="localhost"
Sep 9 04:53:16.155780 kubelet[2310]: E0909 04:53:16.155544 2310 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:53:16.159759 kubelet[2310]: E0909 04:53:16.159737 2310 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:53:16.161360 kubelet[2310]: E0909 04:53:16.161342 2310 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:53:16.859162 kubelet[2310]: I0909 04:53:16.859129 2310 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 9 04:53:17.165033 kubelet[2310]: E0909 04:53:17.164931 2310 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:53:17.165833 kubelet[2310]: E0909 04:53:17.165080 2310 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:53:17.386195 kubelet[2310]: E0909 04:53:17.385925 2310 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Sep 9 04:53:17.451630 kubelet[2310]: I0909 04:53:17.450830 2310 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
Sep 9 04:53:17.534001 kubelet[2310]: I0909 04:53:17.533933 2310 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Sep 9 04:53:17.544355 kubelet[2310]: E0909 04:53:17.543531 2310 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost"
Sep 9 04:53:17.544355 kubelet[2310]: I0909 04:53:17.543560 2310 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:53:17.545362 kubelet[2310]: E0909 04:53:17.545335 2310 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:53:17.545362 kubelet[2310]: I0909 04:53:17.545361 2310 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Sep 9 04:53:17.547814 kubelet[2310]: E0909 04:53:17.547440 2310 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost"
Sep 9 04:53:18.126228 kubelet[2310]: I0909 04:53:18.126200 2310 apiserver.go:52] "Watching apiserver"
Sep 9 04:53:18.134496 kubelet[2310]: I0909 04:53:18.134447 2310 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 9 04:53:19.023212 kubelet[2310]: I0909 04:53:19.023182 2310 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Sep 9 04:53:19.190462 systemd[1]: Reload requested from client PID 2587 ('systemctl') (unit session-7.scope)...
Sep 9 04:53:19.190478 systemd[1]: Reloading...
Sep 9 04:53:19.279362 zram_generator::config[2629]: No configuration found.
Sep 9 04:53:19.449250 systemd[1]: Reloading finished in 258 ms.
Sep 9 04:53:19.480732 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:53:19.493196 systemd[1]: kubelet.service: Deactivated successfully.
Sep 9 04:53:19.493450 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:53:19.493505 systemd[1]: kubelet.service: Consumed 1.733s CPU time, 127.7M memory peak.
Sep 9 04:53:19.495089 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:53:19.646447 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:53:19.652231 (kubelet)[2672]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 04:53:19.702428 kubelet[2672]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:53:19.702428 kubelet[2672]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 9 04:53:19.702428 kubelet[2672]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:53:19.702428 kubelet[2672]: I0909 04:53:19.701775 2672 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 04:53:19.709540 kubelet[2672]: I0909 04:53:19.709509 2672 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 9 04:53:19.709540 kubelet[2672]: I0909 04:53:19.709535 2672 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 04:53:19.709818 kubelet[2672]: I0909 04:53:19.709798 2672 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 9 04:53:19.711004 kubelet[2672]: I0909 04:53:19.710975 2672 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 9 04:53:19.713980 kubelet[2672]: I0909 04:53:19.713819 2672 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 04:53:19.717382 kubelet[2672]: I0909 04:53:19.717362 2672 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 04:53:19.720277 kubelet[2672]: I0909 04:53:19.720227 2672 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 04:53:19.720693 kubelet[2672]: I0909 04:53:19.720642 2672 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 04:53:19.720833 kubelet[2672]: I0909 04:53:19.720681 2672 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 04:53:19.720910 kubelet[2672]: I0909 04:53:19.720850 2672 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 04:53:19.720910 kubelet[2672]: I0909 04:53:19.720859 2672 container_manager_linux.go:304] "Creating device plugin manager"
Sep 9 04:53:19.720910 kubelet[2672]: I0909 04:53:19.720899 2672 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:53:19.721017 kubelet[2672]: I0909 04:53:19.721007 2672 kubelet.go:446] "Attempting to sync node with API server"
Sep 9 04:53:19.721052 kubelet[2672]: I0909 04:53:19.721020 2672 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 04:53:19.721052 kubelet[2672]: I0909 04:53:19.721050 2672 kubelet.go:352] "Adding apiserver pod source"
Sep 9 04:53:19.721089 kubelet[2672]: I0909 04:53:19.721059 2672 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 04:53:19.721659 kubelet[2672]: I0909 04:53:19.721631 2672 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 04:53:19.722160 kubelet[2672]: I0909 04:53:19.722122 2672 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 04:53:19.723212 kubelet[2672]: I0909 04:53:19.723194 2672 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 9 04:53:19.723277 kubelet[2672]: I0909 04:53:19.723228 2672 server.go:1287] "Started kubelet"
Sep 9 04:53:19.724103 kubelet[2672]: I0909 04:53:19.723945 2672 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 04:53:19.724426 kubelet[2672]: I0909 04:53:19.724375 2672 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 04:53:19.724483 kubelet[2672]: I0909 04:53:19.724453 2672 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 04:53:19.724833 kubelet[2672]: I0909 04:53:19.724813 2672 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 04:53:19.725363 kubelet[2672]: I0909 04:53:19.725128 2672 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 04:53:19.725424 kubelet[2672]: I0909 04:53:19.725411 2672 server.go:479] "Adding debug handlers to kubelet server"
Sep 9 04:53:19.726765 kubelet[2672]: I0909 04:53:19.726740 2672 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 9 04:53:19.726839 kubelet[2672]: I0909 04:53:19.726830 2672 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 9 04:53:19.727067 kubelet[2672]: E0909 04:53:19.727047 2672 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 04:53:19.727484 kubelet[2672]: I0909 04:53:19.727459 2672 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 04:53:19.729068 kubelet[2672]: E0909 04:53:19.729046 2672 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 04:53:19.729484 kubelet[2672]: I0909 04:53:19.729385 2672 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 04:53:19.731618 kubelet[2672]: I0909 04:53:19.731587 2672 factory.go:221] Registration of the containerd container factory successfully
Sep 9 04:53:19.731618 kubelet[2672]: I0909 04:53:19.731611 2672 factory.go:221] Registration of the systemd container factory successfully
Sep 9 04:53:19.737732 kubelet[2672]: I0909 04:53:19.737692 2672 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 04:53:19.738738 kubelet[2672]: I0909 04:53:19.738709 2672 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 04:53:19.738738 kubelet[2672]: I0909 04:53:19.738738 2672 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 9 04:53:19.738807 kubelet[2672]: I0909 04:53:19.738755 2672 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 9 04:53:19.738807 kubelet[2672]: I0909 04:53:19.738761 2672 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 9 04:53:19.738807 kubelet[2672]: E0909 04:53:19.738801 2672 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 04:53:19.776680 kubelet[2672]: I0909 04:53:19.776656 2672 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 9 04:53:19.776680 kubelet[2672]: I0909 04:53:19.776671 2672 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 9 04:53:19.776680 kubelet[2672]: I0909 04:53:19.776689 2672 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:53:19.776843 kubelet[2672]: I0909 04:53:19.776826 2672 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 9 04:53:19.776866 kubelet[2672]: I0909 04:53:19.776837 2672 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 9 04:53:19.776866 kubelet[2672]: I0909 04:53:19.776853 2672 policy_none.go:49] "None policy: Start"
Sep 9 04:53:19.776866 kubelet[2672]: I0909 04:53:19.776861 2672 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 9 04:53:19.776925 kubelet[2672]: I0909 04:53:19.776870 2672 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 04:53:19.776970 kubelet[2672]: I0909 04:53:19.776954 2672 state_mem.go:75] "Updated machine memory state"
Sep 9 04:53:19.780397 kubelet[2672]: I0909 04:53:19.780374 2672 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 04:53:19.780679 kubelet[2672]: I0909 04:53:19.780517 2672 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 04:53:19.780679 kubelet[2672]: I0909 04:53:19.780529 2672 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 04:53:19.780923 kubelet[2672]: I0909 04:53:19.780752 2672 plugin_manager.go:118] "Starting 
Kubelet Plugin Manager" Sep 9 04:53:19.781791 kubelet[2672]: E0909 04:53:19.781626 2672 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 9 04:53:19.843577 kubelet[2672]: I0909 04:53:19.843543 2672 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 9 04:53:19.843577 kubelet[2672]: I0909 04:53:19.843572 2672 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 9 04:53:19.843577 kubelet[2672]: I0909 04:53:19.843581 2672 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 9 04:53:19.850024 kubelet[2672]: E0909 04:53:19.849999 2672 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 9 04:53:19.882152 kubelet[2672]: I0909 04:53:19.882127 2672 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 04:53:19.889864 kubelet[2672]: I0909 04:53:19.889804 2672 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 9 04:53:19.889943 kubelet[2672]: I0909 04:53:19.889903 2672 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 9 04:53:19.927620 kubelet[2672]: I0909 04:53:19.927582 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/49965dd4df1a622bb76f8930886ba15a-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"49965dd4df1a622bb76f8930886ba15a\") " pod="kube-system/kube-apiserver-localhost" Sep 9 04:53:19.927620 kubelet[2672]: I0909 04:53:19.927621 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/49965dd4df1a622bb76f8930886ba15a-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"49965dd4df1a622bb76f8930886ba15a\") " pod="kube-system/kube-apiserver-localhost" Sep 9 04:53:20.028182 kubelet[2672]: I0909 04:53:20.028082 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 04:53:20.028182 kubelet[2672]: I0909 04:53:20.028128 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 04:53:20.028182 kubelet[2672]: I0909 04:53:20.028149 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 04:53:20.028182 kubelet[2672]: I0909 04:53:20.028168 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost" Sep 9 04:53:20.028375 kubelet[2672]: I0909 04:53:20.028192 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" 
(UniqueName: \"kubernetes.io/host-path/49965dd4df1a622bb76f8930886ba15a-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"49965dd4df1a622bb76f8930886ba15a\") " pod="kube-system/kube-apiserver-localhost" Sep 9 04:53:20.028375 kubelet[2672]: I0909 04:53:20.028210 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 04:53:20.028375 kubelet[2672]: I0909 04:53:20.028224 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 04:53:20.722696 kubelet[2672]: I0909 04:53:20.721492 2672 apiserver.go:52] "Watching apiserver" Sep 9 04:53:20.727864 kubelet[2672]: I0909 04:53:20.727836 2672 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 04:53:20.770113 kubelet[2672]: I0909 04:53:20.770073 2672 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 9 04:53:20.775564 kubelet[2672]: E0909 04:53:20.775537 2672 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 9 04:53:20.813457 kubelet[2672]: I0909 04:53:20.813398 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.813381106 podStartE2EDuration="1.813381106s" podCreationTimestamp="2025-09-09 04:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:53:20.806468972 +0000 UTC m=+1.148240098" watchObservedRunningTime="2025-09-09 04:53:20.813381106 +0000 UTC m=+1.155152233" Sep 9 04:53:20.813857 kubelet[2672]: I0909 04:53:20.813518 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.813503276 podStartE2EDuration="1.813503276s" podCreationTimestamp="2025-09-09 04:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:53:20.813504636 +0000 UTC m=+1.155275763" watchObservedRunningTime="2025-09-09 04:53:20.813503276 +0000 UTC m=+1.155274363" Sep 9 04:53:20.821438 kubelet[2672]: I0909 04:53:20.821282 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.8212687600000002 podStartE2EDuration="1.82126876s" podCreationTimestamp="2025-09-09 04:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:53:20.820849984 +0000 UTC m=+1.162621110" watchObservedRunningTime="2025-09-09 04:53:20.82126876 +0000 UTC m=+1.163039887" Sep 9 04:53:24.602775 kubelet[2672]: I0909 04:53:24.602738 2672 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 04:53:24.603655 containerd[1526]: time="2025-09-09T04:53:24.603560239Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Sep 9 04:53:24.604105 kubelet[2672]: I0909 04:53:24.603775 2672 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 04:53:25.584190 systemd[1]: Created slice kubepods-besteffort-pod9c2ee0cd_bf08_4213_8520_ae1709fcb55c.slice - libcontainer container kubepods-besteffort-pod9c2ee0cd_bf08_4213_8520_ae1709fcb55c.slice. Sep 9 04:53:25.669558 kubelet[2672]: I0909 04:53:25.669517 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9c2ee0cd-bf08-4213-8520-ae1709fcb55c-xtables-lock\") pod \"kube-proxy-zql7r\" (UID: \"9c2ee0cd-bf08-4213-8520-ae1709fcb55c\") " pod="kube-system/kube-proxy-zql7r" Sep 9 04:53:25.669558 kubelet[2672]: I0909 04:53:25.669561 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/9c2ee0cd-bf08-4213-8520-ae1709fcb55c-kube-proxy\") pod \"kube-proxy-zql7r\" (UID: \"9c2ee0cd-bf08-4213-8520-ae1709fcb55c\") " pod="kube-system/kube-proxy-zql7r" Sep 9 04:53:25.670022 kubelet[2672]: I0909 04:53:25.669576 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9c2ee0cd-bf08-4213-8520-ae1709fcb55c-lib-modules\") pod \"kube-proxy-zql7r\" (UID: \"9c2ee0cd-bf08-4213-8520-ae1709fcb55c\") " pod="kube-system/kube-proxy-zql7r" Sep 9 04:53:25.670022 kubelet[2672]: I0909 04:53:25.669593 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khklr\" (UniqueName: \"kubernetes.io/projected/9c2ee0cd-bf08-4213-8520-ae1709fcb55c-kube-api-access-khklr\") pod \"kube-proxy-zql7r\" (UID: \"9c2ee0cd-bf08-4213-8520-ae1709fcb55c\") " pod="kube-system/kube-proxy-zql7r" Sep 9 04:53:25.694881 systemd[1]: Created slice kubepods-besteffort-podc3528d1a_39b0_423e_8341_63382c3e5611.slice - 
libcontainer container kubepods-besteffort-podc3528d1a_39b0_423e_8341_63382c3e5611.slice. Sep 9 04:53:25.770269 kubelet[2672]: I0909 04:53:25.770222 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c3528d1a-39b0-423e-8341-63382c3e5611-var-lib-calico\") pod \"tigera-operator-755d956888-fd22g\" (UID: \"c3528d1a-39b0-423e-8341-63382c3e5611\") " pod="tigera-operator/tigera-operator-755d956888-fd22g" Sep 9 04:53:25.770269 kubelet[2672]: I0909 04:53:25.770262 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-754xl\" (UniqueName: \"kubernetes.io/projected/c3528d1a-39b0-423e-8341-63382c3e5611-kube-api-access-754xl\") pod \"tigera-operator-755d956888-fd22g\" (UID: \"c3528d1a-39b0-423e-8341-63382c3e5611\") " pod="tigera-operator/tigera-operator-755d956888-fd22g" Sep 9 04:53:25.895645 containerd[1526]: time="2025-09-09T04:53:25.895544215Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zql7r,Uid:9c2ee0cd-bf08-4213-8520-ae1709fcb55c,Namespace:kube-system,Attempt:0,}" Sep 9 04:53:25.914805 containerd[1526]: time="2025-09-09T04:53:25.914766534Z" level=info msg="connecting to shim 02c40bddfe17edc30c1a74e1eb10f84f1a4cb8a830735423431c9754e41fd47e" address="unix:///run/containerd/s/3593b766c9be0dece85c2cebef5c673d54d7c46f9f121ab5c435b72489824180" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:53:25.934489 systemd[1]: Started cri-containerd-02c40bddfe17edc30c1a74e1eb10f84f1a4cb8a830735423431c9754e41fd47e.scope - libcontainer container 02c40bddfe17edc30c1a74e1eb10f84f1a4cb8a830735423431c9754e41fd47e. 
Sep 9 04:53:25.954312 containerd[1526]: time="2025-09-09T04:53:25.954275975Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zql7r,Uid:9c2ee0cd-bf08-4213-8520-ae1709fcb55c,Namespace:kube-system,Attempt:0,} returns sandbox id \"02c40bddfe17edc30c1a74e1eb10f84f1a4cb8a830735423431c9754e41fd47e\"" Sep 9 04:53:25.957235 containerd[1526]: time="2025-09-09T04:53:25.957206551Z" level=info msg="CreateContainer within sandbox \"02c40bddfe17edc30c1a74e1eb10f84f1a4cb8a830735423431c9754e41fd47e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 9 04:53:25.965910 containerd[1526]: time="2025-09-09T04:53:25.965865684Z" level=info msg="Container 738d8b51a9d44c5c8598d599f219f23cd9ed8a73f10a97db0afcd099d6217ff6: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:53:25.972282 containerd[1526]: time="2025-09-09T04:53:25.972230579Z" level=info msg="CreateContainer within sandbox \"02c40bddfe17edc30c1a74e1eb10f84f1a4cb8a830735423431c9754e41fd47e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"738d8b51a9d44c5c8598d599f219f23cd9ed8a73f10a97db0afcd099d6217ff6\"" Sep 9 04:53:25.972718 containerd[1526]: time="2025-09-09T04:53:25.972690482Z" level=info msg="StartContainer for \"738d8b51a9d44c5c8598d599f219f23cd9ed8a73f10a97db0afcd099d6217ff6\"" Sep 9 04:53:25.975496 containerd[1526]: time="2025-09-09T04:53:25.975461904Z" level=info msg="connecting to shim 738d8b51a9d44c5c8598d599f219f23cd9ed8a73f10a97db0afcd099d6217ff6" address="unix:///run/containerd/s/3593b766c9be0dece85c2cebef5c673d54d7c46f9f121ab5c435b72489824180" protocol=ttrpc version=3 Sep 9 04:53:25.995494 systemd[1]: Started cri-containerd-738d8b51a9d44c5c8598d599f219f23cd9ed8a73f10a97db0afcd099d6217ff6.scope - libcontainer container 738d8b51a9d44c5c8598d599f219f23cd9ed8a73f10a97db0afcd099d6217ff6. 
Sep 9 04:53:25.997698 containerd[1526]: time="2025-09-09T04:53:25.997644839Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-fd22g,Uid:c3528d1a-39b0-423e-8341-63382c3e5611,Namespace:tigera-operator,Attempt:0,}" Sep 9 04:53:26.016139 containerd[1526]: time="2025-09-09T04:53:26.016105815Z" level=info msg="connecting to shim 2631ed4f18ab192c2a2acd15200e48d0c43345e80097d013c81ae776773763e5" address="unix:///run/containerd/s/a28f03f5232432dec5ec06474c8069cbe3dd19867923fa105b7eb202ca04bf25" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:53:26.035294 containerd[1526]: time="2025-09-09T04:53:26.035259773Z" level=info msg="StartContainer for \"738d8b51a9d44c5c8598d599f219f23cd9ed8a73f10a97db0afcd099d6217ff6\" returns successfully" Sep 9 04:53:26.049549 systemd[1]: Started cri-containerd-2631ed4f18ab192c2a2acd15200e48d0c43345e80097d013c81ae776773763e5.scope - libcontainer container 2631ed4f18ab192c2a2acd15200e48d0c43345e80097d013c81ae776773763e5. Sep 9 04:53:26.086990 containerd[1526]: time="2025-09-09T04:53:26.086913283Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-fd22g,Uid:c3528d1a-39b0-423e-8341-63382c3e5611,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"2631ed4f18ab192c2a2acd15200e48d0c43345e80097d013c81ae776773763e5\"" Sep 9 04:53:26.089035 containerd[1526]: time="2025-09-09T04:53:26.088840298Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 04:53:27.288143 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount936844129.mount: Deactivated successfully. 
Sep 9 04:53:27.623343 kubelet[2672]: I0909 04:53:27.623228 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-zql7r" podStartSLOduration=2.623210376 podStartE2EDuration="2.623210376s" podCreationTimestamp="2025-09-09 04:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:53:26.797030653 +0000 UTC m=+7.138801780" watchObservedRunningTime="2025-09-09 04:53:27.623210376 +0000 UTC m=+7.964981503" Sep 9 04:53:27.703449 containerd[1526]: time="2025-09-09T04:53:27.703392234Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:27.704600 containerd[1526]: time="2025-09-09T04:53:27.704564637Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 9 04:53:27.705246 containerd[1526]: time="2025-09-09T04:53:27.705198137Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:27.707466 containerd[1526]: time="2025-09-09T04:53:27.707437506Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:27.709883 containerd[1526]: time="2025-09-09T04:53:27.709352525Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 1.620480988s" Sep 9 04:53:27.709883 containerd[1526]: time="2025-09-09T04:53:27.709384404Z" level=info 
msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 9 04:53:27.712917 containerd[1526]: time="2025-09-09T04:53:27.712870494Z" level=info msg="CreateContainer within sandbox \"2631ed4f18ab192c2a2acd15200e48d0c43345e80097d013c81ae776773763e5\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 9 04:53:27.718529 containerd[1526]: time="2025-09-09T04:53:27.718486396Z" level=info msg="Container f88dbc07470fa4c66cf5464228795e9b3461ef47b6aa7a048a26dff13f648d6b: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:53:27.725569 containerd[1526]: time="2025-09-09T04:53:27.725525972Z" level=info msg="CreateContainer within sandbox \"2631ed4f18ab192c2a2acd15200e48d0c43345e80097d013c81ae776773763e5\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"f88dbc07470fa4c66cf5464228795e9b3461ef47b6aa7a048a26dff13f648d6b\"" Sep 9 04:53:27.727217 containerd[1526]: time="2025-09-09T04:53:27.727183200Z" level=info msg="StartContainer for \"f88dbc07470fa4c66cf5464228795e9b3461ef47b6aa7a048a26dff13f648d6b\"" Sep 9 04:53:27.728226 containerd[1526]: time="2025-09-09T04:53:27.728196808Z" level=info msg="connecting to shim f88dbc07470fa4c66cf5464228795e9b3461ef47b6aa7a048a26dff13f648d6b" address="unix:///run/containerd/s/a28f03f5232432dec5ec06474c8069cbe3dd19867923fa105b7eb202ca04bf25" protocol=ttrpc version=3 Sep 9 04:53:27.749500 systemd[1]: Started cri-containerd-f88dbc07470fa4c66cf5464228795e9b3461ef47b6aa7a048a26dff13f648d6b.scope - libcontainer container f88dbc07470fa4c66cf5464228795e9b3461ef47b6aa7a048a26dff13f648d6b. 
Sep 9 04:53:27.773357 containerd[1526]: time="2025-09-09T04:53:27.773036146Z" level=info msg="StartContainer for \"f88dbc07470fa4c66cf5464228795e9b3461ef47b6aa7a048a26dff13f648d6b\" returns successfully" Sep 9 04:53:27.799106 kubelet[2672]: I0909 04:53:27.798769 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-fd22g" podStartSLOduration=1.175196202 podStartE2EDuration="2.798579777s" podCreationTimestamp="2025-09-09 04:53:25 +0000 UTC" firstStartedPulling="2025-09-09 04:53:26.088270117 +0000 UTC m=+6.430041244" lastFinishedPulling="2025-09-09 04:53:27.711653692 +0000 UTC m=+8.053424819" observedRunningTime="2025-09-09 04:53:27.798356784 +0000 UTC m=+8.140127911" watchObservedRunningTime="2025-09-09 04:53:27.798579777 +0000 UTC m=+8.140350904" Sep 9 04:53:32.986204 sudo[1737]: pam_unix(sudo:session): session closed for user root Sep 9 04:53:32.987751 sshd[1736]: Connection closed by 10.0.0.1 port 36654 Sep 9 04:53:32.988200 sshd-session[1733]: pam_unix(sshd:session): session closed for user core Sep 9 04:53:32.993690 systemd[1]: sshd@6-10.0.0.40:22-10.0.0.1:36654.service: Deactivated successfully. Sep 9 04:53:32.995405 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 04:53:32.995570 systemd[1]: session-7.scope: Consumed 6.525s CPU time, 224.7M memory peak. Sep 9 04:53:32.996750 systemd-logind[1508]: Session 7 logged out. Waiting for processes to exit. Sep 9 04:53:32.998300 systemd-logind[1508]: Removed session 7. Sep 9 04:53:34.558427 update_engine[1515]: I20250909 04:53:34.558358 1515 update_attempter.cc:509] Updating boot flags... Sep 9 04:53:37.868896 systemd[1]: Created slice kubepods-besteffort-pod28e3e150_3288_46fc_8db2_8a33d0c1ef24.slice - libcontainer container kubepods-besteffort-pod28e3e150_3288_46fc_8db2_8a33d0c1ef24.slice. 
Sep 9 04:53:37.955585 kubelet[2672]: I0909 04:53:37.955545 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/28e3e150-3288-46fc-8db2-8a33d0c1ef24-typha-certs\") pod \"calico-typha-8dff5c8f4-ndc5j\" (UID: \"28e3e150-3288-46fc-8db2-8a33d0c1ef24\") " pod="calico-system/calico-typha-8dff5c8f4-ndc5j" Sep 9 04:53:37.955940 kubelet[2672]: I0909 04:53:37.955624 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvzc7\" (UniqueName: \"kubernetes.io/projected/28e3e150-3288-46fc-8db2-8a33d0c1ef24-kube-api-access-bvzc7\") pod \"calico-typha-8dff5c8f4-ndc5j\" (UID: \"28e3e150-3288-46fc-8db2-8a33d0c1ef24\") " pod="calico-system/calico-typha-8dff5c8f4-ndc5j" Sep 9 04:53:37.955940 kubelet[2672]: I0909 04:53:37.955651 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28e3e150-3288-46fc-8db2-8a33d0c1ef24-tigera-ca-bundle\") pod \"calico-typha-8dff5c8f4-ndc5j\" (UID: \"28e3e150-3288-46fc-8db2-8a33d0c1ef24\") " pod="calico-system/calico-typha-8dff5c8f4-ndc5j" Sep 9 04:53:38.182801 containerd[1526]: time="2025-09-09T04:53:38.182690478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8dff5c8f4-ndc5j,Uid:28e3e150-3288-46fc-8db2-8a33d0c1ef24,Namespace:calico-system,Attempt:0,}" Sep 9 04:53:38.220917 containerd[1526]: time="2025-09-09T04:53:38.220823553Z" level=info msg="connecting to shim 68db4fa5020298938ef735ac96a2358e13d191d5e1cea1187815432367db8d87" address="unix:///run/containerd/s/ac9cd9d1e7fd43fd69b798216c8105480d619686aba68f79b3136d4c8cc30e09" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:53:38.243915 systemd[1]: Created slice kubepods-besteffort-pod07d9b11a_ef45_4828_91cf_618e30ad648c.slice - libcontainer container 
kubepods-besteffort-pod07d9b11a_ef45_4828_91cf_618e30ad648c.slice. Sep 9 04:53:38.259461 kubelet[2672]: I0909 04:53:38.259423 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/07d9b11a-ef45-4828-91cf-618e30ad648c-node-certs\") pod \"calico-node-65gnj\" (UID: \"07d9b11a-ef45-4828-91cf-618e30ad648c\") " pod="calico-system/calico-node-65gnj" Sep 9 04:53:38.259461 kubelet[2672]: I0909 04:53:38.259463 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/07d9b11a-ef45-4828-91cf-618e30ad648c-policysync\") pod \"calico-node-65gnj\" (UID: \"07d9b11a-ef45-4828-91cf-618e30ad648c\") " pod="calico-system/calico-node-65gnj" Sep 9 04:53:38.259598 kubelet[2672]: I0909 04:53:38.259480 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/07d9b11a-ef45-4828-91cf-618e30ad648c-xtables-lock\") pod \"calico-node-65gnj\" (UID: \"07d9b11a-ef45-4828-91cf-618e30ad648c\") " pod="calico-system/calico-node-65gnj" Sep 9 04:53:38.259598 kubelet[2672]: I0909 04:53:38.259539 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/07d9b11a-ef45-4828-91cf-618e30ad648c-flexvol-driver-host\") pod \"calico-node-65gnj\" (UID: \"07d9b11a-ef45-4828-91cf-618e30ad648c\") " pod="calico-system/calico-node-65gnj" Sep 9 04:53:38.259598 kubelet[2672]: I0909 04:53:38.259559 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/07d9b11a-ef45-4828-91cf-618e30ad648c-var-lib-calico\") pod \"calico-node-65gnj\" (UID: \"07d9b11a-ef45-4828-91cf-618e30ad648c\") " pod="calico-system/calico-node-65gnj" Sep 9 
04:53:38.259598 kubelet[2672]: I0909 04:53:38.259587 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/07d9b11a-ef45-4828-91cf-618e30ad648c-var-run-calico\") pod \"calico-node-65gnj\" (UID: \"07d9b11a-ef45-4828-91cf-618e30ad648c\") " pod="calico-system/calico-node-65gnj" Sep 9 04:53:38.259683 kubelet[2672]: I0909 04:53:38.259622 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07d9b11a-ef45-4828-91cf-618e30ad648c-lib-modules\") pod \"calico-node-65gnj\" (UID: \"07d9b11a-ef45-4828-91cf-618e30ad648c\") " pod="calico-system/calico-node-65gnj" Sep 9 04:53:38.259683 kubelet[2672]: I0909 04:53:38.259640 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/07d9b11a-ef45-4828-91cf-618e30ad648c-cni-net-dir\") pod \"calico-node-65gnj\" (UID: \"07d9b11a-ef45-4828-91cf-618e30ad648c\") " pod="calico-system/calico-node-65gnj" Sep 9 04:53:38.259723 kubelet[2672]: I0909 04:53:38.259685 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/07d9b11a-ef45-4828-91cf-618e30ad648c-cni-log-dir\") pod \"calico-node-65gnj\" (UID: \"07d9b11a-ef45-4828-91cf-618e30ad648c\") " pod="calico-system/calico-node-65gnj" Sep 9 04:53:38.259723 kubelet[2672]: I0909 04:53:38.259700 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07d9b11a-ef45-4828-91cf-618e30ad648c-tigera-ca-bundle\") pod \"calico-node-65gnj\" (UID: \"07d9b11a-ef45-4828-91cf-618e30ad648c\") " pod="calico-system/calico-node-65gnj" Sep 9 04:53:38.259764 kubelet[2672]: I0909 04:53:38.259716 2672 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m28wg\" (UniqueName: \"kubernetes.io/projected/07d9b11a-ef45-4828-91cf-618e30ad648c-kube-api-access-m28wg\") pod \"calico-node-65gnj\" (UID: \"07d9b11a-ef45-4828-91cf-618e30ad648c\") " pod="calico-system/calico-node-65gnj"
Sep 9 04:53:38.259784 kubelet[2672]: I0909 04:53:38.259763 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/07d9b11a-ef45-4828-91cf-618e30ad648c-cni-bin-dir\") pod \"calico-node-65gnj\" (UID: \"07d9b11a-ef45-4828-91cf-618e30ad648c\") " pod="calico-system/calico-node-65gnj"
Sep 9 04:53:38.275512 systemd[1]: Started cri-containerd-68db4fa5020298938ef735ac96a2358e13d191d5e1cea1187815432367db8d87.scope - libcontainer container 68db4fa5020298938ef735ac96a2358e13d191d5e1cea1187815432367db8d87.
Sep 9 04:53:38.339082 containerd[1526]: time="2025-09-09T04:53:38.339046150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8dff5c8f4-ndc5j,Uid:28e3e150-3288-46fc-8db2-8a33d0c1ef24,Namespace:calico-system,Attempt:0,} returns sandbox id \"68db4fa5020298938ef735ac96a2358e13d191d5e1cea1187815432367db8d87\""
Sep 9 04:53:38.356710 containerd[1526]: time="2025-09-09T04:53:38.356671433Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 9 04:53:38.365265 kubelet[2672]: E0909 04:53:38.365214 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.366403 kubelet[2672]: W0909 04:53:38.366377 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.366447 kubelet[2672]: E0909 04:53:38.366420 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.366690 kubelet[2672]: E0909 04:53:38.366663 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.366690 kubelet[2672]: W0909 04:53:38.366679 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.366752 kubelet[2672]: E0909 04:53:38.366692 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.366851 kubelet[2672]: E0909 04:53:38.366838 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.366877 kubelet[2672]: W0909 04:53:38.366850 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.366877 kubelet[2672]: E0909 04:53:38.366868 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.367042 kubelet[2672]: E0909 04:53:38.367023 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.367042 kubelet[2672]: W0909 04:53:38.367035 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.367042 kubelet[2672]: E0909 04:53:38.367044 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.367760 kubelet[2672]: E0909 04:53:38.367727 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.367760 kubelet[2672]: W0909 04:53:38.367750 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.367960 kubelet[2672]: E0909 04:53:38.367766 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.369483 kubelet[2672]: E0909 04:53:38.369461 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.369483 kubelet[2672]: W0909 04:53:38.369479 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.369773 kubelet[2672]: E0909 04:53:38.369754 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.370618 kubelet[2672]: E0909 04:53:38.369990 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.370618 kubelet[2672]: W0909 04:53:38.370017 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.371540 kubelet[2672]: E0909 04:53:38.371483 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.371540 kubelet[2672]: W0909 04:53:38.371503 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.372061 kubelet[2672]: E0909 04:53:38.371999 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.372061 kubelet[2672]: W0909 04:53:38.372011 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.372061 kubelet[2672]: E0909 04:53:38.372024 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.373656 kubelet[2672]: E0909 04:53:38.373537 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.373727 kubelet[2672]: E0909 04:53:38.373679 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.373727 kubelet[2672]: W0909 04:53:38.373691 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.373727 kubelet[2672]: E0909 04:53:38.373712 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.374972 kubelet[2672]: E0909 04:53:38.374121 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.374972 kubelet[2672]: E0909 04:53:38.374401 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.374972 kubelet[2672]: W0909 04:53:38.374414 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.374972 kubelet[2672]: E0909 04:53:38.374427 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.374972 kubelet[2672]: E0909 04:53:38.374629 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.374972 kubelet[2672]: W0909 04:53:38.374639 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.374972 kubelet[2672]: E0909 04:53:38.374648 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.375445 kubelet[2672]: E0909 04:53:38.375424 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.375445 kubelet[2672]: W0909 04:53:38.375441 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.375504 kubelet[2672]: E0909 04:53:38.375454 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.376843 kubelet[2672]: E0909 04:53:38.376826 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.376936 kubelet[2672]: W0909 04:53:38.376917 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.376990 kubelet[2672]: E0909 04:53:38.376980 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.389134 kubelet[2672]: E0909 04:53:38.389114 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.389226 kubelet[2672]: W0909 04:53:38.389212 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.389307 kubelet[2672]: E0909 04:53:38.389294 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.525497 kubelet[2672]: E0909 04:53:38.524822 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dgbgk" podUID="552937ce-0bd6-4992-a22b-ff41c9705435"
Sep 9 04:53:38.549030 containerd[1526]: time="2025-09-09T04:53:38.548944661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-65gnj,Uid:07d9b11a-ef45-4828-91cf-618e30ad648c,Namespace:calico-system,Attempt:0,}"
Sep 9 04:53:38.551384 kubelet[2672]: E0909 04:53:38.551361 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.551488 kubelet[2672]: W0909 04:53:38.551476 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.551557 kubelet[2672]: E0909 04:53:38.551534 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.551918 kubelet[2672]: E0909 04:53:38.551871 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.552041 kubelet[2672]: W0909 04:53:38.552003 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.552110 kubelet[2672]: E0909 04:53:38.552099 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.553595 kubelet[2672]: E0909 04:53:38.553559 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.553595 kubelet[2672]: W0909 04:53:38.553575 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.553595 kubelet[2672]: E0909 04:53:38.553588 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.553892 kubelet[2672]: E0909 04:53:38.553875 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.553892 kubelet[2672]: W0909 04:53:38.553889 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.553974 kubelet[2672]: E0909 04:53:38.553900 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.554283 kubelet[2672]: E0909 04:53:38.554259 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.554283 kubelet[2672]: W0909 04:53:38.554274 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.554283 kubelet[2672]: E0909 04:53:38.554285 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.554484 kubelet[2672]: E0909 04:53:38.554473 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.554484 kubelet[2672]: W0909 04:53:38.554483 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.554556 kubelet[2672]: E0909 04:53:38.554492 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.554629 kubelet[2672]: E0909 04:53:38.554619 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.554659 kubelet[2672]: W0909 04:53:38.554628 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.554659 kubelet[2672]: E0909 04:53:38.554644 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.554946 kubelet[2672]: E0909 04:53:38.554759 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.554946 kubelet[2672]: W0909 04:53:38.554765 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.554946 kubelet[2672]: E0909 04:53:38.554772 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.554946 kubelet[2672]: E0909 04:53:38.554943 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.555059 kubelet[2672]: W0909 04:53:38.554950 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.555059 kubelet[2672]: E0909 04:53:38.554963 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.555187 kubelet[2672]: E0909 04:53:38.555074 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.555187 kubelet[2672]: W0909 04:53:38.555080 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.555187 kubelet[2672]: E0909 04:53:38.555088 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.555458 kubelet[2672]: E0909 04:53:38.555206 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.555458 kubelet[2672]: W0909 04:53:38.555213 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.555458 kubelet[2672]: E0909 04:53:38.555220 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.555458 kubelet[2672]: E0909 04:53:38.555368 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.555458 kubelet[2672]: W0909 04:53:38.555375 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.555458 kubelet[2672]: E0909 04:53:38.555383 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.555809 kubelet[2672]: E0909 04:53:38.555520 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.555809 kubelet[2672]: W0909 04:53:38.555527 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.555809 kubelet[2672]: E0909 04:53:38.555535 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.555809 kubelet[2672]: E0909 04:53:38.555651 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.555809 kubelet[2672]: W0909 04:53:38.555658 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.555809 kubelet[2672]: E0909 04:53:38.555665 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.555809 kubelet[2672]: E0909 04:53:38.555782 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.555809 kubelet[2672]: W0909 04:53:38.555787 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.555809 kubelet[2672]: E0909 04:53:38.555796 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.556256 kubelet[2672]: E0909 04:53:38.555989 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.556256 kubelet[2672]: W0909 04:53:38.555997 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.556256 kubelet[2672]: E0909 04:53:38.556005 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.556256 kubelet[2672]: E0909 04:53:38.556143 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.556256 kubelet[2672]: W0909 04:53:38.556151 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.556256 kubelet[2672]: E0909 04:53:38.556158 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.556716 kubelet[2672]: E0909 04:53:38.556276 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.556716 kubelet[2672]: W0909 04:53:38.556282 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.556716 kubelet[2672]: E0909 04:53:38.556290 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.556716 kubelet[2672]: E0909 04:53:38.556408 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.556716 kubelet[2672]: W0909 04:53:38.556414 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.556716 kubelet[2672]: E0909 04:53:38.556421 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.556716 kubelet[2672]: E0909 04:53:38.556544 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.556716 kubelet[2672]: W0909 04:53:38.556550 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.556716 kubelet[2672]: E0909 04:53:38.556558 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.560940 kubelet[2672]: E0909 04:53:38.560921 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.561515 kubelet[2672]: W0909 04:53:38.561353 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.561515 kubelet[2672]: E0909 04:53:38.561380 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.561515 kubelet[2672]: I0909 04:53:38.561409 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/552937ce-0bd6-4992-a22b-ff41c9705435-kubelet-dir\") pod \"csi-node-driver-dgbgk\" (UID: \"552937ce-0bd6-4992-a22b-ff41c9705435\") " pod="calico-system/csi-node-driver-dgbgk"
Sep 9 04:53:38.561800 kubelet[2672]: E0909 04:53:38.561750 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.561800 kubelet[2672]: W0909 04:53:38.561786 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.561870 kubelet[2672]: E0909 04:53:38.561810 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.561870 kubelet[2672]: I0909 04:53:38.561831 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/552937ce-0bd6-4992-a22b-ff41c9705435-varrun\") pod \"csi-node-driver-dgbgk\" (UID: \"552937ce-0bd6-4992-a22b-ff41c9705435\") " pod="calico-system/csi-node-driver-dgbgk"
Sep 9 04:53:38.562041 kubelet[2672]: E0909 04:53:38.562025 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.562075 kubelet[2672]: W0909 04:53:38.562043 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.562075 kubelet[2672]: E0909 04:53:38.562063 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.562111 kubelet[2672]: I0909 04:53:38.562078 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glr9c\" (UniqueName: \"kubernetes.io/projected/552937ce-0bd6-4992-a22b-ff41c9705435-kube-api-access-glr9c\") pod \"csi-node-driver-dgbgk\" (UID: \"552937ce-0bd6-4992-a22b-ff41c9705435\") " pod="calico-system/csi-node-driver-dgbgk"
Sep 9 04:53:38.562280 kubelet[2672]: E0909 04:53:38.562264 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.562313 kubelet[2672]: W0909 04:53:38.562282 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.562313 kubelet[2672]: E0909 04:53:38.562292 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.562313 kubelet[2672]: I0909 04:53:38.562306 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/552937ce-0bd6-4992-a22b-ff41c9705435-registration-dir\") pod \"csi-node-driver-dgbgk\" (UID: \"552937ce-0bd6-4992-a22b-ff41c9705435\") " pod="calico-system/csi-node-driver-dgbgk"
Sep 9 04:53:38.562868 kubelet[2672]: E0909 04:53:38.562565 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.562868 kubelet[2672]: W0909 04:53:38.562581 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.562868 kubelet[2672]: E0909 04:53:38.562608 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.562868 kubelet[2672]: E0909 04:53:38.562771 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.562868 kubelet[2672]: W0909 04:53:38.562779 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.562868 kubelet[2672]: E0909 04:53:38.562792 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.563051 kubelet[2672]: E0909 04:53:38.562992 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.563051 kubelet[2672]: W0909 04:53:38.563000 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.563051 kubelet[2672]: E0909 04:53:38.563015 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.563188 kubelet[2672]: E0909 04:53:38.563166 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.563188 kubelet[2672]: W0909 04:53:38.563177 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.563288 kubelet[2672]: E0909 04:53:38.563190 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.563395 kubelet[2672]: E0909 04:53:38.563380 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.563395 kubelet[2672]: W0909 04:53:38.563393 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.563452 kubelet[2672]: E0909 04:53:38.563418 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.563599 kubelet[2672]: E0909 04:53:38.563583 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.563599 kubelet[2672]: W0909 04:53:38.563591 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.563694 kubelet[2672]: E0909 04:53:38.563631 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.563793 kubelet[2672]: E0909 04:53:38.563759 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.563793 kubelet[2672]: W0909 04:53:38.563770 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.563891 kubelet[2672]: E0909 04:53:38.563812 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.563891 kubelet[2672]: I0909 04:53:38.563829 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/552937ce-0bd6-4992-a22b-ff41c9705435-socket-dir\") pod \"csi-node-driver-dgbgk\" (UID: \"552937ce-0bd6-4992-a22b-ff41c9705435\") " pod="calico-system/csi-node-driver-dgbgk"
Sep 9 04:53:38.564019 kubelet[2672]: E0909 04:53:38.564005 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.564019 kubelet[2672]: W0909 04:53:38.564017 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.564177 kubelet[2672]: E0909 04:53:38.564042 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.564334 kubelet[2672]: E0909 04:53:38.564228 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.564334 kubelet[2672]: W0909 04:53:38.564239 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.564334 kubelet[2672]: E0909 04:53:38.564265 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.564533 kubelet[2672]: E0909 04:53:38.564458 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.564533 kubelet[2672]: W0909 04:53:38.564469 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.564533 kubelet[2672]: E0909 04:53:38.564491 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.564715 kubelet[2672]: E0909 04:53:38.564606 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:38.564715 kubelet[2672]: W0909 04:53:38.564613 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:38.564715 kubelet[2672]: E0909 04:53:38.564622 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:38.575935 containerd[1526]: time="2025-09-09T04:53:38.575902137Z" level=info msg="connecting to shim b1d9f48e5766e65ce46d0f3dcdf8e3f14d4bdbd958c974c5fa377fbdb1c44ee4" address="unix:///run/containerd/s/47f0c8cc67e69113475f62c5cea7f1d197f19f8b232b660db2dc406230f67956" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:53:38.605485 systemd[1]: Started cri-containerd-b1d9f48e5766e65ce46d0f3dcdf8e3f14d4bdbd958c974c5fa377fbdb1c44ee4.scope - libcontainer container b1d9f48e5766e65ce46d0f3dcdf8e3f14d4bdbd958c974c5fa377fbdb1c44ee4.
Sep 9 04:53:38.626837 containerd[1526]: time="2025-09-09T04:53:38.626805702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-65gnj,Uid:07d9b11a-ef45-4828-91cf-618e30ad648c,Namespace:calico-system,Attempt:0,} returns sandbox id \"b1d9f48e5766e65ce46d0f3dcdf8e3f14d4bdbd958c974c5fa377fbdb1c44ee4\"" Sep 9 04:53:38.665122 kubelet[2672]: E0909 04:53:38.664982 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.665122 kubelet[2672]: W0909 04:53:38.665005 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.665122 kubelet[2672]: E0909 04:53:38.665026 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.666157 kubelet[2672]: E0909 04:53:38.665966 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.666157 kubelet[2672]: W0909 04:53:38.665983 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.666157 kubelet[2672]: E0909 04:53:38.666002 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:38.666623 kubelet[2672]: E0909 04:53:38.666596 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.666687 kubelet[2672]: W0909 04:53:38.666675 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.666687 kubelet[2672]: E0909 04:53:38.666716 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.667046 kubelet[2672]: E0909 04:53:38.667008 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.667046 kubelet[2672]: W0909 04:53:38.667026 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.667257 kubelet[2672]: E0909 04:53:38.667219 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:38.667885 kubelet[2672]: E0909 04:53:38.667866 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.667885 kubelet[2672]: W0909 04:53:38.667880 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.668176 kubelet[2672]: E0909 04:53:38.667910 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.668176 kubelet[2672]: E0909 04:53:38.668040 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.668176 kubelet[2672]: W0909 04:53:38.668048 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.668176 kubelet[2672]: E0909 04:53:38.668068 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:38.668733 kubelet[2672]: E0909 04:53:38.668715 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.668733 kubelet[2672]: W0909 04:53:38.668729 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.668815 kubelet[2672]: E0909 04:53:38.668754 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.669150 kubelet[2672]: E0909 04:53:38.669137 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.669150 kubelet[2672]: W0909 04:53:38.669149 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.669312 kubelet[2672]: E0909 04:53:38.669276 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:38.669312 kubelet[2672]: E0909 04:53:38.669309 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.669480 kubelet[2672]: W0909 04:53:38.669317 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.669480 kubelet[2672]: E0909 04:53:38.669384 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.669480 kubelet[2672]: E0909 04:53:38.669452 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.669480 kubelet[2672]: W0909 04:53:38.669459 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.669558 kubelet[2672]: E0909 04:53:38.669544 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:38.669612 kubelet[2672]: E0909 04:53:38.669600 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.669612 kubelet[2672]: W0909 04:53:38.669611 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.669656 kubelet[2672]: E0909 04:53:38.669625 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.669888 kubelet[2672]: E0909 04:53:38.669876 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.669915 kubelet[2672]: W0909 04:53:38.669888 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.669915 kubelet[2672]: E0909 04:53:38.669903 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:38.670410 kubelet[2672]: E0909 04:53:38.670394 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.670410 kubelet[2672]: W0909 04:53:38.670406 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.670493 kubelet[2672]: E0909 04:53:38.670448 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.670595 kubelet[2672]: E0909 04:53:38.670574 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.670595 kubelet[2672]: W0909 04:53:38.670593 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.670656 kubelet[2672]: E0909 04:53:38.670617 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:38.670836 kubelet[2672]: E0909 04:53:38.670824 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.670888 kubelet[2672]: W0909 04:53:38.670838 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.670888 kubelet[2672]: E0909 04:53:38.670863 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.671015 kubelet[2672]: E0909 04:53:38.670998 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.671015 kubelet[2672]: W0909 04:53:38.671008 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.671076 kubelet[2672]: E0909 04:53:38.671050 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:38.671451 kubelet[2672]: E0909 04:53:38.671150 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.671451 kubelet[2672]: W0909 04:53:38.671159 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.671451 kubelet[2672]: E0909 04:53:38.671191 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.671451 kubelet[2672]: E0909 04:53:38.671279 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.671451 kubelet[2672]: W0909 04:53:38.671287 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.671451 kubelet[2672]: E0909 04:53:38.671297 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:38.672339 kubelet[2672]: E0909 04:53:38.671927 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.672429 kubelet[2672]: W0909 04:53:38.672411 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.672636 kubelet[2672]: E0909 04:53:38.672482 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.673523 kubelet[2672]: E0909 04:53:38.672643 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.673523 kubelet[2672]: W0909 04:53:38.673387 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.673523 kubelet[2672]: E0909 04:53:38.673401 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:38.674971 kubelet[2672]: E0909 04:53:38.674406 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.674971 kubelet[2672]: W0909 04:53:38.674422 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.674971 kubelet[2672]: E0909 04:53:38.674454 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.675144 kubelet[2672]: E0909 04:53:38.675128 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.675215 kubelet[2672]: W0909 04:53:38.675202 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.675377 kubelet[2672]: E0909 04:53:38.675292 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:38.675682 kubelet[2672]: E0909 04:53:38.675667 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.675757 kubelet[2672]: W0909 04:53:38.675744 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.675889 kubelet[2672]: E0909 04:53:38.675857 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.676040 kubelet[2672]: E0909 04:53:38.676028 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.677170 kubelet[2672]: W0909 04:53:38.676736 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.677170 kubelet[2672]: E0909 04:53:38.676772 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:38.677423 kubelet[2672]: E0909 04:53:38.677405 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.677457 kubelet[2672]: W0909 04:53:38.677422 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.677457 kubelet[2672]: E0909 04:53:38.677438 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:38.684848 kubelet[2672]: E0909 04:53:38.684826 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:38.684848 kubelet[2672]: W0909 04:53:38.684841 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:38.684848 kubelet[2672]: E0909 04:53:38.684854 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:39.361222 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3768552909.mount: Deactivated successfully. 
Sep 9 04:53:39.685315 containerd[1526]: time="2025-09-09T04:53:39.685184185Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:39.686317 containerd[1526]: time="2025-09-09T04:53:39.686284006Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775"
Sep 9 04:53:39.687296 containerd[1526]: time="2025-09-09T04:53:39.687260589Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:39.689186 containerd[1526]: time="2025-09-09T04:53:39.689155677Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:53:39.689755 containerd[1526]: time="2025-09-09T04:53:39.689728427Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.332742839s"
Sep 9 04:53:39.689844 containerd[1526]: time="2025-09-09T04:53:39.689828865Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\""
Sep 9 04:53:39.691361 containerd[1526]: time="2025-09-09T04:53:39.691147523Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 9 04:53:39.698266 containerd[1526]: time="2025-09-09T04:53:39.698213202Z" level=info msg="CreateContainer within sandbox \"68db4fa5020298938ef735ac96a2358e13d191d5e1cea1187815432367db8d87\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 9 04:53:39.704336 containerd[1526]: time="2025-09-09T04:53:39.704278578Z" level=info msg="Container 5abeb40d77024b64dba50e9cd295a139e9a79f628e8ca6557fe3612e57a7af30: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:53:39.710006 containerd[1526]: time="2025-09-09T04:53:39.709956800Z" level=info msg="CreateContainer within sandbox \"68db4fa5020298938ef735ac96a2358e13d191d5e1cea1187815432367db8d87\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"5abeb40d77024b64dba50e9cd295a139e9a79f628e8ca6557fe3612e57a7af30\""
Sep 9 04:53:39.710481 containerd[1526]: time="2025-09-09T04:53:39.710458032Z" level=info msg="StartContainer for \"5abeb40d77024b64dba50e9cd295a139e9a79f628e8ca6557fe3612e57a7af30\""
Sep 9 04:53:39.711449 containerd[1526]: time="2025-09-09T04:53:39.711423615Z" level=info msg="connecting to shim 5abeb40d77024b64dba50e9cd295a139e9a79f628e8ca6557fe3612e57a7af30" address="unix:///run/containerd/s/ac9cd9d1e7fd43fd69b798216c8105480d619686aba68f79b3136d4c8cc30e09" protocol=ttrpc version=3
Sep 9 04:53:39.731552 systemd[1]: Started cri-containerd-5abeb40d77024b64dba50e9cd295a139e9a79f628e8ca6557fe3612e57a7af30.scope - libcontainer container 5abeb40d77024b64dba50e9cd295a139e9a79f628e8ca6557fe3612e57a7af30.
Sep 9 04:53:39.772385 containerd[1526]: time="2025-09-09T04:53:39.772303413Z" level=info msg="StartContainer for \"5abeb40d77024b64dba50e9cd295a139e9a79f628e8ca6557fe3612e57a7af30\" returns successfully"
Sep 9 04:53:39.867270 kubelet[2672]: E0909 04:53:39.867229 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.870443 kubelet[2672]: W0909 04:53:39.868036 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.870443 kubelet[2672]: E0909 04:53:39.868070 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.872302 kubelet[2672]: E0909 04:53:39.872282 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.872412 kubelet[2672]: W0909 04:53:39.872399 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.872593 kubelet[2672]: E0909 04:53:39.872578 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.875584 kubelet[2672]: E0909 04:53:39.875567 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.875706 kubelet[2672]: W0909 04:53:39.875693 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.875791 kubelet[2672]: E0909 04:53:39.875777 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.876095 kubelet[2672]: E0909 04:53:39.876072 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.876195 kubelet[2672]: W0909 04:53:39.876183 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.876290 kubelet[2672]: E0909 04:53:39.876277 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.876574 kubelet[2672]: E0909 04:53:39.876560 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.876670 kubelet[2672]: W0909 04:53:39.876657 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.876734 kubelet[2672]: E0909 04:53:39.876716 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.877010 kubelet[2672]: E0909 04:53:39.876952 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.877010 kubelet[2672]: W0909 04:53:39.876965 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.877010 kubelet[2672]: E0909 04:53:39.876976 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.877304 kubelet[2672]: E0909 04:53:39.877279 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.877454 kubelet[2672]: W0909 04:53:39.877291 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.877454 kubelet[2672]: E0909 04:53:39.877393 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.877696 kubelet[2672]: E0909 04:53:39.877657 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.877696 kubelet[2672]: W0909 04:53:39.877670 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.879350 kubelet[2672]: E0909 04:53:39.877682 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.881523 kubelet[2672]: E0909 04:53:39.881394 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.881523 kubelet[2672]: W0909 04:53:39.881410 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.881523 kubelet[2672]: E0909 04:53:39.881423 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.881711 kubelet[2672]: E0909 04:53:39.881699 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.881766 kubelet[2672]: W0909 04:53:39.881755 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.881819 kubelet[2672]: E0909 04:53:39.881809 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.882153 kubelet[2672]: E0909 04:53:39.882044 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.882153 kubelet[2672]: W0909 04:53:39.882061 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.882153 kubelet[2672]: E0909 04:53:39.882071 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.883086 kubelet[2672]: E0909 04:53:39.883068 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.884230 kubelet[2672]: W0909 04:53:39.883170 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.884230 kubelet[2672]: E0909 04:53:39.883187 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.884412 kubelet[2672]: E0909 04:53:39.884397 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.884472 kubelet[2672]: W0909 04:53:39.884460 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.884528 kubelet[2672]: E0909 04:53:39.884518 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.884751 kubelet[2672]: E0909 04:53:39.884737 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.884954 kubelet[2672]: W0909 04:53:39.884804 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.884954 kubelet[2672]: E0909 04:53:39.884819 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.885397 kubelet[2672]: E0909 04:53:39.885368 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.886150 kubelet[2672]: W0909 04:53:39.885547 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.886150 kubelet[2672]: E0909 04:53:39.885565 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:53:39.886466 kubelet[2672]: E0909 04:53:39.886381 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:53:39.887599 kubelet[2672]: W0909 04:53:39.887546 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:53:39.887711 kubelet[2672]: E0909 04:53:39.887685 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 9 04:53:39.888025 kubelet[2672]: E0909 04:53:39.888010 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.889459 kubelet[2672]: W0909 04:53:39.888094 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.889459 kubelet[2672]: E0909 04:53:39.888127 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:39.889645 kubelet[2672]: E0909 04:53:39.889629 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.889697 kubelet[2672]: W0909 04:53:39.889686 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.889766 kubelet[2672]: E0909 04:53:39.889751 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:39.890020 kubelet[2672]: E0909 04:53:39.889976 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.890020 kubelet[2672]: W0909 04:53:39.890007 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.890092 kubelet[2672]: E0909 04:53:39.890031 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:39.890198 kubelet[2672]: E0909 04:53:39.890176 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.890198 kubelet[2672]: W0909 04:53:39.890188 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.890367 kubelet[2672]: E0909 04:53:39.890351 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.890367 kubelet[2672]: E0909 04:53:39.890359 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:39.890443 kubelet[2672]: W0909 04:53:39.890365 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.891305 kubelet[2672]: E0909 04:53:39.891276 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:39.891595 kubelet[2672]: E0909 04:53:39.891555 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.891595 kubelet[2672]: W0909 04:53:39.891574 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.891595 kubelet[2672]: E0909 04:53:39.891588 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:39.891864 kubelet[2672]: E0909 04:53:39.891748 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.891864 kubelet[2672]: W0909 04:53:39.891756 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.891864 kubelet[2672]: E0909 04:53:39.891802 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:39.891940 kubelet[2672]: E0909 04:53:39.891928 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.891940 kubelet[2672]: W0909 04:53:39.891936 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.892338 kubelet[2672]: E0909 04:53:39.891978 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:39.893187 kubelet[2672]: E0909 04:53:39.893151 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.893187 kubelet[2672]: W0909 04:53:39.893178 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.893288 kubelet[2672]: E0909 04:53:39.893197 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:39.893804 kubelet[2672]: E0909 04:53:39.893487 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.893804 kubelet[2672]: W0909 04:53:39.893503 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.893804 kubelet[2672]: E0909 04:53:39.893533 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:39.894380 kubelet[2672]: E0909 04:53:39.894344 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.894380 kubelet[2672]: W0909 04:53:39.894361 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.894591 kubelet[2672]: E0909 04:53:39.894484 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:39.894781 kubelet[2672]: E0909 04:53:39.894715 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.894781 kubelet[2672]: W0909 04:53:39.894726 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.895298 kubelet[2672]: E0909 04:53:39.895277 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:39.896083 kubelet[2672]: E0909 04:53:39.896042 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.896083 kubelet[2672]: W0909 04:53:39.896056 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.896499 kubelet[2672]: E0909 04:53:39.896371 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:39.898397 kubelet[2672]: E0909 04:53:39.898346 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.898397 kubelet[2672]: W0909 04:53:39.898383 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.899159 kubelet[2672]: E0909 04:53:39.898622 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:39.899611 kubelet[2672]: E0909 04:53:39.899571 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.899611 kubelet[2672]: W0909 04:53:39.899590 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.899700 kubelet[2672]: E0909 04:53:39.899645 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:39.900449 kubelet[2672]: E0909 04:53:39.900419 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.900449 kubelet[2672]: W0909 04:53:39.900441 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.900449 kubelet[2672]: E0909 04:53:39.900482 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:53:39.903814 kubelet[2672]: E0909 04:53:39.903597 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:39.903814 kubelet[2672]: W0909 04:53:39.903808 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:39.903907 kubelet[2672]: E0909 04:53:39.903826 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:53:40.739264 kubelet[2672]: E0909 04:53:40.739061 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dgbgk" podUID="552937ce-0bd6-4992-a22b-ff41c9705435" Sep 9 04:53:40.776240 containerd[1526]: time="2025-09-09T04:53:40.776186543Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:40.777458 containerd[1526]: time="2025-09-09T04:53:40.777427482Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 9 04:53:40.778240 containerd[1526]: time="2025-09-09T04:53:40.778174030Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:40.780070 containerd[1526]: time="2025-09-09T04:53:40.780021040Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:40.781207 containerd[1526]: time="2025-09-09T04:53:40.781146382Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.08996762s" Sep 9 04:53:40.781207 containerd[1526]: time="2025-09-09T04:53:40.781186301Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 9 04:53:40.784296 containerd[1526]: time="2025-09-09T04:53:40.784267411Z" level=info msg="CreateContainer within sandbox \"b1d9f48e5766e65ce46d0f3dcdf8e3f14d4bdbd958c974c5fa377fbdb1c44ee4\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 04:53:40.800063 containerd[1526]: time="2025-09-09T04:53:40.800020433Z" level=info msg="Container 5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:53:40.816243 containerd[1526]: time="2025-09-09T04:53:40.816167169Z" level=info msg="CreateContainer within sandbox \"b1d9f48e5766e65ce46d0f3dcdf8e3f14d4bdbd958c974c5fa377fbdb1c44ee4\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca\"" Sep 9 04:53:40.817531 containerd[1526]: time="2025-09-09T04:53:40.817502507Z" level=info msg="StartContainer for \"5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca\"" Sep 9 04:53:40.819218 containerd[1526]: time="2025-09-09T04:53:40.819151280Z" level=info msg="connecting to shim 5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca" address="unix:///run/containerd/s/47f0c8cc67e69113475f62c5cea7f1d197f19f8b232b660db2dc406230f67956" protocol=ttrpc version=3 Sep 9 04:53:40.821359 kubelet[2672]: I0909 04:53:40.820820 2672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:53:40.840481 systemd[1]: Started cri-containerd-5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca.scope - libcontainer container 5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca. 
Sep 9 04:53:40.874962 containerd[1526]: time="2025-09-09T04:53:40.874919209Z" level=info msg="StartContainer for \"5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca\" returns successfully" Sep 9 04:53:40.893384 kubelet[2672]: E0909 04:53:40.893217 2672 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:53:40.893384 kubelet[2672]: W0909 04:53:40.893244 2672 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:53:40.893384 kubelet[2672]: E0909 04:53:40.893266 2672 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" [identical FlexVolume init-failure triplets repeated through Sep 9 04:53:40.907] Sep 9 04:53:40.909030 systemd[1]: cri-containerd-5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca.scope: Deactivated successfully. Sep 9 04:53:40.909301 systemd[1]: cri-containerd-5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca.scope: Consumed 29ms CPU time, 6M memory peak, 4.5M written to disk. Sep 9 04:53:40.936048 containerd[1526]: time="2025-09-09T04:53:40.935995130Z" level=info msg="received exit event container_id:\"5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca\" id:\"5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca\" pid:3387 exited_at:{seconds:1757393620 nanos:926222090}" Sep 9 04:53:40.942413 containerd[1526]: time="2025-09-09T04:53:40.942312667Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca\" id:\"5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca\" pid:3387 exited_at:{seconds:1757393620 nanos:926222090}" Sep 9 04:53:40.966952 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5743adf4f0eb5d4bd6749795e54b0db21218266c97d4ec136500525a6b70d5ca-rootfs.mount: Deactivated successfully. 
Sep 9 04:53:41.822976 containerd[1526]: time="2025-09-09T04:53:41.822936110Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 04:53:41.841908 kubelet[2672]: I0909 04:53:41.841557 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8dff5c8f4-ndc5j" podStartSLOduration=3.5068900149999997 podStartE2EDuration="4.84153594s" podCreationTimestamp="2025-09-09 04:53:37 +0000 UTC" firstStartedPulling="2025-09-09 04:53:38.356040845 +0000 UTC m=+18.697811972" lastFinishedPulling="2025-09-09 04:53:39.69068677 +0000 UTC m=+20.032457897" observedRunningTime="2025-09-09 04:53:39.824502639 +0000 UTC m=+20.166273806" watchObservedRunningTime="2025-09-09 04:53:41.84153594 +0000 UTC m=+22.183307067" Sep 9 04:53:42.740008 kubelet[2672]: E0909 04:53:42.739922 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dgbgk" podUID="552937ce-0bd6-4992-a22b-ff41c9705435" Sep 9 04:53:44.513381 containerd[1526]: time="2025-09-09T04:53:44.512885696Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:44.516467 containerd[1526]: time="2025-09-09T04:53:44.516430527Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 9 04:53:44.519339 containerd[1526]: time="2025-09-09T04:53:44.519121451Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:44.527212 containerd[1526]: time="2025-09-09T04:53:44.527176260Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:44.527807 containerd[1526]: time="2025-09-09T04:53:44.527774212Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.704801942s" Sep 9 04:53:44.527807 containerd[1526]: time="2025-09-09T04:53:44.527804732Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 9 04:53:44.530139 containerd[1526]: time="2025-09-09T04:53:44.530078461Z" level=info msg="CreateContainer within sandbox \"b1d9f48e5766e65ce46d0f3dcdf8e3f14d4bdbd958c974c5fa377fbdb1c44ee4\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 04:53:44.547191 containerd[1526]: time="2025-09-09T04:53:44.547123187Z" level=info msg="Container 9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:53:44.566954 containerd[1526]: time="2025-09-09T04:53:44.566896997Z" level=info msg="CreateContainer within sandbox \"b1d9f48e5766e65ce46d0f3dcdf8e3f14d4bdbd958c974c5fa377fbdb1c44ee4\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc\"" Sep 9 04:53:44.567397 containerd[1526]: time="2025-09-09T04:53:44.567374990Z" level=info msg="StartContainer for \"9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc\"" Sep 9 04:53:44.568873 containerd[1526]: time="2025-09-09T04:53:44.568845170Z" level=info msg="connecting to shim 
9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc" address="unix:///run/containerd/s/47f0c8cc67e69113475f62c5cea7f1d197f19f8b232b660db2dc406230f67956" protocol=ttrpc version=3 Sep 9 04:53:44.585491 systemd[1]: Started cri-containerd-9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc.scope - libcontainer container 9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc. Sep 9 04:53:44.623965 containerd[1526]: time="2025-09-09T04:53:44.623859657Z" level=info msg="StartContainer for \"9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc\" returns successfully" Sep 9 04:53:44.739733 kubelet[2672]: E0909 04:53:44.739682 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dgbgk" podUID="552937ce-0bd6-4992-a22b-ff41c9705435" Sep 9 04:53:45.257953 systemd[1]: cri-containerd-9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc.scope: Deactivated successfully. Sep 9 04:53:45.259106 systemd[1]: cri-containerd-9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc.scope: Consumed 484ms CPU time, 177.1M memory peak, 1.7M read from disk, 165.8M written to disk. 
Sep 9 04:53:45.270397 containerd[1526]: time="2025-09-09T04:53:45.270357278Z" level=info msg="received exit event container_id:\"9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc\" id:\"9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc\" pid:3471 exited_at:{seconds:1757393625 nanos:270059562}" Sep 9 04:53:45.270886 containerd[1526]: time="2025-09-09T04:53:45.270855671Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc\" id:\"9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc\" pid:3471 exited_at:{seconds:1757393625 nanos:270059562}" Sep 9 04:53:45.287917 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9e6379d2836fc9145359b0e69ee42d8d237b811cfb85accf98df5f993bddfecc-rootfs.mount: Deactivated successfully. Sep 9 04:53:45.351120 kubelet[2672]: I0909 04:53:45.351052 2672 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 9 04:53:45.392153 systemd[1]: Created slice kubepods-besteffort-pod8cff6f75_e1a9_4839_a313_d99705c12062.slice - libcontainer container kubepods-besteffort-pod8cff6f75_e1a9_4839_a313_d99705c12062.slice. Sep 9 04:53:45.403136 systemd[1]: Created slice kubepods-besteffort-pod51869746_303b_4256_91b8_4606d0ce74fe.slice - libcontainer container kubepods-besteffort-pod51869746_303b_4256_91b8_4606d0ce74fe.slice. Sep 9 04:53:45.409617 systemd[1]: Created slice kubepods-burstable-pode63f9b21_1c03_49de_a212_ae6c3461a913.slice - libcontainer container kubepods-burstable-pode63f9b21_1c03_49de_a212_ae6c3461a913.slice. Sep 9 04:53:45.417597 systemd[1]: Created slice kubepods-besteffort-podf3dfd24e_f850_4a8c_99c9_946b72b2a033.slice - libcontainer container kubepods-besteffort-podf3dfd24e_f850_4a8c_99c9_946b72b2a033.slice. 
Sep 9 04:53:45.426764 systemd[1]: Created slice kubepods-besteffort-podcf82b986_3cd0_45d0_9764_a0d291ff5cfd.slice - libcontainer container kubepods-besteffort-podcf82b986_3cd0_45d0_9764_a0d291ff5cfd.slice. Sep 9 04:53:45.435471 systemd[1]: Created slice kubepods-besteffort-pod1ed773d9_9708_4a97_bf36_4f8170376c2b.slice - libcontainer container kubepods-besteffort-pod1ed773d9_9708_4a97_bf36_4f8170376c2b.slice. Sep 9 04:53:45.441740 kubelet[2672]: I0909 04:53:45.441695 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e63f9b21-1c03-49de-a212-ae6c3461a913-config-volume\") pod \"coredns-668d6bf9bc-tzvmr\" (UID: \"e63f9b21-1c03-49de-a212-ae6c3461a913\") " pod="kube-system/coredns-668d6bf9bc-tzvmr" Sep 9 04:53:45.441972 kubelet[2672]: I0909 04:53:45.441754 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhmdm\" (UniqueName: \"kubernetes.io/projected/751e08fd-9b29-4e93-b1fe-d6b3e4e781cb-kube-api-access-mhmdm\") pod \"coredns-668d6bf9bc-mwg62\" (UID: \"751e08fd-9b29-4e93-b1fe-d6b3e4e781cb\") " pod="kube-system/coredns-668d6bf9bc-mwg62" Sep 9 04:53:45.441833 systemd[1]: Created slice kubepods-burstable-pod751e08fd_9b29_4e93_b1fe_d6b3e4e781cb.slice - libcontainer container kubepods-burstable-pod751e08fd_9b29_4e93_b1fe_d6b3e4e781cb.slice. 
Sep 9 04:53:45.442850 kubelet[2672]: I0909 04:53:45.442306 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/07263a06-50c4-4b24-8fcd-c7abc03a155b-whisker-backend-key-pair\") pod \"whisker-6f65c8fbb9-nhxv8\" (UID: \"07263a06-50c4-4b24-8fcd-c7abc03a155b\") " pod="calico-system/whisker-6f65c8fbb9-nhxv8" Sep 9 04:53:45.442850 kubelet[2672]: I0909 04:53:45.442353 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07263a06-50c4-4b24-8fcd-c7abc03a155b-whisker-ca-bundle\") pod \"whisker-6f65c8fbb9-nhxv8\" (UID: \"07263a06-50c4-4b24-8fcd-c7abc03a155b\") " pod="calico-system/whisker-6f65c8fbb9-nhxv8" Sep 9 04:53:45.442850 kubelet[2672]: I0909 04:53:45.442371 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l76np\" (UniqueName: \"kubernetes.io/projected/51869746-303b-4256-91b8-4606d0ce74fe-kube-api-access-l76np\") pod \"calico-apiserver-cb85c6f7d-pxpgr\" (UID: \"51869746-303b-4256-91b8-4606d0ce74fe\") " pod="calico-apiserver/calico-apiserver-cb85c6f7d-pxpgr" Sep 9 04:53:45.442850 kubelet[2672]: I0909 04:53:45.442394 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/751e08fd-9b29-4e93-b1fe-d6b3e4e781cb-config-volume\") pod \"coredns-668d6bf9bc-mwg62\" (UID: \"751e08fd-9b29-4e93-b1fe-d6b3e4e781cb\") " pod="kube-system/coredns-668d6bf9bc-mwg62" Sep 9 04:53:45.442850 kubelet[2672]: I0909 04:53:45.442413 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggjzw\" (UniqueName: \"kubernetes.io/projected/e63f9b21-1c03-49de-a212-ae6c3461a913-kube-api-access-ggjzw\") pod \"coredns-668d6bf9bc-tzvmr\" (UID: 
\"e63f9b21-1c03-49de-a212-ae6c3461a913\") " pod="kube-system/coredns-668d6bf9bc-tzvmr" Sep 9 04:53:45.443005 kubelet[2672]: I0909 04:53:45.442432 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f3dfd24e-f850-4a8c-99c9-946b72b2a033-calico-apiserver-certs\") pod \"calico-apiserver-cb85c6f7d-zfbw4\" (UID: \"f3dfd24e-f850-4a8c-99c9-946b72b2a033\") " pod="calico-apiserver/calico-apiserver-cb85c6f7d-zfbw4" Sep 9 04:53:45.443005 kubelet[2672]: I0909 04:53:45.442447 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/51869746-303b-4256-91b8-4606d0ce74fe-calico-apiserver-certs\") pod \"calico-apiserver-cb85c6f7d-pxpgr\" (UID: \"51869746-303b-4256-91b8-4606d0ce74fe\") " pod="calico-apiserver/calico-apiserver-cb85c6f7d-pxpgr" Sep 9 04:53:45.443005 kubelet[2672]: I0909 04:53:45.442472 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjbdf\" (UniqueName: \"kubernetes.io/projected/cf82b986-3cd0-45d0-9764-a0d291ff5cfd-kube-api-access-pjbdf\") pod \"goldmane-54d579b49d-gmx58\" (UID: \"cf82b986-3cd0-45d0-9764-a0d291ff5cfd\") " pod="calico-system/goldmane-54d579b49d-gmx58" Sep 9 04:53:45.443005 kubelet[2672]: I0909 04:53:45.442491 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbm87\" (UniqueName: \"kubernetes.io/projected/8cff6f75-e1a9-4839-a313-d99705c12062-kube-api-access-mbm87\") pod \"calico-kube-controllers-697776f99-qp2dg\" (UID: \"8cff6f75-e1a9-4839-a313-d99705c12062\") " pod="calico-system/calico-kube-controllers-697776f99-qp2dg" Sep 9 04:53:45.443005 kubelet[2672]: I0909 04:53:45.442508 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" 
(UniqueName: \"kubernetes.io/secret/cf82b986-3cd0-45d0-9764-a0d291ff5cfd-goldmane-key-pair\") pod \"goldmane-54d579b49d-gmx58\" (UID: \"cf82b986-3cd0-45d0-9764-a0d291ff5cfd\") " pod="calico-system/goldmane-54d579b49d-gmx58" Sep 9 04:53:45.443104 kubelet[2672]: I0909 04:53:45.442527 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lwqp\" (UniqueName: \"kubernetes.io/projected/07263a06-50c4-4b24-8fcd-c7abc03a155b-kube-api-access-5lwqp\") pod \"whisker-6f65c8fbb9-nhxv8\" (UID: \"07263a06-50c4-4b24-8fcd-c7abc03a155b\") " pod="calico-system/whisker-6f65c8fbb9-nhxv8" Sep 9 04:53:45.443104 kubelet[2672]: I0909 04:53:45.442544 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf82b986-3cd0-45d0-9764-a0d291ff5cfd-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-gmx58\" (UID: \"cf82b986-3cd0-45d0-9764-a0d291ff5cfd\") " pod="calico-system/goldmane-54d579b49d-gmx58" Sep 9 04:53:45.443104 kubelet[2672]: I0909 04:53:45.442564 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1ed773d9-9708-4a97-bf36-4f8170376c2b-calico-apiserver-certs\") pod \"calico-apiserver-776f74d6f9-j57rd\" (UID: \"1ed773d9-9708-4a97-bf36-4f8170376c2b\") " pod="calico-apiserver/calico-apiserver-776f74d6f9-j57rd" Sep 9 04:53:45.443104 kubelet[2672]: I0909 04:53:45.442580 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cff6f75-e1a9-4839-a313-d99705c12062-tigera-ca-bundle\") pod \"calico-kube-controllers-697776f99-qp2dg\" (UID: \"8cff6f75-e1a9-4839-a313-d99705c12062\") " pod="calico-system/calico-kube-controllers-697776f99-qp2dg" Sep 9 04:53:45.443104 kubelet[2672]: I0909 04:53:45.442598 2672 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv757\" (UniqueName: \"kubernetes.io/projected/f3dfd24e-f850-4a8c-99c9-946b72b2a033-kube-api-access-nv757\") pod \"calico-apiserver-cb85c6f7d-zfbw4\" (UID: \"f3dfd24e-f850-4a8c-99c9-946b72b2a033\") " pod="calico-apiserver/calico-apiserver-cb85c6f7d-zfbw4" Sep 9 04:53:45.443226 kubelet[2672]: I0909 04:53:45.442615 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf82b986-3cd0-45d0-9764-a0d291ff5cfd-config\") pod \"goldmane-54d579b49d-gmx58\" (UID: \"cf82b986-3cd0-45d0-9764-a0d291ff5cfd\") " pod="calico-system/goldmane-54d579b49d-gmx58" Sep 9 04:53:45.443226 kubelet[2672]: I0909 04:53:45.442635 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb5b9\" (UniqueName: \"kubernetes.io/projected/1ed773d9-9708-4a97-bf36-4f8170376c2b-kube-api-access-pb5b9\") pod \"calico-apiserver-776f74d6f9-j57rd\" (UID: \"1ed773d9-9708-4a97-bf36-4f8170376c2b\") " pod="calico-apiserver/calico-apiserver-776f74d6f9-j57rd" Sep 9 04:53:45.446909 systemd[1]: Created slice kubepods-besteffort-pod07263a06_50c4_4b24_8fcd_c7abc03a155b.slice - libcontainer container kubepods-besteffort-pod07263a06_50c4_4b24_8fcd_c7abc03a155b.slice. 
Sep 9 04:53:45.701159 containerd[1526]: time="2025-09-09T04:53:45.701094784Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697776f99-qp2dg,Uid:8cff6f75-e1a9-4839-a313-d99705c12062,Namespace:calico-system,Attempt:0,}" Sep 9 04:53:45.707862 containerd[1526]: time="2025-09-09T04:53:45.707822856Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cb85c6f7d-pxpgr,Uid:51869746-303b-4256-91b8-4606d0ce74fe,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:53:45.714213 containerd[1526]: time="2025-09-09T04:53:45.713980535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tzvmr,Uid:e63f9b21-1c03-49de-a212-ae6c3461a913,Namespace:kube-system,Attempt:0,}" Sep 9 04:53:45.723889 containerd[1526]: time="2025-09-09T04:53:45.723837966Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cb85c6f7d-zfbw4,Uid:f3dfd24e-f850-4a8c-99c9-946b72b2a033,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:53:45.731358 containerd[1526]: time="2025-09-09T04:53:45.731301188Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-gmx58,Uid:cf82b986-3cd0-45d0-9764-a0d291ff5cfd,Namespace:calico-system,Attempt:0,}" Sep 9 04:53:45.739084 containerd[1526]: time="2025-09-09T04:53:45.739044126Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776f74d6f9-j57rd,Uid:1ed773d9-9708-4a97-bf36-4f8170376c2b,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:53:45.746820 containerd[1526]: time="2025-09-09T04:53:45.746581547Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mwg62,Uid:751e08fd-9b29-4e93-b1fe-d6b3e4e781cb,Namespace:kube-system,Attempt:0,}" Sep 9 04:53:45.750489 containerd[1526]: time="2025-09-09T04:53:45.750447536Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f65c8fbb9-nhxv8,Uid:07263a06-50c4-4b24-8fcd-c7abc03a155b,Namespace:calico-system,Attempt:0,}" Sep 9 04:53:45.839450 
containerd[1526]: time="2025-09-09T04:53:45.839404889Z" level=error msg="Failed to destroy network for sandbox \"422fa8ced25c84ce2d07b40be0e73eb48120c7ce610dd0094fea097186eb585e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.846231 containerd[1526]: time="2025-09-09T04:53:45.845550328Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tzvmr,Uid:e63f9b21-1c03-49de-a212-ae6c3461a913,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"422fa8ced25c84ce2d07b40be0e73eb48120c7ce610dd0094fea097186eb585e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.846745 kubelet[2672]: E0909 04:53:45.846550 2672 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"422fa8ced25c84ce2d07b40be0e73eb48120c7ce610dd0094fea097186eb585e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.846745 kubelet[2672]: E0909 04:53:45.846620 2672 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"422fa8ced25c84ce2d07b40be0e73eb48120c7ce610dd0094fea097186eb585e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-tzvmr" Sep 9 04:53:45.846745 kubelet[2672]: E0909 04:53:45.846640 2672 kuberuntime_manager.go:1237] "CreatePodSandbox for pod 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"422fa8ced25c84ce2d07b40be0e73eb48120c7ce610dd0094fea097186eb585e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-tzvmr" Sep 9 04:53:45.848306 kubelet[2672]: E0909 04:53:45.846674 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-tzvmr_kube-system(e63f9b21-1c03-49de-a212-ae6c3461a913)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-tzvmr_kube-system(e63f9b21-1c03-49de-a212-ae6c3461a913)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"422fa8ced25c84ce2d07b40be0e73eb48120c7ce610dd0094fea097186eb585e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-tzvmr" podUID="e63f9b21-1c03-49de-a212-ae6c3461a913" Sep 9 04:53:45.848380 containerd[1526]: time="2025-09-09T04:53:45.847226186Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 04:53:45.859489 containerd[1526]: time="2025-09-09T04:53:45.859440466Z" level=error msg="Failed to destroy network for sandbox \"316bd5f6fe30574b5b7d7d1a41193ee9cfd5ec6ae30f8a39f600305df9d3b2cf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.862942 containerd[1526]: time="2025-09-09T04:53:45.862892340Z" level=error msg="Failed to destroy network for sandbox \"430af1edced7df0f7823054e0b499eaae6888939e239655d4cdaeb5676710e9f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.863089 containerd[1526]: time="2025-09-09T04:53:45.862983019Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cb85c6f7d-pxpgr,Uid:51869746-303b-4256-91b8-4606d0ce74fe,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"316bd5f6fe30574b5b7d7d1a41193ee9cfd5ec6ae30f8a39f600305df9d3b2cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.863388 containerd[1526]: time="2025-09-09T04:53:45.863353734Z" level=error msg="Failed to destroy network for sandbox \"bda57a29546ef6fe81d82c66c04795d4938a33772d6939ed4cd62dcc20c02e52\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.863811 kubelet[2672]: E0909 04:53:45.863775 2672 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"316bd5f6fe30574b5b7d7d1a41193ee9cfd5ec6ae30f8a39f600305df9d3b2cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.863869 kubelet[2672]: E0909 04:53:45.863833 2672 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"316bd5f6fe30574b5b7d7d1a41193ee9cfd5ec6ae30f8a39f600305df9d3b2cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-cb85c6f7d-pxpgr" Sep 9 04:53:45.863869 kubelet[2672]: E0909 04:53:45.863853 2672 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"316bd5f6fe30574b5b7d7d1a41193ee9cfd5ec6ae30f8a39f600305df9d3b2cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cb85c6f7d-pxpgr" Sep 9 04:53:45.863922 kubelet[2672]: E0909 04:53:45.863892 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-cb85c6f7d-pxpgr_calico-apiserver(51869746-303b-4256-91b8-4606d0ce74fe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-cb85c6f7d-pxpgr_calico-apiserver(51869746-303b-4256-91b8-4606d0ce74fe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"316bd5f6fe30574b5b7d7d1a41193ee9cfd5ec6ae30f8a39f600305df9d3b2cf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-cb85c6f7d-pxpgr" podUID="51869746-303b-4256-91b8-4606d0ce74fe" Sep 9 04:53:45.866839 containerd[1526]: time="2025-09-09T04:53:45.866792809Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697776f99-qp2dg,Uid:8cff6f75-e1a9-4839-a313-d99705c12062,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"430af1edced7df0f7823054e0b499eaae6888939e239655d4cdaeb5676710e9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 
04:53:45.867235 kubelet[2672]: E0909 04:53:45.866995 2672 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"430af1edced7df0f7823054e0b499eaae6888939e239655d4cdaeb5676710e9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.867235 kubelet[2672]: E0909 04:53:45.867045 2672 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"430af1edced7df0f7823054e0b499eaae6888939e239655d4cdaeb5676710e9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697776f99-qp2dg" Sep 9 04:53:45.867235 kubelet[2672]: E0909 04:53:45.867063 2672 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"430af1edced7df0f7823054e0b499eaae6888939e239655d4cdaeb5676710e9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-697776f99-qp2dg" Sep 9 04:53:45.867351 kubelet[2672]: E0909 04:53:45.867098 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-697776f99-qp2dg_calico-system(8cff6f75-e1a9-4839-a313-d99705c12062)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-697776f99-qp2dg_calico-system(8cff6f75-e1a9-4839-a313-d99705c12062)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"430af1edced7df0f7823054e0b499eaae6888939e239655d4cdaeb5676710e9f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-697776f99-qp2dg" podUID="8cff6f75-e1a9-4839-a313-d99705c12062" Sep 9 04:53:45.869771 containerd[1526]: time="2025-09-09T04:53:45.868542746Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mwg62,Uid:751e08fd-9b29-4e93-b1fe-d6b3e4e781cb,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bda57a29546ef6fe81d82c66c04795d4938a33772d6939ed4cd62dcc20c02e52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.869887 kubelet[2672]: E0909 04:53:45.869134 2672 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bda57a29546ef6fe81d82c66c04795d4938a33772d6939ed4cd62dcc20c02e52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.869887 kubelet[2672]: E0909 04:53:45.869173 2672 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bda57a29546ef6fe81d82c66c04795d4938a33772d6939ed4cd62dcc20c02e52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-mwg62" Sep 9 04:53:45.869887 kubelet[2672]: E0909 04:53:45.869188 2672 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bda57a29546ef6fe81d82c66c04795d4938a33772d6939ed4cd62dcc20c02e52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-mwg62" Sep 9 04:53:45.869971 kubelet[2672]: E0909 04:53:45.869223 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-mwg62_kube-system(751e08fd-9b29-4e93-b1fe-d6b3e4e781cb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-mwg62_kube-system(751e08fd-9b29-4e93-b1fe-d6b3e4e781cb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bda57a29546ef6fe81d82c66c04795d4938a33772d6939ed4cd62dcc20c02e52\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-mwg62" podUID="751e08fd-9b29-4e93-b1fe-d6b3e4e781cb" Sep 9 04:53:45.873818 containerd[1526]: time="2025-09-09T04:53:45.873771758Z" level=error msg="Failed to destroy network for sandbox \"5ec400200e22c1945120ac047a572c4f47c61177d4dce34fa1d3da88b36ab189\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.875400 containerd[1526]: time="2025-09-09T04:53:45.875354657Z" level=error msg="Failed to destroy network for sandbox \"7a32f3409e69ab8ea90ccf4de47884dee2836fbf0ca229c564e0743869ed3901\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.876499 containerd[1526]: 
time="2025-09-09T04:53:45.876447042Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f65c8fbb9-nhxv8,Uid:07263a06-50c4-4b24-8fcd-c7abc03a155b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ec400200e22c1945120ac047a572c4f47c61177d4dce34fa1d3da88b36ab189\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.876920 kubelet[2672]: E0909 04:53:45.876872 2672 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ec400200e22c1945120ac047a572c4f47c61177d4dce34fa1d3da88b36ab189\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.876965 kubelet[2672]: E0909 04:53:45.876932 2672 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ec400200e22c1945120ac047a572c4f47c61177d4dce34fa1d3da88b36ab189\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6f65c8fbb9-nhxv8" Sep 9 04:53:45.876965 kubelet[2672]: E0909 04:53:45.876953 2672 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ec400200e22c1945120ac047a572c4f47c61177d4dce34fa1d3da88b36ab189\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6f65c8fbb9-nhxv8" Sep 9 04:53:45.877009 kubelet[2672]: 
E0909 04:53:45.876990 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6f65c8fbb9-nhxv8_calico-system(07263a06-50c4-4b24-8fcd-c7abc03a155b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6f65c8fbb9-nhxv8_calico-system(07263a06-50c4-4b24-8fcd-c7abc03a155b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5ec400200e22c1945120ac047a572c4f47c61177d4dce34fa1d3da88b36ab189\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6f65c8fbb9-nhxv8" podUID="07263a06-50c4-4b24-8fcd-c7abc03a155b" Sep 9 04:53:45.877593 containerd[1526]: time="2025-09-09T04:53:45.877448869Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-gmx58,Uid:cf82b986-3cd0-45d0-9764-a0d291ff5cfd,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a32f3409e69ab8ea90ccf4de47884dee2836fbf0ca229c564e0743869ed3901\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.878358 kubelet[2672]: E0909 04:53:45.877733 2672 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a32f3409e69ab8ea90ccf4de47884dee2836fbf0ca229c564e0743869ed3901\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.878358 kubelet[2672]: E0909 04:53:45.877822 2672 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7a32f3409e69ab8ea90ccf4de47884dee2836fbf0ca229c564e0743869ed3901\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-gmx58" Sep 9 04:53:45.878358 kubelet[2672]: E0909 04:53:45.877838 2672 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a32f3409e69ab8ea90ccf4de47884dee2836fbf0ca229c564e0743869ed3901\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-gmx58" Sep 9 04:53:45.878486 kubelet[2672]: E0909 04:53:45.877868 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-gmx58_calico-system(cf82b986-3cd0-45d0-9764-a0d291ff5cfd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-gmx58_calico-system(cf82b986-3cd0-45d0-9764-a0d291ff5cfd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7a32f3409e69ab8ea90ccf4de47884dee2836fbf0ca229c564e0743869ed3901\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-gmx58" podUID="cf82b986-3cd0-45d0-9764-a0d291ff5cfd" Sep 9 04:53:45.890255 containerd[1526]: time="2025-09-09T04:53:45.890096023Z" level=error msg="Failed to destroy network for sandbox \"61933dfd2e716f6ad38ca827db1b62499dda162dc4de5bb8a24181620e386b3d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 
04:53:45.891716 containerd[1526]: time="2025-09-09T04:53:45.891671963Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cb85c6f7d-zfbw4,Uid:f3dfd24e-f850-4a8c-99c9-946b72b2a033,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"61933dfd2e716f6ad38ca827db1b62499dda162dc4de5bb8a24181620e386b3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.891926 kubelet[2672]: E0909 04:53:45.891893 2672 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61933dfd2e716f6ad38ca827db1b62499dda162dc4de5bb8a24181620e386b3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.891981 kubelet[2672]: E0909 04:53:45.891946 2672 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61933dfd2e716f6ad38ca827db1b62499dda162dc4de5bb8a24181620e386b3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cb85c6f7d-zfbw4" Sep 9 04:53:45.891981 kubelet[2672]: E0909 04:53:45.891966 2672 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61933dfd2e716f6ad38ca827db1b62499dda162dc4de5bb8a24181620e386b3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-cb85c6f7d-zfbw4" Sep 9 04:53:45.892036 kubelet[2672]: E0909 04:53:45.892005 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-cb85c6f7d-zfbw4_calico-apiserver(f3dfd24e-f850-4a8c-99c9-946b72b2a033)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-cb85c6f7d-zfbw4_calico-apiserver(f3dfd24e-f850-4a8c-99c9-946b72b2a033)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"61933dfd2e716f6ad38ca827db1b62499dda162dc4de5bb8a24181620e386b3d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-cb85c6f7d-zfbw4" podUID="f3dfd24e-f850-4a8c-99c9-946b72b2a033" Sep 9 04:53:45.895841 containerd[1526]: time="2025-09-09T04:53:45.895804188Z" level=error msg="Failed to destroy network for sandbox \"75969c1c5550ca8a6b1bdd85c07c05d002815550b1ddbc9aeacc35cda615737d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.896834 containerd[1526]: time="2025-09-09T04:53:45.896733096Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776f74d6f9-j57rd,Uid:1ed773d9-9708-4a97-bf36-4f8170376c2b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"75969c1c5550ca8a6b1bdd85c07c05d002815550b1ddbc9aeacc35cda615737d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.896992 kubelet[2672]: E0909 04:53:45.896944 2672 log.go:32] "RunPodSandbox from runtime service failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"75969c1c5550ca8a6b1bdd85c07c05d002815550b1ddbc9aeacc35cda615737d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:45.897041 kubelet[2672]: E0909 04:53:45.897013 2672 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"75969c1c5550ca8a6b1bdd85c07c05d002815550b1ddbc9aeacc35cda615737d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-776f74d6f9-j57rd" Sep 9 04:53:45.897041 kubelet[2672]: E0909 04:53:45.897033 2672 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"75969c1c5550ca8a6b1bdd85c07c05d002815550b1ddbc9aeacc35cda615737d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-776f74d6f9-j57rd" Sep 9 04:53:45.897091 kubelet[2672]: E0909 04:53:45.897066 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-776f74d6f9-j57rd_calico-apiserver(1ed773d9-9708-4a97-bf36-4f8170376c2b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-776f74d6f9-j57rd_calico-apiserver(1ed773d9-9708-4a97-bf36-4f8170376c2b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"75969c1c5550ca8a6b1bdd85c07c05d002815550b1ddbc9aeacc35cda615737d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-776f74d6f9-j57rd" podUID="1ed773d9-9708-4a97-bf36-4f8170376c2b" Sep 9 04:53:46.745239 systemd[1]: Created slice kubepods-besteffort-pod552937ce_0bd6_4992_a22b_ff41c9705435.slice - libcontainer container kubepods-besteffort-pod552937ce_0bd6_4992_a22b_ff41c9705435.slice. Sep 9 04:53:46.756972 containerd[1526]: time="2025-09-09T04:53:46.756832564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dgbgk,Uid:552937ce-0bd6-4992-a22b-ff41c9705435,Namespace:calico-system,Attempt:0,}" Sep 9 04:53:46.872116 containerd[1526]: time="2025-09-09T04:53:46.872060233Z" level=error msg="Failed to destroy network for sandbox \"19fdde57deeb91772a6df53b97d5dba2dc06ba841921c6cbb356739802e2a484\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:46.876041 systemd[1]: run-netns-cni\x2d02579f6f\x2dbf3c\x2d2047\x2d4976\x2d15059df511b8.mount: Deactivated successfully. 
Sep 9 04:53:46.939280 containerd[1526]: time="2025-09-09T04:53:46.939135108Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dgbgk,Uid:552937ce-0bd6-4992-a22b-ff41c9705435,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"19fdde57deeb91772a6df53b97d5dba2dc06ba841921c6cbb356739802e2a484\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:46.939455 kubelet[2672]: E0909 04:53:46.939404 2672 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19fdde57deeb91772a6df53b97d5dba2dc06ba841921c6cbb356739802e2a484\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:53:46.939726 kubelet[2672]: E0909 04:53:46.939463 2672 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19fdde57deeb91772a6df53b97d5dba2dc06ba841921c6cbb356739802e2a484\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dgbgk" Sep 9 04:53:46.939726 kubelet[2672]: E0909 04:53:46.939483 2672 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19fdde57deeb91772a6df53b97d5dba2dc06ba841921c6cbb356739802e2a484\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dgbgk" Sep 9 
04:53:46.939726 kubelet[2672]: E0909 04:53:46.939533 2672 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dgbgk_calico-system(552937ce-0bd6-4992-a22b-ff41c9705435)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dgbgk_calico-system(552937ce-0bd6-4992-a22b-ff41c9705435)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"19fdde57deeb91772a6df53b97d5dba2dc06ba841921c6cbb356739802e2a484\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dgbgk" podUID="552937ce-0bd6-4992-a22b-ff41c9705435" Sep 9 04:53:48.545756 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3656435593.mount: Deactivated successfully. Sep 9 04:53:48.805302 containerd[1526]: time="2025-09-09T04:53:48.804628395Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:48.806729 containerd[1526]: time="2025-09-09T04:53:48.805865101Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 9 04:53:48.807797 containerd[1526]: time="2025-09-09T04:53:48.807733399Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:48.809756 containerd[1526]: time="2025-09-09T04:53:48.809697176Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:48.810347 containerd[1526]: time="2025-09-09T04:53:48.810278409Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 2.963020783s" Sep 9 04:53:48.810347 containerd[1526]: time="2025-09-09T04:53:48.810314009Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 9 04:53:48.823118 containerd[1526]: time="2025-09-09T04:53:48.822838183Z" level=info msg="CreateContainer within sandbox \"b1d9f48e5766e65ce46d0f3dcdf8e3f14d4bdbd958c974c5fa377fbdb1c44ee4\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 04:53:48.830652 containerd[1526]: time="2025-09-09T04:53:48.830614333Z" level=info msg="Container 642a9776534dc1f27a518c655b090f100d5119b953238638111dfd14fd5fe45e: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:53:48.833966 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2583995631.mount: Deactivated successfully. 
Sep 9 04:53:48.850168 containerd[1526]: time="2025-09-09T04:53:48.850109506Z" level=info msg="CreateContainer within sandbox \"b1d9f48e5766e65ce46d0f3dcdf8e3f14d4bdbd958c974c5fa377fbdb1c44ee4\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"642a9776534dc1f27a518c655b090f100d5119b953238638111dfd14fd5fe45e\"" Sep 9 04:53:48.851180 containerd[1526]: time="2025-09-09T04:53:48.851152854Z" level=info msg="StartContainer for \"642a9776534dc1f27a518c655b090f100d5119b953238638111dfd14fd5fe45e\"" Sep 9 04:53:48.852866 containerd[1526]: time="2025-09-09T04:53:48.852791635Z" level=info msg="connecting to shim 642a9776534dc1f27a518c655b090f100d5119b953238638111dfd14fd5fe45e" address="unix:///run/containerd/s/47f0c8cc67e69113475f62c5cea7f1d197f19f8b232b660db2dc406230f67956" protocol=ttrpc version=3 Sep 9 04:53:48.874500 systemd[1]: Started cri-containerd-642a9776534dc1f27a518c655b090f100d5119b953238638111dfd14fd5fe45e.scope - libcontainer container 642a9776534dc1f27a518c655b090f100d5119b953238638111dfd14fd5fe45e. Sep 9 04:53:48.913368 containerd[1526]: time="2025-09-09T04:53:48.913303290Z" level=info msg="StartContainer for \"642a9776534dc1f27a518c655b090f100d5119b953238638111dfd14fd5fe45e\" returns successfully" Sep 9 04:53:49.050157 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 04:53:49.050253 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 04:53:49.270498 kubelet[2672]: I0909 04:53:49.270237 2672 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/07263a06-50c4-4b24-8fcd-c7abc03a155b-whisker-backend-key-pair\") pod \"07263a06-50c4-4b24-8fcd-c7abc03a155b\" (UID: \"07263a06-50c4-4b24-8fcd-c7abc03a155b\") " Sep 9 04:53:49.271117 kubelet[2672]: I0909 04:53:49.270507 2672 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lwqp\" (UniqueName: \"kubernetes.io/projected/07263a06-50c4-4b24-8fcd-c7abc03a155b-kube-api-access-5lwqp\") pod \"07263a06-50c4-4b24-8fcd-c7abc03a155b\" (UID: \"07263a06-50c4-4b24-8fcd-c7abc03a155b\") " Sep 9 04:53:49.271117 kubelet[2672]: I0909 04:53:49.270551 2672 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07263a06-50c4-4b24-8fcd-c7abc03a155b-whisker-ca-bundle\") pod \"07263a06-50c4-4b24-8fcd-c7abc03a155b\" (UID: \"07263a06-50c4-4b24-8fcd-c7abc03a155b\") " Sep 9 04:53:49.281064 kubelet[2672]: I0909 04:53:49.280355 2672 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07263a06-50c4-4b24-8fcd-c7abc03a155b-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "07263a06-50c4-4b24-8fcd-c7abc03a155b" (UID: "07263a06-50c4-4b24-8fcd-c7abc03a155b"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 04:53:49.281863 kubelet[2672]: I0909 04:53:49.281754 2672 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07263a06-50c4-4b24-8fcd-c7abc03a155b-kube-api-access-5lwqp" (OuterVolumeSpecName: "kube-api-access-5lwqp") pod "07263a06-50c4-4b24-8fcd-c7abc03a155b" (UID: "07263a06-50c4-4b24-8fcd-c7abc03a155b"). InnerVolumeSpecName "kube-api-access-5lwqp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 04:53:49.282065 kubelet[2672]: I0909 04:53:49.282019 2672 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07263a06-50c4-4b24-8fcd-c7abc03a155b-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "07263a06-50c4-4b24-8fcd-c7abc03a155b" (UID: "07263a06-50c4-4b24-8fcd-c7abc03a155b"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 04:53:49.371566 kubelet[2672]: I0909 04:53:49.371518 2672 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/07263a06-50c4-4b24-8fcd-c7abc03a155b-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 9 04:53:49.371566 kubelet[2672]: I0909 04:53:49.371553 2672 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5lwqp\" (UniqueName: \"kubernetes.io/projected/07263a06-50c4-4b24-8fcd-c7abc03a155b-kube-api-access-5lwqp\") on node \"localhost\" DevicePath \"\"" Sep 9 04:53:49.371566 kubelet[2672]: I0909 04:53:49.371563 2672 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07263a06-50c4-4b24-8fcd-c7abc03a155b-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 9 04:53:49.546794 systemd[1]: var-lib-kubelet-pods-07263a06\x2d50c4\x2d4b24\x2d8fcd\x2dc7abc03a155b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d5lwqp.mount: Deactivated successfully. Sep 9 04:53:49.546889 systemd[1]: var-lib-kubelet-pods-07263a06\x2d50c4\x2d4b24\x2d8fcd\x2dc7abc03a155b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 04:53:49.751836 systemd[1]: Removed slice kubepods-besteffort-pod07263a06_50c4_4b24_8fcd_c7abc03a155b.slice - libcontainer container kubepods-besteffort-pod07263a06_50c4_4b24_8fcd_c7abc03a155b.slice. 
Sep 9 04:53:49.879912 kubelet[2672]: I0909 04:53:49.879729 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-65gnj" podStartSLOduration=1.696799146 podStartE2EDuration="11.879711747s" podCreationTimestamp="2025-09-09 04:53:38 +0000 UTC" firstStartedPulling="2025-09-09 04:53:38.628139399 +0000 UTC m=+18.969910526" lastFinishedPulling="2025-09-09 04:53:48.811052 +0000 UTC m=+29.152823127" observedRunningTime="2025-09-09 04:53:49.878405162 +0000 UTC m=+30.220176289" watchObservedRunningTime="2025-09-09 04:53:49.879711747 +0000 UTC m=+30.221482834" Sep 9 04:53:49.951290 systemd[1]: Created slice kubepods-besteffort-pode1da2510_bf09_473b_9157_e69e24f9d4b6.slice - libcontainer container kubepods-besteffort-pode1da2510_bf09_473b_9157_e69e24f9d4b6.slice. Sep 9 04:53:49.974759 kubelet[2672]: I0909 04:53:49.974714 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1da2510-bf09-473b-9157-e69e24f9d4b6-whisker-ca-bundle\") pod \"whisker-5d67c6d988-4z4l5\" (UID: \"e1da2510-bf09-473b-9157-e69e24f9d4b6\") " pod="calico-system/whisker-5d67c6d988-4z4l5" Sep 9 04:53:49.974912 kubelet[2672]: I0909 04:53:49.974780 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e1da2510-bf09-473b-9157-e69e24f9d4b6-whisker-backend-key-pair\") pod \"whisker-5d67c6d988-4z4l5\" (UID: \"e1da2510-bf09-473b-9157-e69e24f9d4b6\") " pod="calico-system/whisker-5d67c6d988-4z4l5" Sep 9 04:53:49.974912 kubelet[2672]: I0909 04:53:49.974814 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q58m4\" (UniqueName: \"kubernetes.io/projected/e1da2510-bf09-473b-9157-e69e24f9d4b6-kube-api-access-q58m4\") pod \"whisker-5d67c6d988-4z4l5\" (UID: 
\"e1da2510-bf09-473b-9157-e69e24f9d4b6\") " pod="calico-system/whisker-5d67c6d988-4z4l5" Sep 9 04:53:50.256512 containerd[1526]: time="2025-09-09T04:53:50.256396152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5d67c6d988-4z4l5,Uid:e1da2510-bf09-473b-9157-e69e24f9d4b6,Namespace:calico-system,Attempt:0,}" Sep 9 04:53:50.431830 systemd-networkd[1434]: calie9ab5e02250: Link UP Sep 9 04:53:50.433594 systemd-networkd[1434]: calie9ab5e02250: Gained carrier Sep 9 04:53:50.449293 containerd[1526]: 2025-09-09 04:53:50.276 [INFO][3882] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:53:50.449293 containerd[1526]: 2025-09-09 04:53:50.310 [INFO][3882] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--5d67c6d988--4z4l5-eth0 whisker-5d67c6d988- calico-system e1da2510-bf09-473b-9157-e69e24f9d4b6 896 0 2025-09-09 04:53:49 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5d67c6d988 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-5d67c6d988-4z4l5 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calie9ab5e02250 [] [] }} ContainerID="56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" Namespace="calico-system" Pod="whisker-5d67c6d988-4z4l5" WorkloadEndpoint="localhost-k8s-whisker--5d67c6d988--4z4l5-" Sep 9 04:53:50.449293 containerd[1526]: 2025-09-09 04:53:50.310 [INFO][3882] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" Namespace="calico-system" Pod="whisker-5d67c6d988-4z4l5" WorkloadEndpoint="localhost-k8s-whisker--5d67c6d988--4z4l5-eth0" Sep 9 04:53:50.449293 containerd[1526]: 2025-09-09 04:53:50.372 [INFO][3894] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" HandleID="k8s-pod-network.56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" Workload="localhost-k8s-whisker--5d67c6d988--4z4l5-eth0" Sep 9 04:53:50.449903 containerd[1526]: 2025-09-09 04:53:50.372 [INFO][3894] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" HandleID="k8s-pod-network.56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" Workload="localhost-k8s-whisker--5d67c6d988--4z4l5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3330), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-5d67c6d988-4z4l5", "timestamp":"2025-09-09 04:53:50.372590898 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:53:50.449903 containerd[1526]: 2025-09-09 04:53:50.372 [INFO][3894] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:53:50.449903 containerd[1526]: 2025-09-09 04:53:50.372 [INFO][3894] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:53:50.449903 containerd[1526]: 2025-09-09 04:53:50.372 [INFO][3894] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:53:50.449903 containerd[1526]: 2025-09-09 04:53:50.384 [INFO][3894] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" host="localhost" Sep 9 04:53:50.449903 containerd[1526]: 2025-09-09 04:53:50.393 [INFO][3894] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:53:50.449903 containerd[1526]: 2025-09-09 04:53:50.399 [INFO][3894] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:53:50.449903 containerd[1526]: 2025-09-09 04:53:50.401 [INFO][3894] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:53:50.449903 containerd[1526]: 2025-09-09 04:53:50.403 [INFO][3894] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:53:50.449903 containerd[1526]: 2025-09-09 04:53:50.403 [INFO][3894] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" host="localhost" Sep 9 04:53:50.450123 containerd[1526]: 2025-09-09 04:53:50.405 [INFO][3894] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df Sep 9 04:53:50.450123 containerd[1526]: 2025-09-09 04:53:50.408 [INFO][3894] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" host="localhost" Sep 9 04:53:50.450123 containerd[1526]: 2025-09-09 04:53:50.417 [INFO][3894] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" host="localhost" Sep 9 04:53:50.450123 containerd[1526]: 2025-09-09 04:53:50.417 [INFO][3894] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" host="localhost" Sep 9 04:53:50.450123 containerd[1526]: 2025-09-09 04:53:50.417 [INFO][3894] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:53:50.450123 containerd[1526]: 2025-09-09 04:53:50.417 [INFO][3894] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" HandleID="k8s-pod-network.56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" Workload="localhost-k8s-whisker--5d67c6d988--4z4l5-eth0" Sep 9 04:53:50.450229 containerd[1526]: 2025-09-09 04:53:50.420 [INFO][3882] cni-plugin/k8s.go 418: Populated endpoint ContainerID="56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" Namespace="calico-system" Pod="whisker-5d67c6d988-4z4l5" WorkloadEndpoint="localhost-k8s-whisker--5d67c6d988--4z4l5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5d67c6d988--4z4l5-eth0", GenerateName:"whisker-5d67c6d988-", Namespace:"calico-system", SelfLink:"", UID:"e1da2510-bf09-473b-9157-e69e24f9d4b6", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5d67c6d988", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-5d67c6d988-4z4l5", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie9ab5e02250", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:53:50.450229 containerd[1526]: 2025-09-09 04:53:50.421 [INFO][3882] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" Namespace="calico-system" Pod="whisker-5d67c6d988-4z4l5" WorkloadEndpoint="localhost-k8s-whisker--5d67c6d988--4z4l5-eth0" Sep 9 04:53:50.450295 containerd[1526]: 2025-09-09 04:53:50.421 [INFO][3882] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie9ab5e02250 ContainerID="56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" Namespace="calico-system" Pod="whisker-5d67c6d988-4z4l5" WorkloadEndpoint="localhost-k8s-whisker--5d67c6d988--4z4l5-eth0" Sep 9 04:53:50.450295 containerd[1526]: 2025-09-09 04:53:50.433 [INFO][3882] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" Namespace="calico-system" Pod="whisker-5d67c6d988-4z4l5" WorkloadEndpoint="localhost-k8s-whisker--5d67c6d988--4z4l5-eth0" Sep 9 04:53:50.450354 containerd[1526]: 2025-09-09 04:53:50.435 [INFO][3882] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" Namespace="calico-system" Pod="whisker-5d67c6d988-4z4l5" 
WorkloadEndpoint="localhost-k8s-whisker--5d67c6d988--4z4l5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5d67c6d988--4z4l5-eth0", GenerateName:"whisker-5d67c6d988-", Namespace:"calico-system", SelfLink:"", UID:"e1da2510-bf09-473b-9157-e69e24f9d4b6", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5d67c6d988", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df", Pod:"whisker-5d67c6d988-4z4l5", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie9ab5e02250", MAC:"2e:87:67:1c:34:ec", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:53:50.450402 containerd[1526]: 2025-09-09 04:53:50.444 [INFO][3882] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" Namespace="calico-system" Pod="whisker-5d67c6d988-4z4l5" WorkloadEndpoint="localhost-k8s-whisker--5d67c6d988--4z4l5-eth0" Sep 9 04:53:50.540749 containerd[1526]: time="2025-09-09T04:53:50.540633325Z" level=info msg="connecting to shim 
56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df" address="unix:///run/containerd/s/b98848fdaa8cc8352c06e3d915584b3f30a33ed40dbb8f4ee7d161eaf87a319b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:53:50.579686 systemd[1]: Started cri-containerd-56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df.scope - libcontainer container 56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df. Sep 9 04:53:50.595713 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:53:50.666865 containerd[1526]: time="2025-09-09T04:53:50.666813803Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5d67c6d988-4z4l5,Uid:e1da2510-bf09-473b-9157-e69e24f9d4b6,Namespace:calico-system,Attempt:0,} returns sandbox id \"56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df\"" Sep 9 04:53:50.669469 containerd[1526]: time="2025-09-09T04:53:50.669442094Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 04:53:50.861365 kubelet[2672]: I0909 04:53:50.861223 2672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:53:51.523059 containerd[1526]: time="2025-09-09T04:53:51.522051251Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:51.523816 containerd[1526]: time="2025-09-09T04:53:51.523772793Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 9 04:53:51.523816 containerd[1526]: time="2025-09-09T04:53:51.523775673Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:51.526437 containerd[1526]: time="2025-09-09T04:53:51.526410725Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:51.527642 containerd[1526]: time="2025-09-09T04:53:51.527614113Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 858.136659ms" Sep 9 04:53:51.527742 containerd[1526]: time="2025-09-09T04:53:51.527727072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 9 04:53:51.531780 containerd[1526]: time="2025-09-09T04:53:51.531752830Z" level=info msg="CreateContainer within sandbox \"56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 04:53:51.537142 containerd[1526]: time="2025-09-09T04:53:51.537116094Z" level=info msg="Container 663a92a67163f03087428069726c5535d6224f23f360efe4020d021178484e3f: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:53:51.545400 containerd[1526]: time="2025-09-09T04:53:51.545319129Z" level=info msg="CreateContainer within sandbox \"56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"663a92a67163f03087428069726c5535d6224f23f360efe4020d021178484e3f\"" Sep 9 04:53:51.546223 containerd[1526]: time="2025-09-09T04:53:51.546196519Z" level=info msg="StartContainer for \"663a92a67163f03087428069726c5535d6224f23f360efe4020d021178484e3f\"" Sep 9 04:53:51.547193 containerd[1526]: time="2025-09-09T04:53:51.547168709Z" level=info msg="connecting to shim 
663a92a67163f03087428069726c5535d6224f23f360efe4020d021178484e3f" address="unix:///run/containerd/s/b98848fdaa8cc8352c06e3d915584b3f30a33ed40dbb8f4ee7d161eaf87a319b" protocol=ttrpc version=3 Sep 9 04:53:51.564486 systemd-networkd[1434]: calie9ab5e02250: Gained IPv6LL Sep 9 04:53:51.568491 systemd[1]: Started cri-containerd-663a92a67163f03087428069726c5535d6224f23f360efe4020d021178484e3f.scope - libcontainer container 663a92a67163f03087428069726c5535d6224f23f360efe4020d021178484e3f. Sep 9 04:53:51.622783 containerd[1526]: time="2025-09-09T04:53:51.622684963Z" level=info msg="StartContainer for \"663a92a67163f03087428069726c5535d6224f23f360efe4020d021178484e3f\" returns successfully" Sep 9 04:53:51.625743 containerd[1526]: time="2025-09-09T04:53:51.625703852Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 04:53:51.742011 kubelet[2672]: I0909 04:53:51.741975 2672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07263a06-50c4-4b24-8fcd-c7abc03a155b" path="/var/lib/kubelet/pods/07263a06-50c4-4b24-8fcd-c7abc03a155b/volumes" Sep 9 04:53:52.999553 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount873691167.mount: Deactivated successfully. 
Sep 9 04:53:53.021229 containerd[1526]: time="2025-09-09T04:53:53.020808501Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:53.023294 containerd[1526]: time="2025-09-09T04:53:53.023253437Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 9 04:53:53.024379 containerd[1526]: time="2025-09-09T04:53:53.024350067Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:53.026579 containerd[1526]: time="2025-09-09T04:53:53.026548485Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:53:53.027375 containerd[1526]: time="2025-09-09T04:53:53.027341398Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.401599547s" Sep 9 04:53:53.027375 containerd[1526]: time="2025-09-09T04:53:53.027376637Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 9 04:53:53.031295 containerd[1526]: time="2025-09-09T04:53:53.031144241Z" level=info msg="CreateContainer within sandbox \"56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 04:53:53.038351 
containerd[1526]: time="2025-09-09T04:53:53.037837456Z" level=info msg="Container e425da6cb274490adfa0dd35781381855cc2a9830fefe59247e4f15693c1e129: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:53:53.048819 containerd[1526]: time="2025-09-09T04:53:53.048777269Z" level=info msg="CreateContainer within sandbox \"56f126f5f5b2ffb9bb48007ca76694b4a77c595eb638f5d6e6c5aba4fbb161df\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"e425da6cb274490adfa0dd35781381855cc2a9830fefe59247e4f15693c1e129\"" Sep 9 04:53:53.049486 containerd[1526]: time="2025-09-09T04:53:53.049450383Z" level=info msg="StartContainer for \"e425da6cb274490adfa0dd35781381855cc2a9830fefe59247e4f15693c1e129\"" Sep 9 04:53:53.050580 containerd[1526]: time="2025-09-09T04:53:53.050484773Z" level=info msg="connecting to shim e425da6cb274490adfa0dd35781381855cc2a9830fefe59247e4f15693c1e129" address="unix:///run/containerd/s/b98848fdaa8cc8352c06e3d915584b3f30a33ed40dbb8f4ee7d161eaf87a319b" protocol=ttrpc version=3 Sep 9 04:53:53.074534 systemd[1]: Started cri-containerd-e425da6cb274490adfa0dd35781381855cc2a9830fefe59247e4f15693c1e129.scope - libcontainer container e425da6cb274490adfa0dd35781381855cc2a9830fefe59247e4f15693c1e129. 
Sep 9 04:53:53.108647 containerd[1526]: time="2025-09-09T04:53:53.108609048Z" level=info msg="StartContainer for \"e425da6cb274490adfa0dd35781381855cc2a9830fefe59247e4f15693c1e129\" returns successfully" Sep 9 04:53:54.749832 kubelet[2672]: I0909 04:53:54.749648 2672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:53:54.767107 kubelet[2672]: I0909 04:53:54.767046 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5d67c6d988-4z4l5" podStartSLOduration=3.406560903 podStartE2EDuration="5.76702654s" podCreationTimestamp="2025-09-09 04:53:49 +0000 UTC" firstStartedPulling="2025-09-09 04:53:50.66799651 +0000 UTC m=+31.009767637" lastFinishedPulling="2025-09-09 04:53:53.028462187 +0000 UTC m=+33.370233274" observedRunningTime="2025-09-09 04:53:53.883157044 +0000 UTC m=+34.224928211" watchObservedRunningTime="2025-09-09 04:53:54.76702654 +0000 UTC m=+35.108797667" Sep 9 04:53:56.004082 systemd-networkd[1434]: vxlan.calico: Link UP Sep 9 04:53:56.004092 systemd-networkd[1434]: vxlan.calico: Gained carrier Sep 9 04:53:56.739805 containerd[1526]: time="2025-09-09T04:53:56.739765277Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tzvmr,Uid:e63f9b21-1c03-49de-a212-ae6c3461a913,Namespace:kube-system,Attempt:0,}" Sep 9 04:53:56.878608 systemd-networkd[1434]: cali9d22d161a24: Link UP Sep 9 04:53:56.878880 systemd-networkd[1434]: cali9d22d161a24: Gained carrier Sep 9 04:53:56.891032 containerd[1526]: 2025-09-09 04:53:56.810 [INFO][4355] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0 coredns-668d6bf9bc- kube-system e63f9b21-1c03-49de-a212-ae6c3461a913 833 0 2025-09-09 04:53:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} 
{k8s localhost coredns-668d6bf9bc-tzvmr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9d22d161a24 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" Namespace="kube-system" Pod="coredns-668d6bf9bc-tzvmr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tzvmr-" Sep 9 04:53:56.891032 containerd[1526]: 2025-09-09 04:53:56.810 [INFO][4355] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" Namespace="kube-system" Pod="coredns-668d6bf9bc-tzvmr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0" Sep 9 04:53:56.891032 containerd[1526]: 2025-09-09 04:53:56.839 [INFO][4372] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" HandleID="k8s-pod-network.edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" Workload="localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0" Sep 9 04:53:56.891256 containerd[1526]: 2025-09-09 04:53:56.839 [INFO][4372] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" HandleID="k8s-pod-network.edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" Workload="localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dd5f0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-tzvmr", "timestamp":"2025-09-09 04:53:56.839509916 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:53:56.891256 containerd[1526]: 2025-09-09 04:53:56.839 [INFO][4372] ipam/ipam_plugin.go 353: 
About to acquire host-wide IPAM lock. Sep 9 04:53:56.891256 containerd[1526]: 2025-09-09 04:53:56.839 [INFO][4372] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:53:56.891256 containerd[1526]: 2025-09-09 04:53:56.839 [INFO][4372] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:53:56.891256 containerd[1526]: 2025-09-09 04:53:56.849 [INFO][4372] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" host="localhost" Sep 9 04:53:56.891256 containerd[1526]: 2025-09-09 04:53:56.853 [INFO][4372] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:53:56.891256 containerd[1526]: 2025-09-09 04:53:56.857 [INFO][4372] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:53:56.891256 containerd[1526]: 2025-09-09 04:53:56.858 [INFO][4372] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:53:56.891256 containerd[1526]: 2025-09-09 04:53:56.860 [INFO][4372] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:53:56.891256 containerd[1526]: 2025-09-09 04:53:56.860 [INFO][4372] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" host="localhost" Sep 9 04:53:56.891480 containerd[1526]: 2025-09-09 04:53:56.862 [INFO][4372] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d Sep 9 04:53:56.891480 containerd[1526]: 2025-09-09 04:53:56.867 [INFO][4372] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" host="localhost" Sep 9 04:53:56.891480 containerd[1526]: 2025-09-09 
04:53:56.873 [INFO][4372] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" host="localhost" Sep 9 04:53:56.891480 containerd[1526]: 2025-09-09 04:53:56.873 [INFO][4372] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" host="localhost" Sep 9 04:53:56.891480 containerd[1526]: 2025-09-09 04:53:56.873 [INFO][4372] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:53:56.891480 containerd[1526]: 2025-09-09 04:53:56.873 [INFO][4372] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" HandleID="k8s-pod-network.edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" Workload="localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0" Sep 9 04:53:56.891595 containerd[1526]: 2025-09-09 04:53:56.875 [INFO][4355] cni-plugin/k8s.go 418: Populated endpoint ContainerID="edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" Namespace="kube-system" Pod="coredns-668d6bf9bc-tzvmr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e63f9b21-1c03-49de-a212-ae6c3461a913", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-tzvmr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9d22d161a24", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:53:56.891667 containerd[1526]: 2025-09-09 04:53:56.875 [INFO][4355] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" Namespace="kube-system" Pod="coredns-668d6bf9bc-tzvmr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0" Sep 9 04:53:56.891667 containerd[1526]: 2025-09-09 04:53:56.875 [INFO][4355] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9d22d161a24 ContainerID="edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" Namespace="kube-system" Pod="coredns-668d6bf9bc-tzvmr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0" Sep 9 04:53:56.891667 containerd[1526]: 2025-09-09 04:53:56.879 [INFO][4355] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" Namespace="kube-system" Pod="coredns-668d6bf9bc-tzvmr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0" Sep 9 04:53:56.891727 containerd[1526]: 2025-09-09 04:53:56.879 [INFO][4355] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" Namespace="kube-system" Pod="coredns-668d6bf9bc-tzvmr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e63f9b21-1c03-49de-a212-ae6c3461a913", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d", Pod:"coredns-668d6bf9bc-tzvmr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9d22d161a24", MAC:"1e:bc:8c:5b:44:14", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:53:56.891727 containerd[1526]: 2025-09-09 04:53:56.888 [INFO][4355] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" Namespace="kube-system" Pod="coredns-668d6bf9bc-tzvmr" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--tzvmr-eth0" Sep 9 04:53:56.960496 containerd[1526]: time="2025-09-09T04:53:56.960453649Z" level=info msg="connecting to shim edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d" address="unix:///run/containerd/s/e11627ac407234eaff99e035f8d9019e814417ff3fc65ab797ec958756aeb2f0" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:53:56.990523 systemd[1]: Started cri-containerd-edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d.scope - libcontainer container edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d. 
Sep 9 04:53:57.000634 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:53:57.024753 containerd[1526]: time="2025-09-09T04:53:57.024720968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tzvmr,Uid:e63f9b21-1c03-49de-a212-ae6c3461a913,Namespace:kube-system,Attempt:0,} returns sandbox id \"edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d\"" Sep 9 04:53:57.039522 containerd[1526]: time="2025-09-09T04:53:57.039464162Z" level=info msg="CreateContainer within sandbox \"edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 04:53:57.051087 containerd[1526]: time="2025-09-09T04:53:57.051029542Z" level=info msg="Container 64b3872475c098708f424a7c5f2f894e06d8d880e2fa0f0d18a94cca0c5aec43: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:53:57.054856 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3501137181.mount: Deactivated successfully. 
Sep 9 04:53:57.057144 containerd[1526]: time="2025-09-09T04:53:57.057088611Z" level=info msg="CreateContainer within sandbox \"edbdea2e2d9e40f489a793e5d178c400b15c4aa3a58ba4be2a69b658526e2e7d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"64b3872475c098708f424a7c5f2f894e06d8d880e2fa0f0d18a94cca0c5aec43\"" Sep 9 04:53:57.057675 containerd[1526]: time="2025-09-09T04:53:57.057655406Z" level=info msg="StartContainer for \"64b3872475c098708f424a7c5f2f894e06d8d880e2fa0f0d18a94cca0c5aec43\"" Sep 9 04:53:57.058692 containerd[1526]: time="2025-09-09T04:53:57.058663757Z" level=info msg="connecting to shim 64b3872475c098708f424a7c5f2f894e06d8d880e2fa0f0d18a94cca0c5aec43" address="unix:///run/containerd/s/e11627ac407234eaff99e035f8d9019e814417ff3fc65ab797ec958756aeb2f0" protocol=ttrpc version=3 Sep 9 04:53:57.068481 systemd-networkd[1434]: vxlan.calico: Gained IPv6LL Sep 9 04:53:57.082122 systemd[1]: Started cri-containerd-64b3872475c098708f424a7c5f2f894e06d8d880e2fa0f0d18a94cca0c5aec43.scope - libcontainer container 64b3872475c098708f424a7c5f2f894e06d8d880e2fa0f0d18a94cca0c5aec43. 
Sep 9 04:53:57.107923 containerd[1526]: time="2025-09-09T04:53:57.107837696Z" level=info msg="StartContainer for \"64b3872475c098708f424a7c5f2f894e06d8d880e2fa0f0d18a94cca0c5aec43\" returns successfully" Sep 9 04:53:57.892671 kubelet[2672]: I0909 04:53:57.892612 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-tzvmr" podStartSLOduration=32.892594734 podStartE2EDuration="32.892594734s" podCreationTimestamp="2025-09-09 04:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:53:57.892255297 +0000 UTC m=+38.234026384" watchObservedRunningTime="2025-09-09 04:53:57.892594734 +0000 UTC m=+38.234365861" Sep 9 04:53:58.740012 containerd[1526]: time="2025-09-09T04:53:58.739963416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mwg62,Uid:751e08fd-9b29-4e93-b1fe-d6b3e4e781cb,Namespace:kube-system,Attempt:0,}" Sep 9 04:53:58.848148 systemd-networkd[1434]: calie951c6dace7: Link UP Sep 9 04:53:58.848354 systemd-networkd[1434]: calie951c6dace7: Gained carrier Sep 9 04:53:58.860863 systemd-networkd[1434]: cali9d22d161a24: Gained IPv6LL Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.779 [INFO][4475] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--mwg62-eth0 coredns-668d6bf9bc- kube-system 751e08fd-9b29-4e93-b1fe-d6b3e4e781cb 835 0 2025-09-09 04:53:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-mwg62 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie951c6dace7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwg62" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--mwg62-" Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.779 [INFO][4475] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwg62" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--mwg62-eth0" Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.806 [INFO][4491] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" HandleID="k8s-pod-network.7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" Workload="localhost-k8s-coredns--668d6bf9bc--mwg62-eth0" Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.807 [INFO][4491] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" HandleID="k8s-pod-network.7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" Workload="localhost-k8s-coredns--668d6bf9bc--mwg62-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400034afe0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-mwg62", "timestamp":"2025-09-09 04:53:58.806943618 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.807 [INFO][4491] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.807 [INFO][4491] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.807 [INFO][4491] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.816 [INFO][4491] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" host="localhost" Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.822 [INFO][4491] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.826 [INFO][4491] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.828 [INFO][4491] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.830 [INFO][4491] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.830 [INFO][4491] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" host="localhost" Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.832 [INFO][4491] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1 Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.836 [INFO][4491] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" host="localhost" Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.842 [INFO][4491] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" host="localhost" Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.842 [INFO][4491] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" host="localhost" Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.842 [INFO][4491] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:53:58.861479 containerd[1526]: 2025-09-09 04:53:58.842 [INFO][4491] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" HandleID="k8s-pod-network.7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" Workload="localhost-k8s-coredns--668d6bf9bc--mwg62-eth0" Sep 9 04:53:58.861933 containerd[1526]: 2025-09-09 04:53:58.844 [INFO][4475] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwg62" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--mwg62-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--mwg62-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"751e08fd-9b29-4e93-b1fe-d6b3e4e781cb", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-mwg62", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie951c6dace7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:53:58.861933 containerd[1526]: 2025-09-09 04:53:58.845 [INFO][4475] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwg62" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--mwg62-eth0" Sep 9 04:53:58.861933 containerd[1526]: 2025-09-09 04:53:58.845 [INFO][4475] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie951c6dace7 ContainerID="7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwg62" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--mwg62-eth0" Sep 9 04:53:58.861933 containerd[1526]: 2025-09-09 04:53:58.848 [INFO][4475] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwg62" 
WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--mwg62-eth0" Sep 9 04:53:58.861933 containerd[1526]: 2025-09-09 04:53:58.848 [INFO][4475] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwg62" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--mwg62-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--mwg62-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"751e08fd-9b29-4e93-b1fe-d6b3e4e781cb", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1", Pod:"coredns-668d6bf9bc-mwg62", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie951c6dace7", MAC:"f6:61:ea:18:08:c5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:53:58.861933 containerd[1526]: 2025-09-09 04:53:58.858 [INFO][4475] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" Namespace="kube-system" Pod="coredns-668d6bf9bc-mwg62" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--mwg62-eth0" Sep 9 04:53:58.883541 containerd[1526]: time="2025-09-09T04:53:58.883495981Z" level=info msg="connecting to shim 7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1" address="unix:///run/containerd/s/7ca26a62f79b47529e00ba8cd602a1e7251159d6134d5ec9533344afdc15540b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:53:58.912532 systemd[1]: Started cri-containerd-7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1.scope - libcontainer container 7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1. 
Sep 9 04:53:58.923525 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:53:58.951233 containerd[1526]: time="2025-09-09T04:53:58.951191098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-mwg62,Uid:751e08fd-9b29-4e93-b1fe-d6b3e4e781cb,Namespace:kube-system,Attempt:0,} returns sandbox id \"7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1\"" Sep 9 04:53:58.958160 containerd[1526]: time="2025-09-09T04:53:58.958038361Z" level=info msg="CreateContainer within sandbox \"7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 04:53:58.969755 containerd[1526]: time="2025-09-09T04:53:58.969713064Z" level=info msg="Container 38a0d4d0ab88fc2a140868674dac18e9985fbab3efcca3387f83e35a9b3b8dd6: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:53:58.974749 containerd[1526]: time="2025-09-09T04:53:58.974710142Z" level=info msg="CreateContainer within sandbox \"7f34e5d430a81b1b496c14d374604e3e118ee53e2849c46ef1a085dd2a0e86d1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"38a0d4d0ab88fc2a140868674dac18e9985fbab3efcca3387f83e35a9b3b8dd6\"" Sep 9 04:53:58.975424 containerd[1526]: time="2025-09-09T04:53:58.975390297Z" level=info msg="StartContainer for \"38a0d4d0ab88fc2a140868674dac18e9985fbab3efcca3387f83e35a9b3b8dd6\"" Sep 9 04:53:58.976225 containerd[1526]: time="2025-09-09T04:53:58.976198930Z" level=info msg="connecting to shim 38a0d4d0ab88fc2a140868674dac18e9985fbab3efcca3387f83e35a9b3b8dd6" address="unix:///run/containerd/s/7ca26a62f79b47529e00ba8cd602a1e7251159d6134d5ec9533344afdc15540b" protocol=ttrpc version=3 Sep 9 04:53:59.001575 systemd[1]: Started cri-containerd-38a0d4d0ab88fc2a140868674dac18e9985fbab3efcca3387f83e35a9b3b8dd6.scope - libcontainer container 38a0d4d0ab88fc2a140868674dac18e9985fbab3efcca3387f83e35a9b3b8dd6. 
Sep 9 04:53:59.031788 containerd[1526]: time="2025-09-09T04:53:59.031681555Z" level=info msg="StartContainer for \"38a0d4d0ab88fc2a140868674dac18e9985fbab3efcca3387f83e35a9b3b8dd6\" returns successfully" Sep 9 04:53:59.740053 containerd[1526]: time="2025-09-09T04:53:59.740004582Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cb85c6f7d-zfbw4,Uid:f3dfd24e-f850-4a8c-99c9-946b72b2a033,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:53:59.741188 containerd[1526]: time="2025-09-09T04:53:59.740961015Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697776f99-qp2dg,Uid:8cff6f75-e1a9-4839-a313-d99705c12062,Namespace:calico-system,Attempt:0,}" Sep 9 04:53:59.741188 containerd[1526]: time="2025-09-09T04:53:59.741106573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cb85c6f7d-pxpgr,Uid:51869746-303b-4256-91b8-4606d0ce74fe,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:53:59.940490 systemd-networkd[1434]: cali30f8fc76a9d: Link UP Sep 9 04:53:59.942622 systemd-networkd[1434]: cali30f8fc76a9d: Gained carrier Sep 9 04:53:59.956401 kubelet[2672]: I0909 04:53:59.956297 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-mwg62" podStartSLOduration=34.956278552 podStartE2EDuration="34.956278552s" podCreationTimestamp="2025-09-09 04:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:53:59.905058887 +0000 UTC m=+40.246830014" watchObservedRunningTime="2025-09-09 04:53:59.956278552 +0000 UTC m=+40.298049679" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.791 [INFO][4590] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0 calico-apiserver-cb85c6f7d- calico-apiserver f3dfd24e-f850-4a8c-99c9-946b72b2a033 834 0 
2025-09-09 04:53:34 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:cb85c6f7d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-cb85c6f7d-zfbw4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali30f8fc76a9d [] [] }} ContainerID="eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-zfbw4" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.792 [INFO][4590] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-zfbw4" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.872 [INFO][4638] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" HandleID="k8s-pod-network.eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.872 [INFO][4638] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" HandleID="k8s-pod-network.eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000116e80), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-cb85c6f7d-zfbw4", "timestamp":"2025-09-09 
04:53:59.87252899 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.872 [INFO][4638] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.872 [INFO][4638] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.872 [INFO][4638] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.886 [INFO][4638] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" host="localhost" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.896 [INFO][4638] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.905 [INFO][4638] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.911 [INFO][4638] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.915 [INFO][4638] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.915 [INFO][4638] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" host="localhost" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.919 [INFO][4638] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.924 [INFO][4638] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" host="localhost" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.930 [INFO][4638] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" host="localhost" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.930 [INFO][4638] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" host="localhost" Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.930 [INFO][4638] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 04:53:59.965664 containerd[1526]: 2025-09-09 04:53:59.931 [INFO][4638] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" HandleID="k8s-pod-network.eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0" Sep 9 04:53:59.966163 containerd[1526]: 2025-09-09 04:53:59.934 [INFO][4590] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-zfbw4" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0", GenerateName:"calico-apiserver-cb85c6f7d-", Namespace:"calico-apiserver", SelfLink:"", UID:"f3dfd24e-f850-4a8c-99c9-946b72b2a033", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cb85c6f7d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-cb85c6f7d-zfbw4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali30f8fc76a9d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:53:59.966163 containerd[1526]: 2025-09-09 04:53:59.935 [INFO][4590] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-zfbw4" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0" Sep 9 04:53:59.966163 containerd[1526]: 2025-09-09 04:53:59.935 [INFO][4590] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali30f8fc76a9d ContainerID="eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-zfbw4" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0" Sep 9 04:53:59.966163 containerd[1526]: 2025-09-09 04:53:59.942 [INFO][4590] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-zfbw4" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0" Sep 9 04:53:59.966163 containerd[1526]: 2025-09-09 04:53:59.943 [INFO][4590] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-zfbw4" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0", GenerateName:"calico-apiserver-cb85c6f7d-", 
Namespace:"calico-apiserver", SelfLink:"", UID:"f3dfd24e-f850-4a8c-99c9-946b72b2a033", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cb85c6f7d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb", Pod:"calico-apiserver-cb85c6f7d-zfbw4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali30f8fc76a9d", MAC:"66:37:bf:8c:c9:99", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:53:59.966163 containerd[1526]: 2025-09-09 04:53:59.957 [INFO][4590] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-zfbw4" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--zfbw4-eth0" Sep 9 04:53:59.995418 containerd[1526]: time="2025-09-09T04:53:59.994459323Z" level=info msg="connecting to shim eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb" address="unix:///run/containerd/s/10833319c985fe719fc91df0f4eea5bcc1bb4352bb48e459ce7c381fbe184203" namespace=k8s.io protocol=ttrpc 
version=3 Sep 9 04:54:00.022487 systemd[1]: Started cri-containerd-eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb.scope - libcontainer container eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb. Sep 9 04:54:00.034198 systemd-networkd[1434]: calia0bf8bc03ba: Link UP Sep 9 04:54:00.034743 systemd-networkd[1434]: calia0bf8bc03ba: Gained carrier Sep 9 04:54:00.047039 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:53:59.851 [INFO][4613] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0 calico-apiserver-cb85c6f7d- calico-apiserver 51869746-303b-4256-91b8-4606d0ce74fe 829 0 2025-09-09 04:53:34 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:cb85c6f7d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-cb85c6f7d-pxpgr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia0bf8bc03ba [] [] }} ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-pxpgr" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-" Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:53:59.851 [INFO][4613] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-pxpgr" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:53:59.902 [INFO][4655] ipam/ipam_plugin.go 225: Calico CNI IPAM request 
count IPv4=1 IPv6=0 ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" HandleID="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:53:59.902 [INFO][4655] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" HandleID="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004921f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-cb85c6f7d-pxpgr", "timestamp":"2025-09-09 04:53:59.902554827 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:53:59.902 [INFO][4655] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:53:59.930 [INFO][4655] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:53:59.930 [INFO][4655] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:53:59.987 [INFO][4655] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" host="localhost" Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:53:59.997 [INFO][4655] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:54:00.004 [INFO][4655] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:54:00.006 [INFO][4655] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:54:00.009 [INFO][4655] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:54:00.009 [INFO][4655] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" host="localhost" Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:54:00.011 [INFO][4655] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:54:00.015 [INFO][4655] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" host="localhost" Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:54:00.025 [INFO][4655] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" host="localhost" Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:54:00.025 [INFO][4655] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" host="localhost" Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:54:00.025 [INFO][4655] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:54:00.053319 containerd[1526]: 2025-09-09 04:54:00.025 [INFO][4655] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" HandleID="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:00.054174 containerd[1526]: 2025-09-09 04:54:00.030 [INFO][4613] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-pxpgr" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0", GenerateName:"calico-apiserver-cb85c6f7d-", Namespace:"calico-apiserver", SelfLink:"", UID:"51869746-303b-4256-91b8-4606d0ce74fe", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cb85c6f7d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-cb85c6f7d-pxpgr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia0bf8bc03ba", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:54:00.054174 containerd[1526]: 2025-09-09 04:54:00.030 [INFO][4613] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-pxpgr" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:00.054174 containerd[1526]: 2025-09-09 04:54:00.030 [INFO][4613] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia0bf8bc03ba ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-pxpgr" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:00.054174 containerd[1526]: 2025-09-09 04:54:00.036 [INFO][4613] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-pxpgr" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:00.054174 containerd[1526]: 2025-09-09 04:54:00.037 [INFO][4613] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-pxpgr" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0", GenerateName:"calico-apiserver-cb85c6f7d-", Namespace:"calico-apiserver", SelfLink:"", UID:"51869746-303b-4256-91b8-4606d0ce74fe", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cb85c6f7d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef", Pod:"calico-apiserver-cb85c6f7d-pxpgr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia0bf8bc03ba", MAC:"1e:7d:cd:30:5f:bf", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:54:00.054174 containerd[1526]: 2025-09-09 04:54:00.051 [INFO][4613] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Namespace="calico-apiserver" Pod="calico-apiserver-cb85c6f7d-pxpgr" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:00.082572 containerd[1526]: time="2025-09-09T04:54:00.082526508Z" level=info msg="connecting to shim 1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" address="unix:///run/containerd/s/2274967065a7af493df11156bc3671ce193983394eb013daeee02317c8db82a7" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:54:00.089557 containerd[1526]: time="2025-09-09T04:54:00.086883513Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cb85c6f7d-zfbw4,Uid:f3dfd24e-f850-4a8c-99c9-946b72b2a033,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb\"" Sep 9 04:54:00.094563 containerd[1526]: time="2025-09-09T04:54:00.093208303Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 04:54:00.111490 systemd[1]: Started cri-containerd-1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef.scope - libcontainer container 1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef. 
Sep 9 04:54:00.134603 systemd-networkd[1434]: calie46a12bcccd: Link UP Sep 9 04:54:00.134736 systemd-networkd[1434]: calie46a12bcccd: Gained carrier Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:53:59.850 [INFO][4601] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0 calico-kube-controllers-697776f99- calico-system 8cff6f75-e1a9-4839-a313-d99705c12062 826 0 2025-09-09 04:53:38 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:697776f99 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-697776f99-qp2dg eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calie46a12bcccd [] [] }} ContainerID="ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" Namespace="calico-system" Pod="calico-kube-controllers-697776f99-qp2dg" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697776f99--qp2dg-" Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:53:59.851 [INFO][4601] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" Namespace="calico-system" Pod="calico-kube-controllers-697776f99-qp2dg" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0" Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:53:59.905 [INFO][4648] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" HandleID="k8s-pod-network.ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" Workload="localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0" Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 
04:53:59.906 [INFO][4648] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" HandleID="k8s-pod-network.ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" Workload="localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004db10), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-697776f99-qp2dg", "timestamp":"2025-09-09 04:53:59.905963599 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:53:59.906 [INFO][4648] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.025 [INFO][4648] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.025 [INFO][4648] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.090 [INFO][4648] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" host="localhost" Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.100 [INFO][4648] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.106 [INFO][4648] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.109 [INFO][4648] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.111 [INFO][4648] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.111 [INFO][4648] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" host="localhost" Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.112 [INFO][4648] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.119 [INFO][4648] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" host="localhost" Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.128 [INFO][4648] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" host="localhost" Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.128 [INFO][4648] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" host="localhost" Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.128 [INFO][4648] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:54:00.154653 containerd[1526]: 2025-09-09 04:54:00.128 [INFO][4648] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" HandleID="k8s-pod-network.ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" Workload="localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0" Sep 9 04:54:00.155143 containerd[1526]: 2025-09-09 04:54:00.131 [INFO][4601] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" Namespace="calico-system" Pod="calico-kube-controllers-697776f99-qp2dg" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0", GenerateName:"calico-kube-controllers-697776f99-", Namespace:"calico-system", SelfLink:"", UID:"8cff6f75-e1a9-4839-a313-d99705c12062", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"697776f99", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-697776f99-qp2dg", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie46a12bcccd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:54:00.155143 containerd[1526]: 2025-09-09 04:54:00.132 [INFO][4601] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" Namespace="calico-system" Pod="calico-kube-controllers-697776f99-qp2dg" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0" Sep 9 04:54:00.155143 containerd[1526]: 2025-09-09 04:54:00.132 [INFO][4601] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie46a12bcccd ContainerID="ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" Namespace="calico-system" Pod="calico-kube-controllers-697776f99-qp2dg" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0" Sep 9 04:54:00.155143 containerd[1526]: 2025-09-09 04:54:00.135 [INFO][4601] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" Namespace="calico-system" Pod="calico-kube-controllers-697776f99-qp2dg" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0" Sep 9 04:54:00.155143 containerd[1526]: 2025-09-09 
04:54:00.136 [INFO][4601] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" Namespace="calico-system" Pod="calico-kube-controllers-697776f99-qp2dg" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0", GenerateName:"calico-kube-controllers-697776f99-", Namespace:"calico-system", SelfLink:"", UID:"8cff6f75-e1a9-4839-a313-d99705c12062", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"697776f99", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb", Pod:"calico-kube-controllers-697776f99-qp2dg", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie46a12bcccd", MAC:"36:b5:c7:80:a1:a1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:54:00.155143 containerd[1526]: 2025-09-09 
04:54:00.148 [INFO][4601] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" Namespace="calico-system" Pod="calico-kube-controllers-697776f99-qp2dg" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--697776f99--qp2dg-eth0" Sep 9 04:54:00.173305 systemd[1]: Started sshd@7-10.0.0.40:22-10.0.0.1:59450.service - OpenSSH per-connection server daemon (10.0.0.1:59450). Sep 9 04:54:00.174403 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:54:00.184840 containerd[1526]: time="2025-09-09T04:54:00.184797062Z" level=info msg="connecting to shim ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb" address="unix:///run/containerd/s/b2faf986faec220ffee988798fb1e26e42a4056dbc42e2a573b15d5153d75aa7" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:54:00.212544 containerd[1526]: time="2025-09-09T04:54:00.212504004Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cb85c6f7d-pxpgr,Uid:51869746-303b-4256-91b8-4606d0ce74fe,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\"" Sep 9 04:54:00.214486 systemd[1]: Started cri-containerd-ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb.scope - libcontainer container ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb. 
Sep 9 04:54:00.227451 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:54:00.250554 sshd[4781]: Accepted publickey for core from 10.0.0.1 port 59450 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:00.252472 sshd-session[4781]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:00.253376 containerd[1526]: time="2025-09-09T04:54:00.253277042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-697776f99-qp2dg,Uid:8cff6f75-e1a9-4839-a313-d99705c12062,Namespace:calico-system,Attempt:0,} returns sandbox id \"ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb\"" Sep 9 04:54:00.258444 systemd-logind[1508]: New session 8 of user core. Sep 9 04:54:00.266511 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 9 04:54:00.455825 sshd[4837]: Connection closed by 10.0.0.1 port 59450 Sep 9 04:54:00.455705 sshd-session[4781]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:00.459032 systemd[1]: sshd@7-10.0.0.40:22-10.0.0.1:59450.service: Deactivated successfully. Sep 9 04:54:00.461006 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 04:54:00.461809 systemd-logind[1508]: Session 8 logged out. Waiting for processes to exit. Sep 9 04:54:00.463126 systemd-logind[1508]: Removed session 8. 
Sep 9 04:54:00.524606 systemd-networkd[1434]: calie951c6dace7: Gained IPv6LL Sep 9 04:54:00.740758 containerd[1526]: time="2025-09-09T04:54:00.740696682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-gmx58,Uid:cf82b986-3cd0-45d0-9764-a0d291ff5cfd,Namespace:calico-system,Attempt:0,}" Sep 9 04:54:00.856391 systemd-networkd[1434]: cali36872d8d9e4: Link UP Sep 9 04:54:00.857613 systemd-networkd[1434]: cali36872d8d9e4: Gained carrier Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.773 [INFO][4858] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--gmx58-eth0 goldmane-54d579b49d- calico-system cf82b986-3cd0-45d0-9764-a0d291ff5cfd 837 0 2025-09-09 04:53:37 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-gmx58 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali36872d8d9e4 [] [] }} ContainerID="1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" Namespace="calico-system" Pod="goldmane-54d579b49d-gmx58" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--gmx58-" Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.773 [INFO][4858] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" Namespace="calico-system" Pod="goldmane-54d579b49d-gmx58" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--gmx58-eth0" Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.796 [INFO][4873] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" 
HandleID="k8s-pod-network.1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" Workload="localhost-k8s-goldmane--54d579b49d--gmx58-eth0" Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.796 [INFO][4873] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" HandleID="k8s-pod-network.1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" Workload="localhost-k8s-goldmane--54d579b49d--gmx58-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c2fe0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-gmx58", "timestamp":"2025-09-09 04:54:00.796479362 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.796 [INFO][4873] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.796 [INFO][4873] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.796 [INFO][4873] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.805 [INFO][4873] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" host="localhost" Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.810 [INFO][4873] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.814 [INFO][4873] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.818 [INFO][4873] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.823 [INFO][4873] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.823 [INFO][4873] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" host="localhost" Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.825 [INFO][4873] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.832 [INFO][4873] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" host="localhost" Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.842 [INFO][4873] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" host="localhost" Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.842 [INFO][4873] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" host="localhost" Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.842 [INFO][4873] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:54:00.885525 containerd[1526]: 2025-09-09 04:54:00.842 [INFO][4873] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" HandleID="k8s-pod-network.1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" Workload="localhost-k8s-goldmane--54d579b49d--gmx58-eth0" Sep 9 04:54:00.886020 containerd[1526]: 2025-09-09 04:54:00.849 [INFO][4858] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" Namespace="calico-system" Pod="goldmane-54d579b49d-gmx58" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--gmx58-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--gmx58-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"cf82b986-3cd0-45d0-9764-a0d291ff5cfd", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-gmx58", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali36872d8d9e4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:54:00.886020 containerd[1526]: 2025-09-09 04:54:00.849 [INFO][4858] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" Namespace="calico-system" Pod="goldmane-54d579b49d-gmx58" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--gmx58-eth0" Sep 9 04:54:00.886020 containerd[1526]: 2025-09-09 04:54:00.849 [INFO][4858] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali36872d8d9e4 ContainerID="1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" Namespace="calico-system" Pod="goldmane-54d579b49d-gmx58" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--gmx58-eth0" Sep 9 04:54:00.886020 containerd[1526]: 2025-09-09 04:54:00.857 [INFO][4858] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" Namespace="calico-system" Pod="goldmane-54d579b49d-gmx58" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--gmx58-eth0" Sep 9 04:54:00.886020 containerd[1526]: 2025-09-09 04:54:00.858 [INFO][4858] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" Namespace="calico-system" Pod="goldmane-54d579b49d-gmx58" 
WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--gmx58-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--gmx58-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"cf82b986-3cd0-45d0-9764-a0d291ff5cfd", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba", Pod:"goldmane-54d579b49d-gmx58", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali36872d8d9e4", MAC:"7a:31:b6:1f:d7:f5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:54:00.886020 containerd[1526]: 2025-09-09 04:54:00.879 [INFO][4858] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" Namespace="calico-system" Pod="goldmane-54d579b49d-gmx58" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--gmx58-eth0" Sep 9 04:54:00.928375 containerd[1526]: time="2025-09-09T04:54:00.928307484Z" level=info msg="connecting to shim 
1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba" address="unix:///run/containerd/s/afd1bba34de5d8c6a768204285b01b946079f2c5abf26fb3a40f086c523bd07b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:54:00.982508 systemd[1]: Started cri-containerd-1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba.scope - libcontainer container 1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba. Sep 9 04:54:00.995122 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:54:01.023446 containerd[1526]: time="2025-09-09T04:54:01.023336979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-gmx58,Uid:cf82b986-3cd0-45d0-9764-a0d291ff5cfd,Namespace:calico-system,Attempt:0,} returns sandbox id \"1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba\"" Sep 9 04:54:01.100540 systemd-networkd[1434]: cali30f8fc76a9d: Gained IPv6LL Sep 9 04:54:01.356555 systemd-networkd[1434]: calia0bf8bc03ba: Gained IPv6LL Sep 9 04:54:01.484443 systemd-networkd[1434]: calie46a12bcccd: Gained IPv6LL Sep 9 04:54:01.602920 containerd[1526]: time="2025-09-09T04:54:01.602379173Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:01.602920 containerd[1526]: time="2025-09-09T04:54:01.602840890Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 9 04:54:01.603786 containerd[1526]: time="2025-09-09T04:54:01.603763443Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:01.605723 containerd[1526]: time="2025-09-09T04:54:01.605630548Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:01.606393 containerd[1526]: time="2025-09-09T04:54:01.606311823Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 1.51307096s" Sep 9 04:54:01.606393 containerd[1526]: time="2025-09-09T04:54:01.606356623Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 9 04:54:01.608400 containerd[1526]: time="2025-09-09T04:54:01.608085729Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 04:54:01.609421 containerd[1526]: time="2025-09-09T04:54:01.609387479Z" level=info msg="CreateContainer within sandbox \"eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 04:54:01.616951 containerd[1526]: time="2025-09-09T04:54:01.616907222Z" level=info msg="Container dbaad88b002903cbed8bac1de9eec9ef3c43886586418f22deb354bdc907f34c: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:54:01.633354 containerd[1526]: time="2025-09-09T04:54:01.633284856Z" level=info msg="CreateContainer within sandbox \"eb47ff48fddf1e161d77b4df8e5bff1ff4d7a53d146188f4c1447ede0c7498bb\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"dbaad88b002903cbed8bac1de9eec9ef3c43886586418f22deb354bdc907f34c\"" Sep 9 04:54:01.633821 containerd[1526]: time="2025-09-09T04:54:01.633798052Z" level=info msg="StartContainer for 
\"dbaad88b002903cbed8bac1de9eec9ef3c43886586418f22deb354bdc907f34c\"" Sep 9 04:54:01.634916 containerd[1526]: time="2025-09-09T04:54:01.634882364Z" level=info msg="connecting to shim dbaad88b002903cbed8bac1de9eec9ef3c43886586418f22deb354bdc907f34c" address="unix:///run/containerd/s/10833319c985fe719fc91df0f4eea5bcc1bb4352bb48e459ce7c381fbe184203" protocol=ttrpc version=3 Sep 9 04:54:01.653471 systemd[1]: Started cri-containerd-dbaad88b002903cbed8bac1de9eec9ef3c43886586418f22deb354bdc907f34c.scope - libcontainer container dbaad88b002903cbed8bac1de9eec9ef3c43886586418f22deb354bdc907f34c. Sep 9 04:54:01.691372 containerd[1526]: time="2025-09-09T04:54:01.691335490Z" level=info msg="StartContainer for \"dbaad88b002903cbed8bac1de9eec9ef3c43886586418f22deb354bdc907f34c\" returns successfully" Sep 9 04:54:01.740242 containerd[1526]: time="2025-09-09T04:54:01.740182475Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776f74d6f9-j57rd,Uid:1ed773d9-9708-4a97-bf36-4f8170376c2b,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:54:01.740509 containerd[1526]: time="2025-09-09T04:54:01.740480913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dgbgk,Uid:552937ce-0bd6-4992-a22b-ff41c9705435,Namespace:calico-system,Attempt:0,}" Sep 9 04:54:01.867349 systemd-networkd[1434]: cali9a7a684ae2c: Link UP Sep 9 04:54:01.867843 systemd-networkd[1434]: cali9a7a684ae2c: Gained carrier Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.785 [INFO][4999] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--dgbgk-eth0 csi-node-driver- calico-system 552937ce-0bd6-4992-a22b-ff41c9705435 732 0 2025-09-09 04:53:38 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-dgbgk eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali9a7a684ae2c [] [] }} ContainerID="a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" Namespace="calico-system" Pod="csi-node-driver-dgbgk" WorkloadEndpoint="localhost-k8s-csi--node--driver--dgbgk-" Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.786 [INFO][4999] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" Namespace="calico-system" Pod="csi-node-driver-dgbgk" WorkloadEndpoint="localhost-k8s-csi--node--driver--dgbgk-eth0" Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.819 [INFO][5022] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" HandleID="k8s-pod-network.a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" Workload="localhost-k8s-csi--node--driver--dgbgk-eth0" Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.820 [INFO][5022] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" HandleID="k8s-pod-network.a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" Workload="localhost-k8s-csi--node--driver--dgbgk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c790), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-dgbgk", "timestamp":"2025-09-09 04:54:01.819904063 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.820 [INFO][5022] 
ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.820 [INFO][5022] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.820 [INFO][5022] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.830 [INFO][5022] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" host="localhost" Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.835 [INFO][5022] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.839 [INFO][5022] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.840 [INFO][5022] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.842 [INFO][5022] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.842 [INFO][5022] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" host="localhost" Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.844 [INFO][5022] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.848 [INFO][5022] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" host="localhost" Sep 9 04:54:01.885425 
containerd[1526]: 2025-09-09 04:54:01.855 [INFO][5022] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" host="localhost" Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.855 [INFO][5022] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" host="localhost" Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.855 [INFO][5022] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:54:01.885425 containerd[1526]: 2025-09-09 04:54:01.855 [INFO][5022] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" HandleID="k8s-pod-network.a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" Workload="localhost-k8s-csi--node--driver--dgbgk-eth0" Sep 9 04:54:01.887705 containerd[1526]: 2025-09-09 04:54:01.857 [INFO][4999] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" Namespace="calico-system" Pod="csi-node-driver-dgbgk" WorkloadEndpoint="localhost-k8s-csi--node--driver--dgbgk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--dgbgk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"552937ce-0bd6-4992-a22b-ff41c9705435", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-dgbgk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9a7a684ae2c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:54:01.887705 containerd[1526]: 2025-09-09 04:54:01.858 [INFO][4999] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" Namespace="calico-system" Pod="csi-node-driver-dgbgk" WorkloadEndpoint="localhost-k8s-csi--node--driver--dgbgk-eth0" Sep 9 04:54:01.887705 containerd[1526]: 2025-09-09 04:54:01.858 [INFO][4999] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9a7a684ae2c ContainerID="a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" Namespace="calico-system" Pod="csi-node-driver-dgbgk" WorkloadEndpoint="localhost-k8s-csi--node--driver--dgbgk-eth0" Sep 9 04:54:01.887705 containerd[1526]: 2025-09-09 04:54:01.867 [INFO][4999] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" Namespace="calico-system" Pod="csi-node-driver-dgbgk" WorkloadEndpoint="localhost-k8s-csi--node--driver--dgbgk-eth0" Sep 9 04:54:01.887705 containerd[1526]: 2025-09-09 04:54:01.868 [INFO][4999] cni-plugin/k8s.go 446: Added Mac, interface name, and 
active container ID to endpoint ContainerID="a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" Namespace="calico-system" Pod="csi-node-driver-dgbgk" WorkloadEndpoint="localhost-k8s-csi--node--driver--dgbgk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--dgbgk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"552937ce-0bd6-4992-a22b-ff41c9705435", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c", Pod:"csi-node-driver-dgbgk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9a7a684ae2c", MAC:"0a:45:77:48:16:0b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:54:01.887705 containerd[1526]: 2025-09-09 04:54:01.881 [INFO][4999] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" Namespace="calico-system" Pod="csi-node-driver-dgbgk" WorkloadEndpoint="localhost-k8s-csi--node--driver--dgbgk-eth0" Sep 9 04:54:01.903978 containerd[1526]: time="2025-09-09T04:54:01.903933058Z" level=info msg="connecting to shim a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c" address="unix:///run/containerd/s/f6dc52d460ec59c9c70115c9c444e7f654e620a3ef4b3202cc081fcba467f63c" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:54:01.932521 systemd[1]: Started cri-containerd-a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c.scope - libcontainer container a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c. Sep 9 04:54:01.934628 kubelet[2672]: I0909 04:54:01.934463 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-cb85c6f7d-zfbw4" podStartSLOduration=26.420130633 podStartE2EDuration="27.934442863s" podCreationTimestamp="2025-09-09 04:53:34 +0000 UTC" firstStartedPulling="2025-09-09 04:54:00.092933506 +0000 UTC m=+40.434704593" lastFinishedPulling="2025-09-09 04:54:01.607245696 +0000 UTC m=+41.949016823" observedRunningTime="2025-09-09 04:54:01.934170186 +0000 UTC m=+42.275941273" watchObservedRunningTime="2025-09-09 04:54:01.934442863 +0000 UTC m=+42.276213990" Sep 9 04:54:01.975473 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:54:01.976518 containerd[1526]: time="2025-09-09T04:54:01.975733986Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:01.977619 systemd-networkd[1434]: cali2bd3f327f52: Link UP Sep 9 04:54:01.978495 containerd[1526]: time="2025-09-09T04:54:01.978467685Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 
04:54:01.979101 systemd-networkd[1434]: cali2bd3f327f52: Gained carrier Sep 9 04:54:01.987341 containerd[1526]: time="2025-09-09T04:54:01.986291705Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 378.168736ms" Sep 9 04:54:01.987341 containerd[1526]: time="2025-09-09T04:54:01.986801821Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 9 04:54:01.989728 containerd[1526]: time="2025-09-09T04:54:01.989594120Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 04:54:01.990623 containerd[1526]: time="2025-09-09T04:54:01.990439194Z" level=info msg="CreateContainer within sandbox \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.792 [INFO][4993] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0 calico-apiserver-776f74d6f9- calico-apiserver 1ed773d9-9708-4a97-bf36-4f8170376c2b 838 0 2025-09-09 04:53:34 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:776f74d6f9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-776f74d6f9-j57rd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2bd3f327f52 [] [] }} 
ContainerID="d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-j57rd" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--j57rd-" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.794 [INFO][4993] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-j57rd" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.821 [INFO][5028] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" HandleID="k8s-pod-network.d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" Workload="localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.821 [INFO][5028] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" HandleID="k8s-pod-network.d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" Workload="localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001366d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-776f74d6f9-j57rd", "timestamp":"2025-09-09 04:54:01.821364652 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.821 [INFO][5028] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.857 [INFO][5028] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.857 [INFO][5028] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.931 [INFO][5028] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" host="localhost" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.939 [INFO][5028] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.945 [INFO][5028] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.947 [INFO][5028] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.951 [INFO][5028] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.951 [INFO][5028] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" host="localhost" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.952 [INFO][5028] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632 Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.960 [INFO][5028] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" host="localhost" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.968 [INFO][5028] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 handle="k8s-pod-network.d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" host="localhost" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.968 [INFO][5028] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" host="localhost" Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.968 [INFO][5028] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:54:01.999977 containerd[1526]: 2025-09-09 04:54:01.968 [INFO][5028] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" HandleID="k8s-pod-network.d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" Workload="localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0" Sep 9 04:54:02.000550 containerd[1526]: 2025-09-09 04:54:01.973 [INFO][4993] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-j57rd" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0", GenerateName:"calico-apiserver-776f74d6f9-", Namespace:"calico-apiserver", SelfLink:"", UID:"1ed773d9-9708-4a97-bf36-4f8170376c2b", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776f74d6f9", 
"projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-776f74d6f9-j57rd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2bd3f327f52", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:54:02.000550 containerd[1526]: 2025-09-09 04:54:01.974 [INFO][4993] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-j57rd" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0" Sep 9 04:54:02.000550 containerd[1526]: 2025-09-09 04:54:01.974 [INFO][4993] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2bd3f327f52 ContainerID="d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-j57rd" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0" Sep 9 04:54:02.000550 containerd[1526]: 2025-09-09 04:54:01.980 [INFO][4993] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-j57rd" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0" Sep 9 04:54:02.000550 containerd[1526]: 2025-09-09 
04:54:01.981 [INFO][4993] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-j57rd" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0", GenerateName:"calico-apiserver-776f74d6f9-", Namespace:"calico-apiserver", SelfLink:"", UID:"1ed773d9-9708-4a97-bf36-4f8170376c2b", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 53, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776f74d6f9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632", Pod:"calico-apiserver-776f74d6f9-j57rd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2bd3f327f52", MAC:"7a:71:75:8c:a4:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:54:02.000550 containerd[1526]: 2025-09-09 04:54:01.992 [INFO][4993] 
cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-j57rd" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--j57rd-eth0" Sep 9 04:54:02.004603 containerd[1526]: time="2025-09-09T04:54:02.004258688Z" level=info msg="Container 3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:54:02.010108 containerd[1526]: time="2025-09-09T04:54:02.010075244Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dgbgk,Uid:552937ce-0bd6-4992-a22b-ff41c9705435,Namespace:calico-system,Attempt:0,} returns sandbox id \"a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c\"" Sep 9 04:54:02.018838 containerd[1526]: time="2025-09-09T04:54:02.018801099Z" level=info msg="CreateContainer within sandbox \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\"" Sep 9 04:54:02.019293 containerd[1526]: time="2025-09-09T04:54:02.019267336Z" level=info msg="StartContainer for \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\"" Sep 9 04:54:02.020492 containerd[1526]: time="2025-09-09T04:54:02.020462687Z" level=info msg="connecting to shim 3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1" address="unix:///run/containerd/s/2274967065a7af493df11156bc3671ce193983394eb013daeee02317c8db82a7" protocol=ttrpc version=3 Sep 9 04:54:02.031362 containerd[1526]: time="2025-09-09T04:54:02.031285126Z" level=info msg="connecting to shim d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632" address="unix:///run/containerd/s/56cf3e72063ab30c768eb985ed832bdaeab8f37b0dcd933f2e15863801f9df1e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:54:02.044489 
systemd[1]: Started cri-containerd-3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1.scope - libcontainer container 3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1. Sep 9 04:54:02.056469 systemd[1]: Started cri-containerd-d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632.scope - libcontainer container d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632. Sep 9 04:54:02.071157 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:54:02.100499 containerd[1526]: time="2025-09-09T04:54:02.100412288Z" level=info msg="StartContainer for \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\" returns successfully" Sep 9 04:54:02.105684 containerd[1526]: time="2025-09-09T04:54:02.105644809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776f74d6f9-j57rd,Uid:1ed773d9-9708-4a97-bf36-4f8170376c2b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632\"" Sep 9 04:54:02.109567 containerd[1526]: time="2025-09-09T04:54:02.109499140Z" level=info msg="CreateContainer within sandbox \"d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 04:54:02.120432 containerd[1526]: time="2025-09-09T04:54:02.119562384Z" level=info msg="Container fb4ab79ae80187ae29b9910fc2aa0277f0f6bd808219271cdf86e50251421858: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:54:02.127616 containerd[1526]: time="2025-09-09T04:54:02.127581404Z" level=info msg="CreateContainer within sandbox \"d369eac49be51e92d232b33f65685313b3802233ff75f6233cbe608bce96b632\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"fb4ab79ae80187ae29b9910fc2aa0277f0f6bd808219271cdf86e50251421858\"" Sep 9 04:54:02.129146 containerd[1526]: time="2025-09-09T04:54:02.129116553Z" 
level=info msg="StartContainer for \"fb4ab79ae80187ae29b9910fc2aa0277f0f6bd808219271cdf86e50251421858\"" Sep 9 04:54:02.130535 containerd[1526]: time="2025-09-09T04:54:02.130510582Z" level=info msg="connecting to shim fb4ab79ae80187ae29b9910fc2aa0277f0f6bd808219271cdf86e50251421858" address="unix:///run/containerd/s/56cf3e72063ab30c768eb985ed832bdaeab8f37b0dcd933f2e15863801f9df1e" protocol=ttrpc version=3 Sep 9 04:54:02.165497 systemd[1]: Started cri-containerd-fb4ab79ae80187ae29b9910fc2aa0277f0f6bd808219271cdf86e50251421858.scope - libcontainer container fb4ab79ae80187ae29b9910fc2aa0277f0f6bd808219271cdf86e50251421858. Sep 9 04:54:02.211178 containerd[1526]: time="2025-09-09T04:54:02.211136258Z" level=info msg="StartContainer for \"fb4ab79ae80187ae29b9910fc2aa0277f0f6bd808219271cdf86e50251421858\" returns successfully" Sep 9 04:54:02.316429 systemd-networkd[1434]: cali36872d8d9e4: Gained IPv6LL Sep 9 04:54:02.951346 kubelet[2672]: I0909 04:54:02.951142 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-cb85c6f7d-pxpgr" podStartSLOduration=27.176904057 podStartE2EDuration="28.951120316s" podCreationTimestamp="2025-09-09 04:53:34 +0000 UTC" firstStartedPulling="2025-09-09 04:54:00.213580395 +0000 UTC m=+40.555351522" lastFinishedPulling="2025-09-09 04:54:01.987796654 +0000 UTC m=+42.329567781" observedRunningTime="2025-09-09 04:54:02.949918965 +0000 UTC m=+43.291690092" watchObservedRunningTime="2025-09-09 04:54:02.951120316 +0000 UTC m=+43.292891443" Sep 9 04:54:02.979954 kubelet[2672]: I0909 04:54:02.979884 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-776f74d6f9-j57rd" podStartSLOduration=28.979856141 podStartE2EDuration="28.979856141s" podCreationTimestamp="2025-09-09 04:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 
04:54:02.979816861 +0000 UTC m=+43.321588028" watchObservedRunningTime="2025-09-09 04:54:02.979856141 +0000 UTC m=+43.321627268" Sep 9 04:54:03.277474 systemd-networkd[1434]: cali9a7a684ae2c: Gained IPv6LL Sep 9 04:54:03.673629 containerd[1526]: time="2025-09-09T04:54:03.673551103Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:03.674568 containerd[1526]: time="2025-09-09T04:54:03.674408777Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 9 04:54:03.675899 containerd[1526]: time="2025-09-09T04:54:03.675835807Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:03.679923 containerd[1526]: time="2025-09-09T04:54:03.679346261Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:03.679923 containerd[1526]: time="2025-09-09T04:54:03.679782738Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 1.690149058s" Sep 9 04:54:03.679923 containerd[1526]: time="2025-09-09T04:54:03.679813458Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 9 04:54:03.680987 containerd[1526]: time="2025-09-09T04:54:03.680955169Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 04:54:03.692660 containerd[1526]: time="2025-09-09T04:54:03.692619324Z" level=info msg="CreateContainer within sandbox \"ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 04:54:03.707536 containerd[1526]: time="2025-09-09T04:54:03.706548062Z" level=info msg="Container 57461c5fa8d60251ae6bfc3991528dbeef0d89f29afff40f0341402acf2dd3e5: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:54:03.721497 containerd[1526]: time="2025-09-09T04:54:03.721365434Z" level=info msg="CreateContainer within sandbox \"ece7384277396f0a99fdc252f4fd63d4f494fba3d8080ed4ece2c2addcc925fb\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"57461c5fa8d60251ae6bfc3991528dbeef0d89f29afff40f0341402acf2dd3e5\"" Sep 9 04:54:03.722299 containerd[1526]: time="2025-09-09T04:54:03.721852150Z" level=info msg="StartContainer for \"57461c5fa8d60251ae6bfc3991528dbeef0d89f29afff40f0341402acf2dd3e5\"" Sep 9 04:54:03.723078 containerd[1526]: time="2025-09-09T04:54:03.723048661Z" level=info msg="connecting to shim 57461c5fa8d60251ae6bfc3991528dbeef0d89f29afff40f0341402acf2dd3e5" address="unix:///run/containerd/s/b2faf986faec220ffee988798fb1e26e42a4056dbc42e2a573b15d5153d75aa7" protocol=ttrpc version=3 Sep 9 04:54:03.750536 systemd[1]: Started cri-containerd-57461c5fa8d60251ae6bfc3991528dbeef0d89f29afff40f0341402acf2dd3e5.scope - libcontainer container 57461c5fa8d60251ae6bfc3991528dbeef0d89f29afff40f0341402acf2dd3e5. 
Sep 9 04:54:03.795439 containerd[1526]: time="2025-09-09T04:54:03.795347093Z" level=info msg="StartContainer for \"57461c5fa8d60251ae6bfc3991528dbeef0d89f29afff40f0341402acf2dd3e5\" returns successfully" Sep 9 04:54:03.916506 systemd-networkd[1434]: cali2bd3f327f52: Gained IPv6LL Sep 9 04:54:03.945586 kubelet[2672]: I0909 04:54:03.945479 2672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:54:04.298310 containerd[1526]: time="2025-09-09T04:54:04.298166304Z" level=info msg="TaskExit event in podsandbox handler container_id:\"57461c5fa8d60251ae6bfc3991528dbeef0d89f29afff40f0341402acf2dd3e5\" id:\"6bc1f809f2010c45274f7120c5e81333156159012cac12a431d5c31a02efd063\" pid:5293 exited_at:{seconds:1757393644 nanos:287152983}" Sep 9 04:54:04.334494 kubelet[2672]: I0909 04:54:04.334435 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-697776f99-qp2dg" podStartSLOduration=22.908890101 podStartE2EDuration="26.334415965s" podCreationTimestamp="2025-09-09 04:53:38 +0000 UTC" firstStartedPulling="2025-09-09 04:54:00.255288386 +0000 UTC m=+40.597059513" lastFinishedPulling="2025-09-09 04:54:03.68081425 +0000 UTC m=+44.022585377" observedRunningTime="2025-09-09 04:54:03.962424631 +0000 UTC m=+44.304195718" watchObservedRunningTime="2025-09-09 04:54:04.334415965 +0000 UTC m=+44.676187092" Sep 9 04:54:04.386465 containerd[1526]: time="2025-09-09T04:54:04.386392154Z" level=info msg="TaskExit event in podsandbox handler container_id:\"57461c5fa8d60251ae6bfc3991528dbeef0d89f29afff40f0341402acf2dd3e5\" id:\"a704bbaafe0fdbfb06b883a8d47dd07a78938059f2df9e53664eea4e27c75c40\" pid:5316 exited_at:{seconds:1757393644 nanos:386081996}" Sep 9 04:54:04.663471 systemd[1]: Created slice kubepods-besteffort-podba439d1b_08c1_4625_80d3_79c9dc9b0b31.slice - libcontainer container kubepods-besteffort-podba439d1b_08c1_4625_80d3_79c9dc9b0b31.slice. 
Sep 9 04:54:04.779902 kubelet[2672]: I0909 04:54:04.779781 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfz57\" (UniqueName: \"kubernetes.io/projected/ba439d1b-08c1-4625-80d3-79c9dc9b0b31-kube-api-access-lfz57\") pod \"calico-apiserver-776f74d6f9-gv99v\" (UID: \"ba439d1b-08c1-4625-80d3-79c9dc9b0b31\") " pod="calico-apiserver/calico-apiserver-776f74d6f9-gv99v" Sep 9 04:54:04.779902 kubelet[2672]: I0909 04:54:04.779925 2672 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ba439d1b-08c1-4625-80d3-79c9dc9b0b31-calico-apiserver-certs\") pod \"calico-apiserver-776f74d6f9-gv99v\" (UID: \"ba439d1b-08c1-4625-80d3-79c9dc9b0b31\") " pod="calico-apiserver/calico-apiserver-776f74d6f9-gv99v" Sep 9 04:54:04.946157 kubelet[2672]: I0909 04:54:04.945826 2672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:54:04.956969 containerd[1526]: time="2025-09-09T04:54:04.956765157Z" level=info msg="StopContainer for \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\" with timeout 30 (s)" Sep 9 04:54:04.957538 containerd[1526]: time="2025-09-09T04:54:04.957514151Z" level=info msg="Stop container \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\" with signal terminated" Sep 9 04:54:04.970866 containerd[1526]: time="2025-09-09T04:54:04.970818096Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776f74d6f9-gv99v,Uid:ba439d1b-08c1-4625-80d3-79c9dc9b0b31,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:54:04.984421 systemd[1]: cri-containerd-3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1.scope: Deactivated successfully. Sep 9 04:54:04.984686 systemd[1]: cri-containerd-3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1.scope: Consumed 1.664s CPU time, 40.2M memory peak. 
Sep 9 04:54:04.991586 containerd[1526]: time="2025-09-09T04:54:04.991486869Z" level=info msg="received exit event container_id:\"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\" id:\"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\" pid:5147 exit_status:1 exited_at:{seconds:1757393644 nanos:990964912}" Sep 9 04:54:04.991698 containerd[1526]: time="2025-09-09T04:54:04.991637147Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\" id:\"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\" pid:5147 exit_status:1 exited_at:{seconds:1757393644 nanos:990964912}" Sep 9 04:54:05.034240 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1-rootfs.mount: Deactivated successfully. Sep 9 04:54:05.164950 containerd[1526]: time="2025-09-09T04:54:05.164909134Z" level=info msg="StopContainer for \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\" returns successfully" Sep 9 04:54:05.169978 containerd[1526]: time="2025-09-09T04:54:05.169933299Z" level=info msg="StopPodSandbox for \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\"" Sep 9 04:54:05.183051 containerd[1526]: time="2025-09-09T04:54:05.182802809Z" level=info msg="Container to stop \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Sep 9 04:54:05.191155 systemd[1]: cri-containerd-1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef.scope: Deactivated successfully. Sep 9 04:54:05.191464 systemd[1]: cri-containerd-1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef.scope: Consumed 29ms CPU time, 6.1M memory peak, 6.1M read from disk. 
Sep 9 04:54:05.197098 containerd[1526]: time="2025-09-09T04:54:05.196960150Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\" id:\"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\" pid:4765 exit_status:137 exited_at:{seconds:1757393645 nanos:196597713}" Sep 9 04:54:05.215271 systemd-networkd[1434]: calic26950f668d: Link UP Sep 9 04:54:05.216349 systemd-networkd[1434]: calic26950f668d: Gained carrier Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.067 [INFO][5337] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0 calico-apiserver-776f74d6f9- calico-apiserver ba439d1b-08c1-4625-80d3-79c9dc9b0b31 1105 0 2025-09-09 04:54:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:776f74d6f9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-776f74d6f9-gv99v eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic26950f668d [] [] }} ContainerID="8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-gv99v" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--gv99v-" Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.070 [INFO][5337] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-gv99v" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0" Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.160 [INFO][5364] ipam/ipam_plugin.go 225: Calico CNI IPAM request count 
IPv4=1 IPv6=0 ContainerID="8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" HandleID="k8s-pod-network.8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" Workload="localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0" Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.160 [INFO][5364] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" HandleID="k8s-pod-network.8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" Workload="localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004355e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-776f74d6f9-gv99v", "timestamp":"2025-09-09 04:54:05.160475685 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.160 [INFO][5364] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.160 [INFO][5364] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.160 [INFO][5364] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.171 [INFO][5364] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" host="localhost" Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.177 [INFO][5364] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.182 [INFO][5364] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.186 [INFO][5364] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.190 [INFO][5364] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.190 [INFO][5364] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" host="localhost" Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.193 [INFO][5364] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.199 [INFO][5364] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" host="localhost" Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.209 [INFO][5364] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.138/26] block=192.168.88.128/26 
handle="k8s-pod-network.8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" host="localhost" Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.209 [INFO][5364] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.138/26] handle="k8s-pod-network.8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" host="localhost" Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.209 [INFO][5364] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:54:05.233373 containerd[1526]: 2025-09-09 04:54:05.209 [INFO][5364] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.138/26] IPv6=[] ContainerID="8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" HandleID="k8s-pod-network.8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" Workload="localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0" Sep 9 04:54:05.233977 containerd[1526]: 2025-09-09 04:54:05.211 [INFO][5337] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-gv99v" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0", GenerateName:"calico-apiserver-776f74d6f9-", Namespace:"calico-apiserver", SelfLink:"", UID:"ba439d1b-08c1-4625-80d3-79c9dc9b0b31", ResourceVersion:"1105", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 54, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776f74d6f9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-776f74d6f9-gv99v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic26950f668d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:54:05.233977 containerd[1526]: 2025-09-09 04:54:05.212 [INFO][5337] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.138/32] ContainerID="8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-gv99v" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0" Sep 9 04:54:05.233977 containerd[1526]: 2025-09-09 04:54:05.212 [INFO][5337] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic26950f668d ContainerID="8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-gv99v" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0" Sep 9 04:54:05.233977 containerd[1526]: 2025-09-09 04:54:05.217 [INFO][5337] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-gv99v" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0" Sep 9 04:54:05.233977 containerd[1526]: 2025-09-09 04:54:05.217 [INFO][5337] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-gv99v" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0", GenerateName:"calico-apiserver-776f74d6f9-", Namespace:"calico-apiserver", SelfLink:"", UID:"ba439d1b-08c1-4625-80d3-79c9dc9b0b31", ResourceVersion:"1105", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 54, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"776f74d6f9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e", Pod:"calico-apiserver-776f74d6f9-gv99v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic26950f668d", MAC:"56:fa:04:d6:9c:86", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:54:05.233977 containerd[1526]: 2025-09-09 04:54:05.228 [INFO][5337] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" Namespace="calico-apiserver" Pod="calico-apiserver-776f74d6f9-gv99v" WorkloadEndpoint="localhost-k8s-calico--apiserver--776f74d6f9--gv99v-eth0" Sep 9 04:54:05.251221 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef-rootfs.mount: Deactivated successfully. Sep 9 04:54:05.269219 containerd[1526]: time="2025-09-09T04:54:05.269153765Z" level=info msg="shim disconnected" id=1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef namespace=k8s.io Sep 9 04:54:05.275393 containerd[1526]: time="2025-09-09T04:54:05.269215445Z" level=warning msg="cleaning up after shim disconnected" id=1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef namespace=k8s.io Sep 9 04:54:05.275498 containerd[1526]: time="2025-09-09T04:54:05.275392082Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 9 04:54:05.281700 containerd[1526]: time="2025-09-09T04:54:05.281653038Z" level=info msg="connecting to shim 8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e" address="unix:///run/containerd/s/6b4713510756e867d21c55b7d39b8e62c0b839e6ca261ae0b4cb7b9378f66e1a" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:54:05.303839 containerd[1526]: time="2025-09-09T04:54:05.303758963Z" level=info msg="received exit event sandbox_id:\"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\" exit_status:137 exited_at:{seconds:1757393645 nanos:196597713}" Sep 9 04:54:05.313539 systemd[1]: Started cri-containerd-8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e.scope - libcontainer container 8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e. 
Sep 9 04:54:05.329376 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:54:05.436213 containerd[1526]: time="2025-09-09T04:54:05.436169637Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-776f74d6f9-gv99v,Uid:ba439d1b-08c1-4625-80d3-79c9dc9b0b31,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e\"" Sep 9 04:54:05.441178 systemd-networkd[1434]: calia0bf8bc03ba: Link DOWN Sep 9 04:54:05.441352 systemd-networkd[1434]: calia0bf8bc03ba: Lost carrier Sep 9 04:54:05.443985 containerd[1526]: time="2025-09-09T04:54:05.443792184Z" level=info msg="CreateContainer within sandbox \"8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 04:54:05.466620 systemd[1]: Started sshd@8-10.0.0.40:22-10.0.0.1:59452.service - OpenSSH per-connection server daemon (10.0.0.1:59452). 
Sep 9 04:54:05.487457 containerd[1526]: time="2025-09-09T04:54:05.487419359Z" level=info msg="Container ccb36ab160384a0966121c0b60d7a900e24bf2b0855f7fdd48e781f773b19346: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:54:05.502071 containerd[1526]: time="2025-09-09T04:54:05.501936857Z" level=info msg="CreateContainer within sandbox \"8278a8193e36348c171ec69c7c6a3ebd56552014bbf6cdeb2ffd3a50b547b98e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ccb36ab160384a0966121c0b60d7a900e24bf2b0855f7fdd48e781f773b19346\"" Sep 9 04:54:05.507046 containerd[1526]: time="2025-09-09T04:54:05.505796590Z" level=info msg="StartContainer for \"ccb36ab160384a0966121c0b60d7a900e24bf2b0855f7fdd48e781f773b19346\"" Sep 9 04:54:05.510797 containerd[1526]: time="2025-09-09T04:54:05.510691556Z" level=info msg="connecting to shim ccb36ab160384a0966121c0b60d7a900e24bf2b0855f7fdd48e781f773b19346" address="unix:///run/containerd/s/6b4713510756e867d21c55b7d39b8e62c0b839e6ca261ae0b4cb7b9378f66e1a" protocol=ttrpc version=3 Sep 9 04:54:05.551653 systemd[1]: Started cri-containerd-ccb36ab160384a0966121c0b60d7a900e24bf2b0855f7fdd48e781f773b19346.scope - libcontainer container ccb36ab160384a0966121c0b60d7a900e24bf2b0855f7fdd48e781f773b19346. Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.438 [INFO][5462] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.439 [INFO][5462] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" iface="eth0" netns="/var/run/netns/cni-ece1af41-13c6-6f34-324b-dd374c0deae0" Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.439 [INFO][5462] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" iface="eth0" netns="/var/run/netns/cni-ece1af41-13c6-6f34-324b-dd374c0deae0" Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.446 [INFO][5462] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" after=6.771393ms iface="eth0" netns="/var/run/netns/cni-ece1af41-13c6-6f34-324b-dd374c0deae0" Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.446 [INFO][5462] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.446 [INFO][5462] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.477 [INFO][5485] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" HandleID="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.477 [INFO][5485] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.477 [INFO][5485] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.536 [INFO][5485] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" HandleID="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.536 [INFO][5485] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" HandleID="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.538 [INFO][5485] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:54:05.555255 containerd[1526]: 2025-09-09 04:54:05.552 [INFO][5462] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Sep 9 04:54:05.556454 containerd[1526]: time="2025-09-09T04:54:05.556406717Z" level=info msg="TearDown network for sandbox \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\" successfully" Sep 9 04:54:05.556742 containerd[1526]: time="2025-09-09T04:54:05.556724234Z" level=info msg="StopPodSandbox for \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\" returns successfully" Sep 9 04:54:05.573619 sshd[5496]: Accepted publickey for core from 10.0.0.1 port 59452 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:05.585866 sshd-session[5496]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:05.599720 systemd-logind[1508]: New session 9 of user core. Sep 9 04:54:05.609764 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 9 04:54:05.686606 kubelet[2672]: I0909 04:54:05.686560 2672 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l76np\" (UniqueName: \"kubernetes.io/projected/51869746-303b-4256-91b8-4606d0ce74fe-kube-api-access-l76np\") pod \"51869746-303b-4256-91b8-4606d0ce74fe\" (UID: \"51869746-303b-4256-91b8-4606d0ce74fe\") " Sep 9 04:54:05.686606 kubelet[2672]: I0909 04:54:05.686613 2672 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/51869746-303b-4256-91b8-4606d0ce74fe-calico-apiserver-certs\") pod \"51869746-303b-4256-91b8-4606d0ce74fe\" (UID: \"51869746-303b-4256-91b8-4606d0ce74fe\") " Sep 9 04:54:05.691592 containerd[1526]: time="2025-09-09T04:54:05.691556971Z" level=info msg="StartContainer for \"ccb36ab160384a0966121c0b60d7a900e24bf2b0855f7fdd48e781f773b19346\" returns successfully" Sep 9 04:54:05.709612 kubelet[2672]: I0909 04:54:05.709558 2672 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51869746-303b-4256-91b8-4606d0ce74fe-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "51869746-303b-4256-91b8-4606d0ce74fe" (UID: "51869746-303b-4256-91b8-4606d0ce74fe"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 04:54:05.715434 kubelet[2672]: I0909 04:54:05.715388 2672 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51869746-303b-4256-91b8-4606d0ce74fe-kube-api-access-l76np" (OuterVolumeSpecName: "kube-api-access-l76np") pod "51869746-303b-4256-91b8-4606d0ce74fe" (UID: "51869746-303b-4256-91b8-4606d0ce74fe"). InnerVolumeSpecName "kube-api-access-l76np". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 04:54:05.755803 systemd[1]: Removed slice kubepods-besteffort-pod51869746_303b_4256_91b8_4606d0ce74fe.slice - libcontainer container kubepods-besteffort-pod51869746_303b_4256_91b8_4606d0ce74fe.slice. Sep 9 04:54:05.756461 systemd[1]: kubepods-besteffort-pod51869746_303b_4256_91b8_4606d0ce74fe.slice: Consumed 1.694s CPU time, 42.1M memory peak, 6.1M read from disk. Sep 9 04:54:05.787191 kubelet[2672]: I0909 04:54:05.787141 2672 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l76np\" (UniqueName: \"kubernetes.io/projected/51869746-303b-4256-91b8-4606d0ce74fe-kube-api-access-l76np\") on node \"localhost\" DevicePath \"\"" Sep 9 04:54:05.787191 kubelet[2672]: I0909 04:54:05.787176 2672 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/51869746-303b-4256-91b8-4606d0ce74fe-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" Sep 9 04:54:05.897431 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef-shm.mount: Deactivated successfully. Sep 9 04:54:05.897552 systemd[1]: run-netns-cni\x2dece1af41\x2d13c6\x2d6f34\x2d324b\x2ddd374c0deae0.mount: Deactivated successfully. Sep 9 04:54:05.897620 systemd[1]: var-lib-kubelet-pods-51869746\x2d303b\x2d4256\x2d91b8\x2d4606d0ce74fe-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dl76np.mount: Deactivated successfully. Sep 9 04:54:05.897673 systemd[1]: var-lib-kubelet-pods-51869746\x2d303b\x2d4256\x2d91b8\x2d4606d0ce74fe-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. 
Sep 9 04:54:05.961026 kubelet[2672]: I0909 04:54:05.960983 2672 scope.go:117] "RemoveContainer" containerID="3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1" Sep 9 04:54:05.972589 kubelet[2672]: I0909 04:54:05.972514 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-776f74d6f9-gv99v" podStartSLOduration=1.972485247 podStartE2EDuration="1.972485247s" podCreationTimestamp="2025-09-09 04:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:54:05.971753852 +0000 UTC m=+46.313525019" watchObservedRunningTime="2025-09-09 04:54:05.972485247 +0000 UTC m=+46.314256374" Sep 9 04:54:05.973711 containerd[1526]: time="2025-09-09T04:54:05.973666279Z" level=info msg="RemoveContainer for \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\"" Sep 9 04:54:05.991149 containerd[1526]: time="2025-09-09T04:54:05.991108277Z" level=info msg="RemoveContainer for \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\" returns successfully" Sep 9 04:54:05.991651 kubelet[2672]: I0909 04:54:05.991598 2672 scope.go:117] "RemoveContainer" containerID="3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1" Sep 9 04:54:05.992337 containerd[1526]: time="2025-09-09T04:54:05.991861071Z" level=error msg="ContainerStatus for \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\": not found" Sep 9 04:54:05.998140 kubelet[2672]: E0909 04:54:05.997993 2672 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\": not found" 
containerID="3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1" Sep 9 04:54:05.998913 sshd[5520]: Connection closed by 10.0.0.1 port 59452 Sep 9 04:54:05.999249 sshd-session[5496]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:06.004300 systemd[1]: sshd@8-10.0.0.40:22-10.0.0.1:59452.service: Deactivated successfully. Sep 9 04:54:06.008315 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 04:54:06.014776 systemd-logind[1508]: Session 9 logged out. Waiting for processes to exit. Sep 9 04:54:06.016302 systemd-logind[1508]: Removed session 9. Sep 9 04:54:06.018426 kubelet[2672]: I0909 04:54:06.016891 2672 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1"} err="failed to get container status \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\": rpc error: code = NotFound desc = an error occurred when try to find container \"3b9abd674ad5d8a65e080ae61a6eb0202e99ade87f89cba4bd03ab612ec699b1\": not found" Sep 9 04:54:06.313879 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount663933027.mount: Deactivated successfully. 
Sep 9 04:54:06.556152 containerd[1526]: time="2025-09-09T04:54:06.556107206Z" level=info msg="TaskExit event in podsandbox handler exit_status:137 exited_at:{seconds:1757393645 nanos:196597713}" Sep 9 04:54:06.832607 containerd[1526]: time="2025-09-09T04:54:06.832560273Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:06.833938 containerd[1526]: time="2025-09-09T04:54:06.833882264Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 9 04:54:06.834715 containerd[1526]: time="2025-09-09T04:54:06.834676499Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:06.838341 containerd[1526]: time="2025-09-09T04:54:06.838296114Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:06.839909 containerd[1526]: time="2025-09-09T04:54:06.839877263Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.158892214s" Sep 9 04:54:06.840177 containerd[1526]: time="2025-09-09T04:54:06.840158661Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 9 04:54:06.842438 containerd[1526]: time="2025-09-09T04:54:06.842317647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 
04:54:06.847739 containerd[1526]: time="2025-09-09T04:54:06.847680010Z" level=info msg="CreateContainer within sandbox \"1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 04:54:06.858300 containerd[1526]: time="2025-09-09T04:54:06.858197298Z" level=info msg="Container a2a038ec9bc1a85c1df9c25c2fd07d3886094137ad58b429d4604dad216ce24f: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:54:06.877781 containerd[1526]: time="2025-09-09T04:54:06.877739324Z" level=info msg="CreateContainer within sandbox \"1242b7ef4c97e893199a52b84404a48ecf98576bc22c09ea9d53523a9dfa59ba\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"a2a038ec9bc1a85c1df9c25c2fd07d3886094137ad58b429d4604dad216ce24f\"" Sep 9 04:54:06.879662 containerd[1526]: time="2025-09-09T04:54:06.879613631Z" level=info msg="StartContainer for \"a2a038ec9bc1a85c1df9c25c2fd07d3886094137ad58b429d4604dad216ce24f\"" Sep 9 04:54:06.880749 containerd[1526]: time="2025-09-09T04:54:06.880722344Z" level=info msg="connecting to shim a2a038ec9bc1a85c1df9c25c2fd07d3886094137ad58b429d4604dad216ce24f" address="unix:///run/containerd/s/afd1bba34de5d8c6a768204285b01b946079f2c5abf26fb3a40f086c523bd07b" protocol=ttrpc version=3 Sep 9 04:54:06.903765 systemd[1]: Started cri-containerd-a2a038ec9bc1a85c1df9c25c2fd07d3886094137ad58b429d4604dad216ce24f.scope - libcontainer container a2a038ec9bc1a85c1df9c25c2fd07d3886094137ad58b429d4604dad216ce24f. 
Sep 9 04:54:06.971990 kubelet[2672]: I0909 04:54:06.971951 2672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:54:07.015430 containerd[1526]: time="2025-09-09T04:54:07.015388503Z" level=info msg="StartContainer for \"a2a038ec9bc1a85c1df9c25c2fd07d3886094137ad58b429d4604dad216ce24f\" returns successfully" Sep 9 04:54:07.244496 systemd-networkd[1434]: calic26950f668d: Gained IPv6LL Sep 9 04:54:07.743712 kubelet[2672]: I0909 04:54:07.743672 2672 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51869746-303b-4256-91b8-4606d0ce74fe" path="/var/lib/kubelet/pods/51869746-303b-4256-91b8-4606d0ce74fe/volumes" Sep 9 04:54:07.803299 kubelet[2672]: I0909 04:54:07.803275 2672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:54:07.884009 containerd[1526]: time="2025-09-09T04:54:07.883963114Z" level=info msg="TaskExit event in podsandbox handler container_id:\"642a9776534dc1f27a518c655b090f100d5119b953238638111dfd14fd5fe45e\" id:\"e52cdd3205700976f8cd751c2c85ce71dafdc65ba6b98cd6782293fa36c10e41\" pid:5619 exited_at:{seconds:1757393647 nanos:882877642}" Sep 9 04:54:07.978098 containerd[1526]: time="2025-09-09T04:54:07.977827684Z" level=info msg="TaskExit event in podsandbox handler container_id:\"642a9776534dc1f27a518c655b090f100d5119b953238638111dfd14fd5fe45e\" id:\"fe8d09123325062c77fde63314e6dd256c52483c067a840028f907d07571cd6f\" pid:5645 exited_at:{seconds:1757393647 nanos:977563846}" Sep 9 04:54:07.993369 kubelet[2672]: I0909 04:54:07.992553 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-gmx58" podStartSLOduration=25.176159699 podStartE2EDuration="30.992534546s" podCreationTimestamp="2025-09-09 04:53:37 +0000 UTC" firstStartedPulling="2025-09-09 04:54:01.025577962 +0000 UTC m=+41.367349089" lastFinishedPulling="2025-09-09 04:54:06.841952809 +0000 UTC m=+47.183723936" observedRunningTime="2025-09-09 04:54:07.991268834 +0000 UTC m=+48.333039961" watchObservedRunningTime="2025-09-09 04:54:07.992534546 +0000 UTC m=+48.334305673"
Sep 9 04:54:08.103360 containerd[1526]: time="2025-09-09T04:54:08.103047097Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:08.103746 containerd[1526]: time="2025-09-09T04:54:08.103711133Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 9 04:54:08.104665 containerd[1526]: time="2025-09-09T04:54:08.104605367Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:08.106570 containerd[1526]: time="2025-09-09T04:54:08.106523554Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:08.107319 containerd[1526]: time="2025-09-09T04:54:08.107277909Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.264902583s" Sep 9 04:54:08.107359 containerd[1526]: time="2025-09-09T04:54:08.107313869Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 9 04:54:08.110625 containerd[1526]: time="2025-09-09T04:54:08.110588887Z" level=info msg="CreateContainer within sandbox \"a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 04:54:08.123666 containerd[1526]: time="2025-09-09T04:54:08.123493962Z" level=info msg="Container 2c569dec60a5797bd206553a3a4cc683642c6375fb38e408d263239c3446c6f6: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:54:08.139610 containerd[1526]: time="2025-09-09T04:54:08.139546977Z" level=info msg="CreateContainer within sandbox \"a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"2c569dec60a5797bd206553a3a4cc683642c6375fb38e408d263239c3446c6f6\"" Sep 9 04:54:08.140156 containerd[1526]: time="2025-09-09T04:54:08.140124893Z" level=info msg="StartContainer for \"2c569dec60a5797bd206553a3a4cc683642c6375fb38e408d263239c3446c6f6\"" Sep 9 04:54:08.141993 containerd[1526]: time="2025-09-09T04:54:08.141956521Z" level=info msg="connecting to shim 2c569dec60a5797bd206553a3a4cc683642c6375fb38e408d263239c3446c6f6" address="unix:///run/containerd/s/f6dc52d460ec59c9c70115c9c444e7f654e620a3ef4b3202cc081fcba467f63c" protocol=ttrpc version=3 Sep 9 04:54:08.169500 systemd[1]: Started cri-containerd-2c569dec60a5797bd206553a3a4cc683642c6375fb38e408d263239c3446c6f6.scope - libcontainer container 2c569dec60a5797bd206553a3a4cc683642c6375fb38e408d263239c3446c6f6.
Sep 9 04:54:08.203158 containerd[1526]: time="2025-09-09T04:54:08.203122078Z" level=info msg="StartContainer for \"2c569dec60a5797bd206553a3a4cc683642c6375fb38e408d263239c3446c6f6\" returns successfully" Sep 9 04:54:08.204690 containerd[1526]: time="2025-09-09T04:54:08.204661388Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 04:54:08.980513 kubelet[2672]: I0909 04:54:08.980481 2672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:54:09.402936 containerd[1526]: time="2025-09-09T04:54:09.402880508Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:09.403605 containerd[1526]: time="2025-09-09T04:54:09.403555143Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 9 04:54:09.404132 containerd[1526]: time="2025-09-09T04:54:09.404105060Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:09.406409 containerd[1526]: time="2025-09-09T04:54:09.406374205Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:54:09.407137 containerd[1526]: time="2025-09-09T04:54:09.407101481Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.202262654s"
Sep 9 04:54:09.407137 containerd[1526]: time="2025-09-09T04:54:09.407136320Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 9 04:54:09.409790 containerd[1526]: time="2025-09-09T04:54:09.409715344Z" level=info msg="CreateContainer within sandbox \"a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 04:54:09.415558 containerd[1526]: time="2025-09-09T04:54:09.415465346Z" level=info msg="Container 6c5248757592e6e49de877eb6dfc6ab8347f5ee6c3518dd29f1424f3001a05b7: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:54:09.424859 containerd[1526]: time="2025-09-09T04:54:09.424800126Z" level=info msg="CreateContainer within sandbox \"a7eb29595913d1f7ea1a98405b5ce0a56bf45e9d193c4f97420be6bf8c10e51c\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"6c5248757592e6e49de877eb6dfc6ab8347f5ee6c3518dd29f1424f3001a05b7\"" Sep 9 04:54:09.425275 containerd[1526]: time="2025-09-09T04:54:09.425250603Z" level=info msg="StartContainer for \"6c5248757592e6e49de877eb6dfc6ab8347f5ee6c3518dd29f1424f3001a05b7\"" Sep 9 04:54:09.426970 containerd[1526]: time="2025-09-09T04:54:09.426893873Z" level=info msg="connecting to shim 6c5248757592e6e49de877eb6dfc6ab8347f5ee6c3518dd29f1424f3001a05b7" address="unix:///run/containerd/s/f6dc52d460ec59c9c70115c9c444e7f654e620a3ef4b3202cc081fcba467f63c" protocol=ttrpc version=3 Sep 9 04:54:09.447580 systemd[1]: Started cri-containerd-6c5248757592e6e49de877eb6dfc6ab8347f5ee6c3518dd29f1424f3001a05b7.scope - libcontainer container 6c5248757592e6e49de877eb6dfc6ab8347f5ee6c3518dd29f1424f3001a05b7.
Sep 9 04:54:09.493746 containerd[1526]: time="2025-09-09T04:54:09.493690121Z" level=info msg="StartContainer for \"6c5248757592e6e49de877eb6dfc6ab8347f5ee6c3518dd29f1424f3001a05b7\" returns successfully" Sep 9 04:54:09.824357 kubelet[2672]: I0909 04:54:09.824311 2672 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 04:54:09.829879 kubelet[2672]: I0909 04:54:09.829846 2672 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 04:54:11.018926 systemd[1]: Started sshd@9-10.0.0.40:22-10.0.0.1:36508.service - OpenSSH per-connection server daemon (10.0.0.1:36508). Sep 9 04:54:11.091641 sshd[5738]: Accepted publickey for core from 10.0.0.1 port 36508 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:11.093245 sshd-session[5738]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:11.097421 systemd-logind[1508]: New session 10 of user core. Sep 9 04:54:11.103500 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 9 04:54:11.339636 sshd[5741]: Connection closed by 10.0.0.1 port 36508 Sep 9 04:54:11.341380 sshd-session[5738]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:11.347263 systemd[1]: sshd@9-10.0.0.40:22-10.0.0.1:36508.service: Deactivated successfully. Sep 9 04:54:11.348944 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 04:54:11.350396 systemd-logind[1508]: Session 10 logged out. Waiting for processes to exit. Sep 9 04:54:11.353192 systemd[1]: Started sshd@10-10.0.0.40:22-10.0.0.1:36510.service - OpenSSH per-connection server daemon (10.0.0.1:36510). Sep 9 04:54:11.353860 systemd-logind[1508]: Removed session 10. 
Sep 9 04:54:11.414814 sshd[5758]: Accepted publickey for core from 10.0.0.1 port 36510 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:11.416096 sshd-session[5758]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:11.419999 systemd-logind[1508]: New session 11 of user core. Sep 9 04:54:11.430478 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 9 04:54:11.595677 sshd[5761]: Connection closed by 10.0.0.1 port 36510 Sep 9 04:54:11.597914 sshd-session[5758]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:11.610426 systemd[1]: sshd@10-10.0.0.40:22-10.0.0.1:36510.service: Deactivated successfully. Sep 9 04:54:11.612758 systemd[1]: session-11.scope: Deactivated successfully. Sep 9 04:54:11.615891 systemd-logind[1508]: Session 11 logged out. Waiting for processes to exit. Sep 9 04:54:11.621427 systemd[1]: Started sshd@11-10.0.0.40:22-10.0.0.1:36522.service - OpenSSH per-connection server daemon (10.0.0.1:36522). Sep 9 04:54:11.622121 systemd-logind[1508]: Removed session 11. Sep 9 04:54:11.674915 sshd[5773]: Accepted publickey for core from 10.0.0.1 port 36522 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:11.676044 sshd-session[5773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:11.680639 systemd-logind[1508]: New session 12 of user core. Sep 9 04:54:11.693476 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 9 04:54:11.864079 sshd[5776]: Connection closed by 10.0.0.1 port 36522 Sep 9 04:54:11.864535 sshd-session[5773]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:11.867988 systemd[1]: sshd@11-10.0.0.40:22-10.0.0.1:36522.service: Deactivated successfully. Sep 9 04:54:11.870248 systemd[1]: session-12.scope: Deactivated successfully. Sep 9 04:54:11.871129 systemd-logind[1508]: Session 12 logged out. Waiting for processes to exit. 
Sep 9 04:54:11.872222 systemd-logind[1508]: Removed session 12. Sep 9 04:54:13.832880 containerd[1526]: time="2025-09-09T04:54:13.832611743Z" level=info msg="TaskExit event in podsandbox handler container_id:\"57461c5fa8d60251ae6bfc3991528dbeef0d89f29afff40f0341402acf2dd3e5\" id:\"3f0424017663e8025f1a5775ce8a74003c40e851cc9101c147d80d17f76c8f1b\" pid:5800 exited_at:{seconds:1757393653 nanos:832382464}" Sep 9 04:54:15.631485 kubelet[2672]: I0909 04:54:15.631445 2672 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:54:15.716395 containerd[1526]: time="2025-09-09T04:54:15.716351905Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a2a038ec9bc1a85c1df9c25c2fd07d3886094137ad58b429d4604dad216ce24f\" id:\"e2f968d85114f8fe410126a3862d12e95e2bec71f216097a3aa95399064cfcf2\" pid:5825 exited_at:{seconds:1757393655 nanos:715886827}" Sep 9 04:54:15.732439 kubelet[2672]: I0909 04:54:15.732383 2672 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-dgbgk" podStartSLOduration=30.336610643 podStartE2EDuration="37.73236101s" podCreationTimestamp="2025-09-09 04:53:38 +0000 UTC" firstStartedPulling="2025-09-09 04:54:02.012092869 +0000 UTC m=+42.353863996" lastFinishedPulling="2025-09-09 04:54:09.407843236 +0000 UTC m=+49.749614363" observedRunningTime="2025-09-09 04:54:10.009592427 +0000 UTC m=+50.351363554" watchObservedRunningTime="2025-09-09 04:54:15.73236101 +0000 UTC m=+56.074132137" Sep 9 04:54:15.797619 containerd[1526]: time="2025-09-09T04:54:15.797541627Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a2a038ec9bc1a85c1df9c25c2fd07d3886094137ad58b429d4604dad216ce24f\" id:\"b39d2d2c21179f8c6e01775725e4ac4ec652233ca02b5b379216844fffeae6d3\" pid:5849 exited_at:{seconds:1757393655 nanos:797130469}" Sep 9 04:54:16.876703 systemd[1]: Started sshd@12-10.0.0.40:22-10.0.0.1:36526.service - OpenSSH per-connection server daemon (10.0.0.1:36526). 
Sep 9 04:54:16.926184 sshd[5871]: Accepted publickey for core from 10.0.0.1 port 36526 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:16.927512 sshd-session[5871]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:16.931526 systemd-logind[1508]: New session 13 of user core. Sep 9 04:54:16.940558 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 9 04:54:17.099672 sshd[5874]: Connection closed by 10.0.0.1 port 36526 Sep 9 04:54:17.100800 sshd-session[5871]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:17.115518 systemd[1]: sshd@12-10.0.0.40:22-10.0.0.1:36526.service: Deactivated successfully. Sep 9 04:54:17.117234 systemd[1]: session-13.scope: Deactivated successfully. Sep 9 04:54:17.118467 systemd-logind[1508]: Session 13 logged out. Waiting for processes to exit. Sep 9 04:54:17.122425 systemd[1]: Started sshd@13-10.0.0.40:22-10.0.0.1:36534.service - OpenSSH per-connection server daemon (10.0.0.1:36534). Sep 9 04:54:17.123291 systemd-logind[1508]: Removed session 13. Sep 9 04:54:17.187357 sshd[5887]: Accepted publickey for core from 10.0.0.1 port 36534 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:17.188655 sshd-session[5887]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:17.192881 systemd-logind[1508]: New session 14 of user core. Sep 9 04:54:17.202513 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 9 04:54:17.409080 sshd[5890]: Connection closed by 10.0.0.1 port 36534 Sep 9 04:54:17.409040 sshd-session[5887]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:17.415795 systemd[1]: sshd@13-10.0.0.40:22-10.0.0.1:36534.service: Deactivated successfully. Sep 9 04:54:17.418920 systemd[1]: session-14.scope: Deactivated successfully. Sep 9 04:54:17.419853 systemd-logind[1508]: Session 14 logged out. Waiting for processes to exit. 
Sep 9 04:54:17.422110 systemd[1]: Started sshd@14-10.0.0.40:22-10.0.0.1:36546.service - OpenSSH per-connection server daemon (10.0.0.1:36546). Sep 9 04:54:17.424882 systemd-logind[1508]: Removed session 14. Sep 9 04:54:17.482681 sshd[5903]: Accepted publickey for core from 10.0.0.1 port 36546 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:17.484069 sshd-session[5903]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:17.488432 systemd-logind[1508]: New session 15 of user core. Sep 9 04:54:17.499476 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 9 04:54:18.127850 sshd[5906]: Connection closed by 10.0.0.1 port 36546 Sep 9 04:54:18.128662 sshd-session[5903]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:18.136619 systemd[1]: sshd@14-10.0.0.40:22-10.0.0.1:36546.service: Deactivated successfully. Sep 9 04:54:18.138623 systemd[1]: session-15.scope: Deactivated successfully. Sep 9 04:54:18.139639 systemd-logind[1508]: Session 15 logged out. Waiting for processes to exit. Sep 9 04:54:18.143151 systemd[1]: Started sshd@15-10.0.0.40:22-10.0.0.1:36554.service - OpenSSH per-connection server daemon (10.0.0.1:36554). Sep 9 04:54:18.146666 systemd-logind[1508]: Removed session 15. Sep 9 04:54:18.204266 sshd[5927]: Accepted publickey for core from 10.0.0.1 port 36554 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:18.205679 sshd-session[5927]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:18.209965 systemd-logind[1508]: New session 16 of user core. Sep 9 04:54:18.222619 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 9 04:54:18.574663 sshd[5930]: Connection closed by 10.0.0.1 port 36554 Sep 9 04:54:18.575022 sshd-session[5927]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:18.589745 systemd[1]: sshd@15-10.0.0.40:22-10.0.0.1:36554.service: Deactivated successfully. 
Sep 9 04:54:18.592274 systemd[1]: session-16.scope: Deactivated successfully. Sep 9 04:54:18.594581 systemd-logind[1508]: Session 16 logged out. Waiting for processes to exit. Sep 9 04:54:18.595810 systemd[1]: Started sshd@16-10.0.0.40:22-10.0.0.1:36570.service - OpenSSH per-connection server daemon (10.0.0.1:36570). Sep 9 04:54:18.597764 systemd-logind[1508]: Removed session 16. Sep 9 04:54:18.658152 sshd[5942]: Accepted publickey for core from 10.0.0.1 port 36570 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:18.659446 sshd-session[5942]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:18.663930 systemd-logind[1508]: New session 17 of user core. Sep 9 04:54:18.673545 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 9 04:54:18.810875 sshd[5945]: Connection closed by 10.0.0.1 port 36570 Sep 9 04:54:18.811402 sshd-session[5942]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:18.814783 systemd[1]: sshd@16-10.0.0.40:22-10.0.0.1:36570.service: Deactivated successfully. Sep 9 04:54:18.816591 systemd[1]: session-17.scope: Deactivated successfully. Sep 9 04:54:18.819077 systemd-logind[1508]: Session 17 logged out. Waiting for processes to exit. Sep 9 04:54:18.820208 systemd-logind[1508]: Removed session 17. 
Sep 9 04:54:19.733956 containerd[1526]: time="2025-09-09T04:54:19.733920898Z" level=info msg="StopPodSandbox for \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\"" Sep 9 04:54:19.834992 containerd[1526]: 2025-09-09 04:54:19.793 [WARNING][5967] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:19.834992 containerd[1526]: 2025-09-09 04:54:19.794 [INFO][5967] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Sep 9 04:54:19.834992 containerd[1526]: 2025-09-09 04:54:19.794 [INFO][5967] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" iface="eth0" netns="" Sep 9 04:54:19.834992 containerd[1526]: 2025-09-09 04:54:19.794 [INFO][5967] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Sep 9 04:54:19.834992 containerd[1526]: 2025-09-09 04:54:19.794 [INFO][5967] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Sep 9 04:54:19.834992 containerd[1526]: 2025-09-09 04:54:19.819 [INFO][5978] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" HandleID="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:19.834992 containerd[1526]: 2025-09-09 04:54:19.819 [INFO][5978] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 9 04:54:19.834992 containerd[1526]: 2025-09-09 04:54:19.819 [INFO][5978] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:54:19.834992 containerd[1526]: 2025-09-09 04:54:19.829 [WARNING][5978] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" HandleID="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:19.834992 containerd[1526]: 2025-09-09 04:54:19.829 [INFO][5978] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" HandleID="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:19.834992 containerd[1526]: 2025-09-09 04:54:19.831 [INFO][5978] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:54:19.834992 containerd[1526]: 2025-09-09 04:54:19.832 [INFO][5967] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef"
Sep 9 04:54:19.835627 containerd[1526]: time="2025-09-09T04:54:19.835077531Z" level=info msg="TearDown network for sandbox \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\" successfully" Sep 9 04:54:19.835627 containerd[1526]: time="2025-09-09T04:54:19.835099771Z" level=info msg="StopPodSandbox for \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\" returns successfully" Sep 9 04:54:19.835692 containerd[1526]: time="2025-09-09T04:54:19.835634928Z" level=info msg="RemovePodSandbox for \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\"" Sep 9 04:54:19.835692 containerd[1526]: time="2025-09-09T04:54:19.835664168Z" level=info msg="Forcibly stopping sandbox \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\"" Sep 9 04:54:19.910985 containerd[1526]: 2025-09-09 04:54:19.874 [WARNING][5996] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" WorkloadEndpoint="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:19.910985 containerd[1526]: 2025-09-09 04:54:19.874 [INFO][5996] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Sep 9 04:54:19.910985 containerd[1526]: 2025-09-09 04:54:19.874 [INFO][5996] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" iface="eth0" netns=""
Sep 9 04:54:19.910985 containerd[1526]: 2025-09-09 04:54:19.874 [INFO][5996] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Sep 9 04:54:19.910985 containerd[1526]: 2025-09-09 04:54:19.874 [INFO][5996] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Sep 9 04:54:19.910985 containerd[1526]: 2025-09-09 04:54:19.893 [INFO][6005] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" HandleID="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:19.910985 containerd[1526]: 2025-09-09 04:54:19.893 [INFO][6005] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:54:19.910985 containerd[1526]: 2025-09-09 04:54:19.893 [INFO][6005] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:54:19.910985 containerd[1526]: 2025-09-09 04:54:19.904 [WARNING][6005] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" HandleID="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0"
Sep 9 04:54:19.910985 containerd[1526]: 2025-09-09 04:54:19.904 [INFO][6005] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" HandleID="k8s-pod-network.1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Workload="localhost-k8s-calico--apiserver--cb85c6f7d--pxpgr-eth0" Sep 9 04:54:19.910985 containerd[1526]: 2025-09-09 04:54:19.906 [INFO][6005] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:54:19.910985 containerd[1526]: 2025-09-09 04:54:19.908 [INFO][5996] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef" Sep 9 04:54:19.911318 containerd[1526]: time="2025-09-09T04:54:19.911045745Z" level=info msg="TearDown network for sandbox \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\" successfully" Sep 9 04:54:19.912649 containerd[1526]: time="2025-09-09T04:54:19.912613576Z" level=info msg="Ensure that sandbox 1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef in task-service has been cleanup successfully" Sep 9 04:54:20.048837 containerd[1526]: time="2025-09-09T04:54:20.048693575Z" level=info msg="RemovePodSandbox \"1fd2634bfcb329f45b6dd071f4cb722d4e3698bd008c045fd5e51dfb180731ef\" returns successfully" Sep 9 04:54:23.828616 systemd[1]: Started sshd@17-10.0.0.40:22-10.0.0.1:45100.service - OpenSSH per-connection server daemon (10.0.0.1:45100).
Sep 9 04:54:23.892858 sshd[6017]: Accepted publickey for core from 10.0.0.1 port 45100 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:23.895073 sshd-session[6017]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:23.901561 systemd-logind[1508]: New session 18 of user core. Sep 9 04:54:23.906493 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 9 04:54:24.081310 sshd[6020]: Connection closed by 10.0.0.1 port 45100 Sep 9 04:54:24.081431 sshd-session[6017]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:24.087690 systemd[1]: sshd@17-10.0.0.40:22-10.0.0.1:45100.service: Deactivated successfully. Sep 9 04:54:24.092002 systemd[1]: session-18.scope: Deactivated successfully. Sep 9 04:54:24.094496 systemd-logind[1508]: Session 18 logged out. Waiting for processes to exit. Sep 9 04:54:24.096452 systemd-logind[1508]: Removed session 18. Sep 9 04:54:29.098847 systemd[1]: Started sshd@18-10.0.0.40:22-10.0.0.1:45102.service - OpenSSH per-connection server daemon (10.0.0.1:45102). Sep 9 04:54:29.164954 sshd[6039]: Accepted publickey for core from 10.0.0.1 port 45102 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:29.166506 sshd-session[6039]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:29.171174 systemd-logind[1508]: New session 19 of user core. Sep 9 04:54:29.180499 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 9 04:54:29.385984 sshd[6042]: Connection closed by 10.0.0.1 port 45102 Sep 9 04:54:29.386272 sshd-session[6039]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:29.390403 systemd[1]: sshd@18-10.0.0.40:22-10.0.0.1:45102.service: Deactivated successfully. Sep 9 04:54:29.392170 systemd[1]: session-19.scope: Deactivated successfully. Sep 9 04:54:29.396368 systemd-logind[1508]: Session 19 logged out. Waiting for processes to exit. 
Sep 9 04:54:29.397497 systemd-logind[1508]: Removed session 19. Sep 9 04:54:34.343970 containerd[1526]: time="2025-09-09T04:54:34.343907700Z" level=info msg="TaskExit event in podsandbox handler container_id:\"57461c5fa8d60251ae6bfc3991528dbeef0d89f29afff40f0341402acf2dd3e5\" id:\"f19a41fc57977f02bdc97981ea94aecccd267617298874e43d8de0827dd4279e\" pid:6065 exited_at:{seconds:1757393674 nanos:343533615}" Sep 9 04:54:34.397838 systemd[1]: Started sshd@19-10.0.0.40:22-10.0.0.1:55980.service - OpenSSH per-connection server daemon (10.0.0.1:55980). Sep 9 04:54:34.458386 sshd[6076]: Accepted publickey for core from 10.0.0.1 port 55980 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:54:34.461639 sshd-session[6076]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:54:34.466201 systemd-logind[1508]: New session 20 of user core. Sep 9 04:54:34.474523 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 9 04:54:34.658112 sshd[6079]: Connection closed by 10.0.0.1 port 55980 Sep 9 04:54:34.658541 sshd-session[6076]: pam_unix(sshd:session): session closed for user core Sep 9 04:54:34.663468 systemd[1]: sshd@19-10.0.0.40:22-10.0.0.1:55980.service: Deactivated successfully. Sep 9 04:54:34.665424 systemd[1]: session-20.scope: Deactivated successfully. Sep 9 04:54:34.666112 systemd-logind[1508]: Session 20 logged out. Waiting for processes to exit. Sep 9 04:54:34.666962 systemd-logind[1508]: Removed session 20.