Sep 12 22:11:39.749552 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 12 22:11:39.749574 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Sep 12 20:38:46 -00 2025
Sep 12 22:11:39.749583 kernel: KASLR enabled
Sep 12 22:11:39.749588 kernel: efi: EFI v2.7 by EDK II
Sep 12 22:11:39.749594 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb832018 ACPI 2.0=0xdbfd0018 RNG=0xdbfd0a18 MEMRESERVE=0xdb838218
Sep 12 22:11:39.749599 kernel: random: crng init done
Sep 12 22:11:39.749606 kernel: secureboot: Secure boot disabled
Sep 12 22:11:39.749612 kernel: ACPI: Early table checksum verification disabled
Sep 12 22:11:39.749618 kernel: ACPI: RSDP 0x00000000DBFD0018 000024 (v02 BOCHS )
Sep 12 22:11:39.749625 kernel: ACPI: XSDT 0x00000000DBFD0F18 000064 (v01 BOCHS BXPC 00000001 01000013)
Sep 12 22:11:39.749631 kernel: ACPI: FACP 0x00000000DBFD0B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:11:39.749637 kernel: ACPI: DSDT 0x00000000DBF0E018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:11:39.749642 kernel: ACPI: APIC 0x00000000DBFD0C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:11:39.749648 kernel: ACPI: PPTT 0x00000000DBFD0098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:11:39.749655 kernel: ACPI: GTDT 0x00000000DBFD0818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:11:39.749662 kernel: ACPI: MCFG 0x00000000DBFD0A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:11:39.749669 kernel: ACPI: SPCR 0x00000000DBFD0918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:11:39.749675 kernel: ACPI: DBG2 0x00000000DBFD0998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:11:39.749681 kernel: ACPI: IORT 0x00000000DBFD0198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:11:39.749698 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600
Sep 12 22:11:39.749705 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 12 22:11:39.749712 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff]
Sep 12 22:11:39.749718 kernel: NODE_DATA(0) allocated [mem 0xdc965a00-0xdc96cfff]
Sep 12 22:11:39.749724 kernel: Zone ranges:
Sep 12 22:11:39.749731 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff]
Sep 12 22:11:39.749739 kernel: DMA32 empty
Sep 12 22:11:39.749746 kernel: Normal empty
Sep 12 22:11:39.749752 kernel: Device empty
Sep 12 22:11:39.749758 kernel: Movable zone start for each node
Sep 12 22:11:39.749764 kernel: Early memory node ranges
Sep 12 22:11:39.749770 kernel: node 0: [mem 0x0000000040000000-0x00000000db81ffff]
Sep 12 22:11:39.749776 kernel: node 0: [mem 0x00000000db820000-0x00000000db82ffff]
Sep 12 22:11:39.749782 kernel: node 0: [mem 0x00000000db830000-0x00000000dc09ffff]
Sep 12 22:11:39.749788 kernel: node 0: [mem 0x00000000dc0a0000-0x00000000dc2dffff]
Sep 12 22:11:39.749794 kernel: node 0: [mem 0x00000000dc2e0000-0x00000000dc36ffff]
Sep 12 22:11:39.749800 kernel: node 0: [mem 0x00000000dc370000-0x00000000dc45ffff]
Sep 12 22:11:39.749806 kernel: node 0: [mem 0x00000000dc460000-0x00000000dc52ffff]
Sep 12 22:11:39.749813 kernel: node 0: [mem 0x00000000dc530000-0x00000000dc5cffff]
Sep 12 22:11:39.749819 kernel: node 0: [mem 0x00000000dc5d0000-0x00000000dce1ffff]
Sep 12 22:11:39.749825 kernel: node 0: [mem 0x00000000dce20000-0x00000000dceaffff]
Sep 12 22:11:39.749833 kernel: node 0: [mem 0x00000000dceb0000-0x00000000dcebffff]
Sep 12 22:11:39.749840 kernel: node 0: [mem 0x00000000dcec0000-0x00000000dcfdffff]
Sep 12 22:11:39.749846 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff]
Sep 12 22:11:39.749853 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff]
Sep 12 22:11:39.749859 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges
Sep 12 22:11:39.749866 kernel: cma: Reserved 16 MiB at 0x00000000d8000000 on node -1
Sep 12 22:11:39.749872 kernel: psci: probing for conduit method from ACPI.
Sep 12 22:11:39.749878 kernel: psci: PSCIv1.1 detected in firmware.
Sep 12 22:11:39.749885 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 12 22:11:39.749891 kernel: psci: Trusted OS migration not required
Sep 12 22:11:39.749897 kernel: psci: SMC Calling Convention v1.1
Sep 12 22:11:39.749904 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 12 22:11:39.749910 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 12 22:11:39.749918 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 12 22:11:39.749925 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Sep 12 22:11:39.749931 kernel: Detected PIPT I-cache on CPU0
Sep 12 22:11:39.749937 kernel: CPU features: detected: GIC system register CPU interface
Sep 12 22:11:39.749944 kernel: CPU features: detected: Spectre-v4
Sep 12 22:11:39.749950 kernel: CPU features: detected: Spectre-BHB
Sep 12 22:11:39.749957 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 12 22:11:39.749963 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 12 22:11:39.749970 kernel: CPU features: detected: ARM erratum 1418040
Sep 12 22:11:39.749976 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 12 22:11:39.749982 kernel: alternatives: applying boot alternatives
Sep 12 22:11:39.749990 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=319fa5fb212e5dd8bf766d2f9f0bbb61d6aa6c81f2813f4b5b49defba0af2b2f
Sep 12 22:11:39.749998 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 22:11:39.750005 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 12 22:11:39.750011 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 22:11:39.750018 kernel: Fallback order for Node 0: 0
Sep 12 22:11:39.750024 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072
Sep 12 22:11:39.750031 kernel: Policy zone: DMA
Sep 12 22:11:39.750037 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 22:11:39.750044 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB
Sep 12 22:11:39.750050 kernel: software IO TLB: area num 4.
Sep 12 22:11:39.750057 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB
Sep 12 22:11:39.750064 kernel: software IO TLB: mapped [mem 0x00000000d7c00000-0x00000000d8000000] (4MB)
Sep 12 22:11:39.750072 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 12 22:11:39.750078 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 22:11:39.750085 kernel: rcu: RCU event tracing is enabled.
Sep 12 22:11:39.750092 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 12 22:11:39.750098 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 22:11:39.750105 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 22:11:39.750112 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 22:11:39.750118 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 12 22:11:39.750125 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 12 22:11:39.750132 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 12 22:11:39.750138 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 12 22:11:39.750146 kernel: GICv3: 256 SPIs implemented
Sep 12 22:11:39.750152 kernel: GICv3: 0 Extended SPIs implemented
Sep 12 22:11:39.750159 kernel: Root IRQ handler: gic_handle_irq
Sep 12 22:11:39.750165 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 12 22:11:39.750172 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Sep 12 22:11:39.750189 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 12 22:11:39.750197 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 12 22:11:39.750203 kernel: ITS@0x0000000008080000: allocated 8192 Devices @40110000 (indirect, esz 8, psz 64K, shr 1)
Sep 12 22:11:39.750210 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @40120000 (flat, esz 8, psz 64K, shr 1)
Sep 12 22:11:39.750216 kernel: GICv3: using LPI property table @0x0000000040130000
Sep 12 22:11:39.750223 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040140000
Sep 12 22:11:39.750229 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 22:11:39.750237 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 12 22:11:39.750244 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 12 22:11:39.750250 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 12 22:11:39.750257 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 12 22:11:39.750263 kernel: arm-pv: using stolen time PV
Sep 12 22:11:39.750270 kernel: Console: colour dummy device 80x25
Sep 12 22:11:39.750276 kernel: ACPI: Core revision 20240827
Sep 12 22:11:39.750283 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 12 22:11:39.750289 kernel: pid_max: default: 32768 minimum: 301
Sep 12 22:11:39.750296 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 22:11:39.750304 kernel: landlock: Up and running.
Sep 12 22:11:39.750310 kernel: SELinux: Initializing.
Sep 12 22:11:39.750316 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 22:11:39.750323 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 22:11:39.750329 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 22:11:39.750336 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 22:11:39.750343 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 12 22:11:39.750349 kernel: Remapping and enabling EFI services.
Sep 12 22:11:39.750356 kernel: smp: Bringing up secondary CPUs ...
Sep 12 22:11:39.750368 kernel: Detected PIPT I-cache on CPU1
Sep 12 22:11:39.750376 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 12 22:11:39.750383 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040150000
Sep 12 22:11:39.750391 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 12 22:11:39.750398 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 12 22:11:39.750405 kernel: Detected PIPT I-cache on CPU2
Sep 12 22:11:39.750413 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Sep 12 22:11:39.750420 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040160000
Sep 12 22:11:39.750429 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 12 22:11:39.750435 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Sep 12 22:11:39.750442 kernel: Detected PIPT I-cache on CPU3
Sep 12 22:11:39.750449 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Sep 12 22:11:39.750456 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040170000
Sep 12 22:11:39.750463 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 12 22:11:39.750470 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Sep 12 22:11:39.750477 kernel: smp: Brought up 1 node, 4 CPUs
Sep 12 22:11:39.750484 kernel: SMP: Total of 4 processors activated.
Sep 12 22:11:39.750492 kernel: CPU: All CPU(s) started at EL1
Sep 12 22:11:39.750498 kernel: CPU features: detected: 32-bit EL0 Support
Sep 12 22:11:39.750505 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 12 22:11:39.750512 kernel: CPU features: detected: Common not Private translations
Sep 12 22:11:39.750519 kernel: CPU features: detected: CRC32 instructions
Sep 12 22:11:39.750526 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 12 22:11:39.750533 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 12 22:11:39.750540 kernel: CPU features: detected: LSE atomic instructions
Sep 12 22:11:39.750547 kernel: CPU features: detected: Privileged Access Never
Sep 12 22:11:39.750554 kernel: CPU features: detected: RAS Extension Support
Sep 12 22:11:39.750562 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 12 22:11:39.750570 kernel: alternatives: applying system-wide alternatives
Sep 12 22:11:39.750578 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Sep 12 22:11:39.750585 kernel: Memory: 2424480K/2572288K available (11136K kernel code, 2440K rwdata, 9068K rodata, 38976K init, 1038K bss, 125472K reserved, 16384K cma-reserved)
Sep 12 22:11:39.750592 kernel: devtmpfs: initialized
Sep 12 22:11:39.750600 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 22:11:39.750607 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 12 22:11:39.750614 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 12 22:11:39.750623 kernel: 0 pages in range for non-PLT usage
Sep 12 22:11:39.750630 kernel: 508560 pages in range for PLT usage
Sep 12 22:11:39.750637 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 22:11:39.750644 kernel: SMBIOS 3.0.0 present.
Sep 12 22:11:39.750651 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022
Sep 12 22:11:39.750658 kernel: DMI: Memory slots populated: 1/1
Sep 12 22:11:39.750665 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 22:11:39.750672 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 12 22:11:39.750679 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 12 22:11:39.750691 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 12 22:11:39.750700 kernel: audit: initializing netlink subsys (disabled)
Sep 12 22:11:39.750708 kernel: audit: type=2000 audit(0.020:1): state=initialized audit_enabled=0 res=1
Sep 12 22:11:39.750715 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 22:11:39.750722 kernel: cpuidle: using governor menu
Sep 12 22:11:39.750729 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 12 22:11:39.750735 kernel: ASID allocator initialised with 32768 entries
Sep 12 22:11:39.750742 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 22:11:39.750749 kernel: Serial: AMBA PL011 UART driver
Sep 12 22:11:39.750756 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 22:11:39.750764 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 22:11:39.750771 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 12 22:11:39.750778 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 12 22:11:39.750785 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 22:11:39.750792 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 22:11:39.750799 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 12 22:11:39.750806 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 12 22:11:39.750813 kernel: ACPI: Added _OSI(Module Device)
Sep 12 22:11:39.750820 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 22:11:39.750828 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 22:11:39.750835 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 22:11:39.750842 kernel: ACPI: Interpreter enabled
Sep 12 22:11:39.750850 kernel: ACPI: Using GIC for interrupt routing
Sep 12 22:11:39.750857 kernel: ACPI: MCFG table detected, 1 entries
Sep 12 22:11:39.750863 kernel: ACPI: CPU0 has been hot-added
Sep 12 22:11:39.750870 kernel: ACPI: CPU1 has been hot-added
Sep 12 22:11:39.750877 kernel: ACPI: CPU2 has been hot-added
Sep 12 22:11:39.750884 kernel: ACPI: CPU3 has been hot-added
Sep 12 22:11:39.750892 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 12 22:11:39.750899 kernel: printk: legacy console [ttyAMA0] enabled
Sep 12 22:11:39.750907 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 12 22:11:39.751046 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 22:11:39.751115 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 12 22:11:39.751177 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 12 22:11:39.751268 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 12 22:11:39.751330 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 12 22:11:39.751339 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 12 22:11:39.751347 kernel: PCI host bridge to bus 0000:00
Sep 12 22:11:39.751415 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 12 22:11:39.751474 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 12 22:11:39.751528 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 12 22:11:39.751580 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 12 22:11:39.751664 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Sep 12 22:11:39.751749 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 12 22:11:39.751815 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f]
Sep 12 22:11:39.751878 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]
Sep 12 22:11:39.751940 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 12 22:11:39.752001 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Sep 12 22:11:39.752059 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned
Sep 12 22:11:39.752120 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned
Sep 12 22:11:39.752174 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 12 22:11:39.752239 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 12 22:11:39.752298 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 12 22:11:39.752307 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 12 22:11:39.752315 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 12 22:11:39.752323 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 12 22:11:39.752332 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 12 22:11:39.752339 kernel: iommu: Default domain type: Translated
Sep 12 22:11:39.752346 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 12 22:11:39.752353 kernel: efivars: Registered efivars operations
Sep 12 22:11:39.752360 kernel: vgaarb: loaded
Sep 12 22:11:39.752367 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 12 22:11:39.752374 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 22:11:39.752381 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 22:11:39.752387 kernel: pnp: PnP ACPI init
Sep 12 22:11:39.752456 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Sep 12 22:11:39.752466 kernel: pnp: PnP ACPI: found 1 devices
Sep 12 22:11:39.752473 kernel: NET: Registered PF_INET protocol family
Sep 12 22:11:39.752480 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 22:11:39.752487 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 12 22:11:39.752494 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 22:11:39.752502 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 12 22:11:39.752509 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 12 22:11:39.752517 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 12 22:11:39.752526 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 22:11:39.752534 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 22:11:39.752541 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 22:11:39.752548 kernel: PCI: CLS 0 bytes, default 64
Sep 12 22:11:39.752555 kernel: kvm [1]: HYP mode not available
Sep 12 22:11:39.752562 kernel: Initialise system trusted keyrings
Sep 12 22:11:39.752569 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 12 22:11:39.752576 kernel: Key type asymmetric registered
Sep 12 22:11:39.752583 kernel: Asymmetric key parser 'x509' registered
Sep 12 22:11:39.752592 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 12 22:11:39.752599 kernel: io scheduler mq-deadline registered
Sep 12 22:11:39.752606 kernel: io scheduler kyber registered
Sep 12 22:11:39.752614 kernel: io scheduler bfq registered
Sep 12 22:11:39.752621 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Sep 12 22:11:39.752628 kernel: ACPI: button: Power Button [PWRB]
Sep 12 22:11:39.752635 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Sep 12 22:11:39.752708 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007)
Sep 12 22:11:39.752718 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 12 22:11:39.752727 kernel: thunder_xcv, ver 1.0
Sep 12 22:11:39.752734 kernel: thunder_bgx, ver 1.0
Sep 12 22:11:39.752741 kernel: nicpf, ver 1.0
Sep 12 22:11:39.752748 kernel: nicvf, ver 1.0
Sep 12 22:11:39.752817 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 12 22:11:39.752874 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-12T22:11:39 UTC (1757715099)
Sep 12 22:11:39.752883 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 12 22:11:39.752890 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 12 22:11:39.752899 kernel: watchdog: NMI not fully supported
Sep 12 22:11:39.752906 kernel: watchdog: Hard watchdog permanently disabled
Sep 12 22:11:39.752913 kernel: NET: Registered PF_INET6 protocol family
Sep 12 22:11:39.752920 kernel: Segment Routing with IPv6
Sep 12 22:11:39.752927 kernel: In-situ OAM (IOAM) with IPv6
Sep 12 22:11:39.752934 kernel: NET: Registered PF_PACKET protocol family
Sep 12 22:11:39.752941 kernel: Key type dns_resolver registered
Sep 12 22:11:39.752948 kernel: registered taskstats version 1
Sep 12 22:11:39.752954 kernel: Loading compiled-in X.509 certificates
Sep 12 22:11:39.752963 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: 2d7730e6d35b3fbd1c590cd72a2500b2380c020e'
Sep 12 22:11:39.752970 kernel: Demotion targets for Node 0: null
Sep 12 22:11:39.752977 kernel: Key type .fscrypt registered
Sep 12 22:11:39.752984 kernel: Key type fscrypt-provisioning registered
Sep 12 22:11:39.752991 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 12 22:11:39.752998 kernel: ima: Allocated hash algorithm: sha1
Sep 12 22:11:39.753005 kernel: ima: No architecture policies found
Sep 12 22:11:39.753012 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 12 22:11:39.753019 kernel: clk: Disabling unused clocks
Sep 12 22:11:39.753027 kernel: PM: genpd: Disabling unused power domains
Sep 12 22:11:39.753034 kernel: Warning: unable to open an initial console.
Sep 12 22:11:39.753042 kernel: Freeing unused kernel memory: 38976K
Sep 12 22:11:39.753049 kernel: Run /init as init process
Sep 12 22:11:39.753056 kernel: with arguments:
Sep 12 22:11:39.753063 kernel: /init
Sep 12 22:11:39.753070 kernel: with environment:
Sep 12 22:11:39.753077 kernel: HOME=/
Sep 12 22:11:39.753084 kernel: TERM=linux
Sep 12 22:11:39.753093 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 12 22:11:39.753102 systemd[1]: Successfully made /usr/ read-only.
Sep 12 22:11:39.753112 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 22:11:39.753120 systemd[1]: Detected virtualization kvm.
Sep 12 22:11:39.753128 systemd[1]: Detected architecture arm64.
Sep 12 22:11:39.753135 systemd[1]: Running in initrd.
Sep 12 22:11:39.753143 systemd[1]: No hostname configured, using default hostname.
Sep 12 22:11:39.753153 systemd[1]: Hostname set to .
Sep 12 22:11:39.753161 systemd[1]: Initializing machine ID from VM UUID.
Sep 12 22:11:39.753168 systemd[1]: Queued start job for default target initrd.target.
Sep 12 22:11:39.753176 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 22:11:39.753193 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 22:11:39.753201 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 12 22:11:39.753209 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 22:11:39.753217 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 12 22:11:39.753227 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 12 22:11:39.753235 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 12 22:11:39.753244 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 12 22:11:39.753251 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 22:11:39.753259 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 22:11:39.753267 systemd[1]: Reached target paths.target - Path Units.
Sep 12 22:11:39.753274 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 22:11:39.753283 systemd[1]: Reached target swap.target - Swaps.
Sep 12 22:11:39.753291 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 22:11:39.753298 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 22:11:39.753307 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 22:11:39.753315 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 12 22:11:39.753322 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 12 22:11:39.753330 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 22:11:39.753338 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 22:11:39.753347 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 22:11:39.753355 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 22:11:39.753363 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 12 22:11:39.753370 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 22:11:39.753378 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 12 22:11:39.753385 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 12 22:11:39.753393 systemd[1]: Starting systemd-fsck-usr.service...
Sep 12 22:11:39.753401 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 22:11:39.753409 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 22:11:39.753418 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 22:11:39.753426 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 12 22:11:39.753434 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 22:11:39.753442 systemd[1]: Finished systemd-fsck-usr.service.
Sep 12 22:11:39.753467 systemd-journald[245]: Collecting audit messages is disabled.
Sep 12 22:11:39.753485 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 22:11:39.753494 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 22:11:39.753502 systemd-journald[245]: Journal started
Sep 12 22:11:39.753523 systemd-journald[245]: Runtime Journal (/run/log/journal/f8795c7bd0ee493aa7a60a2efb74b167) is 6M, max 48.5M, 42.4M free.
Sep 12 22:11:39.758306 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 12 22:11:39.744879 systemd-modules-load[246]: Inserted module 'overlay'
Sep 12 22:11:39.761045 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 22:11:39.761610 systemd-modules-load[246]: Inserted module 'br_netfilter'
Sep 12 22:11:39.762884 kernel: Bridge firewalling registered
Sep 12 22:11:39.762898 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 22:11:39.773307 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 22:11:39.774366 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 22:11:39.779806 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 22:11:39.781780 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 22:11:39.789053 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 22:11:39.792402 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 22:11:39.795291 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 12 22:11:39.796320 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 22:11:39.798627 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 22:11:39.800735 systemd-tmpfiles[282]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 12 22:11:39.803868 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 22:11:39.806615 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 22:11:39.816579 dracut-cmdline[286]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=319fa5fb212e5dd8bf766d2f9f0bbb61d6aa6c81f2813f4b5b49defba0af2b2f
Sep 12 22:11:39.846253 systemd-resolved[295]: Positive Trust Anchors:
Sep 12 22:11:39.846272 systemd-resolved[295]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 22:11:39.846304 systemd-resolved[295]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 22:11:39.851210 systemd-resolved[295]: Defaulting to hostname 'linux'.
Sep 12 22:11:39.853427 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 22:11:39.856152 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 22:11:39.888205 kernel: SCSI subsystem initialized
Sep 12 22:11:39.892193 kernel: Loading iSCSI transport class v2.0-870.
Sep 12 22:11:39.900229 kernel: iscsi: registered transport (tcp)
Sep 12 22:11:39.912194 kernel: iscsi: registered transport (qla4xxx)
Sep 12 22:11:39.912217 kernel: QLogic iSCSI HBA Driver
Sep 12 22:11:39.929914 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 22:11:39.945853 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 22:11:39.948396 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 22:11:39.991795 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 12 22:11:39.993963 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 12 22:11:40.057217 kernel: raid6: neonx8 gen() 15771 MB/s Sep 12 22:11:40.074205 kernel: raid6: neonx4 gen() 15766 MB/s Sep 12 22:11:40.091199 kernel: raid6: neonx2 gen() 13224 MB/s Sep 12 22:11:40.108204 kernel: raid6: neonx1 gen() 10313 MB/s Sep 12 22:11:40.125201 kernel: raid6: int64x8 gen() 6890 MB/s Sep 12 22:11:40.142200 kernel: raid6: int64x4 gen() 7344 MB/s Sep 12 22:11:40.159200 kernel: raid6: int64x2 gen() 6099 MB/s Sep 12 22:11:40.176205 kernel: raid6: int64x1 gen() 5044 MB/s Sep 12 22:11:40.176227 kernel: raid6: using algorithm neonx8 gen() 15771 MB/s Sep 12 22:11:40.193207 kernel: raid6: .... xor() 12032 MB/s, rmw enabled Sep 12 22:11:40.193222 kernel: raid6: using neon recovery algorithm Sep 12 22:11:40.198277 kernel: xor: measuring software checksum speed Sep 12 22:11:40.198300 kernel: 8regs : 21584 MB/sec Sep 12 22:11:40.199346 kernel: 32regs : 21687 MB/sec Sep 12 22:11:40.199359 kernel: arm64_neon : 27993 MB/sec Sep 12 22:11:40.199368 kernel: xor: using function: arm64_neon (27993 MB/sec) Sep 12 22:11:40.252239 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 22:11:40.259255 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 22:11:40.261503 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:11:40.284560 systemd-udevd[501]: Using default interface naming scheme 'v255'. Sep 12 22:11:40.288628 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:11:40.290327 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 22:11:40.317337 dracut-pre-trigger[509]: rd.md=0: removing MD RAID activation Sep 12 22:11:40.338902 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 22:11:40.340979 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 22:11:40.393941 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. 
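The raid6 and xor lines above show the kernel benchmarking each available implementation at boot and picking the fastest (here neonx8 for raid6 gen() and arm64_neon for xor). The selection logic amounts to a max over the measured throughputs; a sketch using the numbers from this boot:

```python
# Throughputs (MB/s) as logged by the raid6 benchmark above.
benchmarks = {
    "neonx8": 15771, "neonx4": 15766, "neonx2": 13224, "neonx1": 10313,
    "int64x8": 6890, "int64x4": 7344, "int64x2": 6099, "int64x1": 5044,
}

# Pick the algorithm with the highest measured gen() throughput.
best = max(benchmarks, key=benchmarks.get)
print(f"raid6: using algorithm {best} gen() {benchmarks[best]} MB/s")
```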
Sep 12 22:11:40.396514 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 22:11:40.445306 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues Sep 12 22:11:40.445463 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 12 22:11:40.452204 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:11:40.452327 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:11:40.457830 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 12 22:11:40.457850 kernel: GPT:9289727 != 19775487 Sep 12 22:11:40.457863 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 12 22:11:40.457872 kernel: GPT:9289727 != 19775487 Sep 12 22:11:40.458371 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 12 22:11:40.457992 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:11:40.459914 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:11:40.462077 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 22:11:40.488058 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 12 22:11:40.489397 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 22:11:40.492272 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:11:40.507046 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 12 22:11:40.519868 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 12 22:11:40.526426 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 12 22:11:40.527365 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. 
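The "GPT:9289727 != 19775487" warnings above are the usual signature of a disk image written to a larger virtual disk: the primary GPT header still records the backup header's old location rather than the disk's true last LBA. (The disk-uuid entries further on show the headers being rewritten.) The arithmetic, using the numbers logged above:

```python
# Values from the virtio_blk and GPT messages in this log.
total_blocks = 19775488              # [vda] 19775488 512-byte logical blocks
expected_alt_lba = total_blocks - 1  # backup GPT header belongs in the last LBA
recorded_alt_lba = 9289727           # alternate-header LBA in the primary header

# The gap is how much the disk grew after the image was built.
grown_bytes = (expected_alt_lba - recorded_alt_lba) * 512
print(f"{recorded_alt_lba} != {expected_alt_lba}: "
      f"disk is {grown_bytes / 2**30:.1f} GiB larger than the original image")
```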
Sep 12 22:11:40.529736 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 22:11:40.531468 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:11:40.533191 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 22:11:40.535658 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 22:11:40.537228 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 22:11:40.554313 disk-uuid[591]: Primary Header is updated. Sep 12 22:11:40.554313 disk-uuid[591]: Secondary Entries is updated. Sep 12 22:11:40.554313 disk-uuid[591]: Secondary Header is updated. Sep 12 22:11:40.556958 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 22:11:40.560203 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 22:11:40.563200 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 22:11:41.565227 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 22:11:41.565313 disk-uuid[596]: The operation has completed successfully. Sep 12 22:11:41.593448 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 22:11:41.594247 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 22:11:41.620662 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 22:11:41.637160 sh[610]: Success Sep 12 22:11:41.649852 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 22:11:41.649900 kernel: device-mapper: uevent: version 1.0.3 Sep 12 22:11:41.649921 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 22:11:41.656200 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Sep 12 22:11:41.683804 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. 
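verity-setup (logged just below, with "verity: sha256 using shash") activates /dev/mapper/usr with dm-verity, which authenticates each data block of the read-only /usr partition against a precomputed sha256 hash tree. A toy illustration of the per-block hashing it builds on, not veritysetup's actual on-disk format:

```python
import hashlib

BLOCK_SIZE = 4096  # dm-verity's default data block size

def block_hashes(data: bytes) -> list:
    """Hash each fixed-size block; dm-verity builds a Merkle tree over these
    and the kernel rejects any block whose hash fails verification."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]

hashes = block_hashes(b"\x00" * (2 * BLOCK_SIZE))
print(len(hashes))  # 2
```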
Sep 12 22:11:41.686428 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 22:11:41.698373 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 22:11:41.702223 kernel: BTRFS: device fsid 254e43f1-b609-42b8-bcc5-437252095415 devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (623) Sep 12 22:11:41.702260 kernel: BTRFS info (device dm-0): first mount of filesystem 254e43f1-b609-42b8-bcc5-437252095415 Sep 12 22:11:41.703767 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 12 22:11:41.707319 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 22:11:41.707354 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 22:11:41.708271 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 22:11:41.709279 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 22:11:41.710330 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 22:11:41.711035 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 12 22:11:41.713845 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Sep 12 22:11:41.742810 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (654) Sep 12 22:11:41.742858 kernel: BTRFS info (device vda6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:11:41.742869 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 22:11:41.745705 kernel: BTRFS info (device vda6): turning on async discard Sep 12 22:11:41.745743 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 22:11:41.750191 kernel: BTRFS info (device vda6): last unmount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:11:41.751288 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 22:11:41.752935 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 22:11:41.819083 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 22:11:41.821921 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Sep 12 22:11:41.855448 ignition[697]: Ignition 2.22.0 Sep 12 22:11:41.855463 ignition[697]: Stage: fetch-offline Sep 12 22:11:41.855497 ignition[697]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:11:41.855504 ignition[697]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 22:11:41.856252 ignition[697]: parsed url from cmdline: "" Sep 12 22:11:41.856257 ignition[697]: no config URL provided Sep 12 22:11:41.856264 ignition[697]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 22:11:41.856276 ignition[697]: no config at "/usr/lib/ignition/user.ign" Sep 12 22:11:41.856305 ignition[697]: op(1): [started] loading QEMU firmware config module Sep 12 22:11:41.856310 ignition[697]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 12 22:11:41.861265 ignition[697]: op(1): [finished] loading QEMU firmware config module Sep 12 22:11:41.867335 systemd-networkd[801]: lo: Link UP Sep 12 22:11:41.867346 systemd-networkd[801]: lo: Gained carrier Sep 12 22:11:41.868317 systemd-networkd[801]: Enumeration completed Sep 12 22:11:41.868428 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 22:11:41.868835 systemd-networkd[801]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:11:41.868839 systemd-networkd[801]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 22:11:41.870034 systemd-networkd[801]: eth0: Link UP Sep 12 22:11:41.870202 systemd[1]: Reached target network.target - Network. Sep 12 22:11:41.870392 systemd-networkd[801]: eth0: Gained carrier Sep 12 22:11:41.870402 systemd-networkd[801]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Sep 12 22:11:41.897226 systemd-networkd[801]: eth0: DHCPv4 address 10.0.0.68/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 12 22:11:41.911789 ignition[697]: parsing config with SHA512: 6d324948da5a9169f4c283a1e7ea60c57098cbfafffbcd2d6e350cb1a6d2ec93293de99020a978fee5db2a293be465bdff06820873bffb97689f3b31e5276872 Sep 12 22:11:41.916113 unknown[697]: fetched base config from "system" Sep 12 22:11:41.916817 unknown[697]: fetched user config from "qemu" Sep 12 22:11:41.917293 ignition[697]: fetch-offline: fetch-offline passed Sep 12 22:11:41.917349 ignition[697]: Ignition finished successfully Sep 12 22:11:41.918949 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 22:11:41.920283 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 12 22:11:41.921020 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 22:11:41.951857 ignition[809]: Ignition 2.22.0 Sep 12 22:11:41.951872 ignition[809]: Stage: kargs Sep 12 22:11:41.952015 ignition[809]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:11:41.952025 ignition[809]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 22:11:41.952795 ignition[809]: kargs: kargs passed Sep 12 22:11:41.955814 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 22:11:41.952839 ignition[809]: Ignition finished successfully Sep 12 22:11:41.957723 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 22:11:41.986932 ignition[817]: Ignition 2.22.0 Sep 12 22:11:41.986948 ignition[817]: Stage: disks Sep 12 22:11:41.987084 ignition[817]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:11:41.987092 ignition[817]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 22:11:41.987814 ignition[817]: disks: disks passed Sep 12 22:11:41.989619 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
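Before parsing, Ignition logs a SHA512 digest of the raw config bytes (the "parsing config with SHA512: ..." entry above), which identifies the exact config a boot ran with. The digest is a plain sha512 over the config bytes; illustrated here with a made-up minimal config, not the one from this boot:

```python
import hashlib

# Hypothetical minimal Ignition config body, for illustration only.
config = b'{"ignition": {"version": "3.4.0"}}'
digest = hashlib.sha512(config).hexdigest()
print(f"parsing config with SHA512: {digest}")
```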
Sep 12 22:11:41.987861 ignition[817]: Ignition finished successfully Sep 12 22:11:41.990721 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 22:11:41.991826 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 22:11:41.993349 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 22:11:41.994621 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 22:11:41.996124 systemd[1]: Reached target basic.target - Basic System. Sep 12 22:11:41.998751 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 22:11:42.030075 systemd-fsck[828]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 12 22:11:42.035121 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 22:11:42.042374 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 22:11:42.113210 kernel: EXT4-fs (vda9): mounted filesystem a7b592ec-3c41-4dc2-88a7-056c1f18b418 r/w with ordered data mode. Quota mode: none. Sep 12 22:11:42.113915 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 22:11:42.115100 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 22:11:42.117162 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 22:11:42.118597 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 22:11:42.119447 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 12 22:11:42.119488 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 22:11:42.119510 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 22:11:42.129980 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
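The systemd-fsck summary above uses e2fsck's used/total notation ("15/553520 files, 52789/553472 blocks"). Converting those fractions to utilization shows a nearly empty, freshly prepared ROOT filesystem:

```python
# e2fsck summary from the log: "clean, 15/553520 files, 52789/553472 blocks"
files_used, files_total = 15, 553520
blocks_used, blocks_total = 52789, 553472

print(f"inodes {files_used / files_total:.3%} used, "
      f"blocks {blocks_used / blocks_total:.2%} used")
```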
Sep 12 22:11:42.132251 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 22:11:42.135201 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (836) Sep 12 22:11:42.137197 kernel: BTRFS info (device vda6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:11:42.137225 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 22:11:42.140745 kernel: BTRFS info (device vda6): turning on async discard Sep 12 22:11:42.140778 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 22:11:42.141957 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 22:11:42.168861 initrd-setup-root[860]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 22:11:42.173426 initrd-setup-root[867]: cut: /sysroot/etc/group: No such file or directory Sep 12 22:11:42.177048 initrd-setup-root[874]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 22:11:42.180833 initrd-setup-root[881]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 22:11:42.250136 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 22:11:42.252173 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 22:11:42.253560 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 22:11:42.272204 kernel: BTRFS info (device vda6): last unmount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:11:42.286237 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Sep 12 22:11:42.302207 ignition[950]: INFO : Ignition 2.22.0 Sep 12 22:11:42.302207 ignition[950]: INFO : Stage: mount Sep 12 22:11:42.303593 ignition[950]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:11:42.303593 ignition[950]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 22:11:42.303593 ignition[950]: INFO : mount: mount passed Sep 12 22:11:42.303593 ignition[950]: INFO : Ignition finished successfully Sep 12 22:11:42.305751 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 22:11:42.307928 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 22:11:42.844270 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 22:11:42.845762 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 22:11:42.865185 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (964) Sep 12 22:11:42.865223 kernel: BTRFS info (device vda6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0 Sep 12 22:11:42.865234 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 22:11:42.871203 kernel: BTRFS info (device vda6): turning on async discard Sep 12 22:11:42.871226 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 22:11:42.872484 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 12 22:11:42.902259 ignition[981]: INFO : Ignition 2.22.0 Sep 12 22:11:42.902259 ignition[981]: INFO : Stage: files Sep 12 22:11:42.903831 ignition[981]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:11:42.903831 ignition[981]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 22:11:42.903831 ignition[981]: DEBUG : files: compiled without relabeling support, skipping Sep 12 22:11:42.906693 ignition[981]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 22:11:42.906693 ignition[981]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 22:11:42.909608 ignition[981]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 22:11:42.909608 ignition[981]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 22:11:42.909608 ignition[981]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 22:11:42.909608 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Sep 12 22:11:42.909608 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Sep 12 22:11:42.907705 unknown[981]: wrote ssh authorized keys file for user: core Sep 12 22:11:42.950707 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 22:11:43.317566 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Sep 12 22:11:43.317566 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 22:11:43.321102 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 12 22:11:43.321102 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 22:11:43.321102 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 22:11:43.321102 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 22:11:43.321102 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 22:11:43.321102 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 22:11:43.321102 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 22:11:43.331878 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 22:11:43.331878 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 22:11:43.331878 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 12 22:11:43.331878 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 12 22:11:43.331878 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 12 22:11:43.331878 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1
Sep 12 22:11:43.347292 systemd-networkd[801]: eth0: Gained IPv6LL Sep 12 22:11:43.772266 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 22:11:44.190462 ignition[981]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 12 22:11:44.190462 ignition[981]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 22:11:44.194728 ignition[981]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 22:11:44.194728 ignition[981]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 22:11:44.194728 ignition[981]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 22:11:44.194728 ignition[981]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 12 22:11:44.194728 ignition[981]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 12 22:11:44.194728 ignition[981]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 12 22:11:44.194728 ignition[981]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 12 22:11:44.194728 ignition[981]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Sep 12 22:11:44.209380 ignition[981]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 12 22:11:44.212449 ignition[981]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 12 22:11:44.214016 ignition[981]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 12 22:11:44.214016 ignition[981]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Sep 12 22:11:44.214016 ignition[981]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 22:11:44.214016 ignition[981]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 22:11:44.214016 ignition[981]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 22:11:44.214016 ignition[981]: INFO : files: files passed Sep 12 22:11:44.214016 ignition[981]: INFO : Ignition finished successfully Sep 12 22:11:44.215848 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 22:11:44.220430 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 22:11:44.223604 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 22:11:44.234214 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 22:11:44.235141 initrd-setup-root-after-ignition[1011]: grep: /sysroot/oem/oem-release: No such file or directory Sep 12 22:11:44.236232 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 22:11:44.238568 initrd-setup-root-after-ignition[1013]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 22:11:44.238568 initrd-setup-root-after-ignition[1013]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 22:11:44.241351 initrd-setup-root-after-ignition[1017]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 22:11:44.241231 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
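The op(f)/op(10) and op(11) entries above apply systemd preset policy inside /sysroot: enabling a unit writes an enablement symlink into a .wants directory, disabling removes it. A simplified sketch, with a temporary directory standing in for /sysroot and multi-user.target.wants assumed as the enablement target (real units may enable into other targets, per their [Install] section):

```python
import os
import tempfile

def apply_preset(sysroot: str, unit: str, enabled: bool) -> None:
    """Create or remove an enablement symlink, roughly what Ignition's
    preset ops do against the target filesystem."""
    wants = os.path.join(sysroot, "etc/systemd/system/multi-user.target.wants")
    os.makedirs(wants, exist_ok=True)
    link = os.path.join(wants, unit)
    if enabled and not os.path.lexists(link):
        os.symlink(os.path.join("/etc/systemd/system", unit), link)
    elif not enabled and os.path.lexists(link):
        os.remove(link)

sysroot = tempfile.mkdtemp()  # stand-in for /sysroot
apply_preset(sysroot, "prepare-helm.service", enabled=True)     # op(11)
apply_preset(sysroot, "coreos-metadata.service", enabled=False) # op(f)
```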
Sep 12 22:11:44.242775 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 22:11:44.246023 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 22:11:44.280394 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 22:11:44.280514 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 22:11:44.282563 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 22:11:44.284395 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 22:11:44.286072 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 22:11:44.286808 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 22:11:44.310285 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 22:11:44.312682 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 22:11:44.331922 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:11:44.333240 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:11:44.335127 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 22:11:44.336832 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 22:11:44.336956 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 22:11:44.339251 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 22:11:44.341512 systemd[1]: Stopped target basic.target - Basic System. Sep 12 22:11:44.342972 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 22:11:44.344569 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 22:11:44.346389 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. 
Sep 12 22:11:44.348158 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 22:11:44.349933 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 22:11:44.351693 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 22:11:44.353271 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 22:11:44.354894 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 22:11:44.356228 systemd[1]: Stopped target swap.target - Swaps. Sep 12 22:11:44.357459 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 22:11:44.357586 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 22:11:44.359390 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:11:44.360902 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:11:44.362399 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 22:11:44.363897 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:11:44.366242 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 22:11:44.366408 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 22:11:44.368687 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 22:11:44.368866 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 22:11:44.370760 systemd[1]: Stopped target paths.target - Path Units. Sep 12 22:11:44.372164 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 22:11:44.372292 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:11:44.374022 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 22:11:44.375508 systemd[1]: Stopped target sockets.target - Socket Units. 
Sep 12 22:11:44.377792 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 22:11:44.377875 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 22:11:44.379212 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 22:11:44.379294 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 22:11:44.380897 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 22:11:44.381049 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 22:11:44.382395 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 22:11:44.382552 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 22:11:44.384778 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 22:11:44.385848 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 22:11:44.386026 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:11:44.401897 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 22:11:44.402797 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 22:11:44.402999 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:11:44.404732 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 22:11:44.404877 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 22:11:44.412465 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 22:11:44.414217 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 22:11:44.416417 systemd[1]: sysroot-boot.mount: Deactivated successfully. 
Sep 12 22:11:44.418613 ignition[1037]: INFO : Ignition 2.22.0 Sep 12 22:11:44.418613 ignition[1037]: INFO : Stage: umount Sep 12 22:11:44.420945 ignition[1037]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:11:44.420945 ignition[1037]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 22:11:44.420945 ignition[1037]: INFO : umount: umount passed Sep 12 22:11:44.420945 ignition[1037]: INFO : Ignition finished successfully Sep 12 22:11:44.422520 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 22:11:44.422624 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 22:11:44.424302 systemd[1]: Stopped target network.target - Network. Sep 12 22:11:44.425640 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 22:11:44.425714 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 22:11:44.427102 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 22:11:44.427146 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 22:11:44.428761 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 22:11:44.428810 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 22:11:44.429829 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 22:11:44.429870 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 22:11:44.431539 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 22:11:44.432993 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 22:11:44.441286 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 22:11:44.441416 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 22:11:44.445129 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 22:11:44.445391 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. 
Sep 12 22:11:44.445426 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 22:11:44.448984 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 12 22:11:44.449250 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 12 22:11:44.449340 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 12 22:11:44.452527 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 12 22:11:44.452903 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 12 22:11:44.454540 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 12 22:11:44.454577 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 22:11:44.457005 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 12 22:11:44.458606 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 12 22:11:44.458661 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 22:11:44.460398 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 12 22:11:44.460441 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 12 22:11:44.462883 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 12 22:11:44.462923 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 12 22:11:44.464546 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 22:11:44.468027 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 12 22:11:44.479926 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 12 22:11:44.480035 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 12 22:11:44.483397 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 12 22:11:44.483518 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 12 22:11:44.485304 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 12 22:11:44.485425 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 22:11:44.487501 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 12 22:11:44.487570 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 12 22:11:44.488728 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 12 22:11:44.488764 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 22:11:44.490589 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 12 22:11:44.490638 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 22:11:44.493034 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 12 22:11:44.493082 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 12 22:11:44.496334 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 12 22:11:44.496395 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 22:11:44.499260 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 12 22:11:44.499313 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 12 22:11:44.503730 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 12 22:11:44.506364 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 12 22:11:44.506424 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 22:11:44.508940 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 12 22:11:44.508982 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 22:11:44.511705 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 22:11:44.511748 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 22:11:44.519992 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 12 22:11:44.520114 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 12 22:11:44.522596 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 12 22:11:44.524594 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 12 22:11:44.533119 systemd[1]: Switching root.
Sep 12 22:11:44.569220 systemd-journald[245]: Journal stopped
Sep 12 22:11:45.309563 systemd-journald[245]: Received SIGTERM from PID 1 (systemd).
Sep 12 22:11:45.309616 kernel: SELinux: policy capability network_peer_controls=1
Sep 12 22:11:45.309631 kernel: SELinux: policy capability open_perms=1
Sep 12 22:11:45.309641 kernel: SELinux: policy capability extended_socket_class=1
Sep 12 22:11:45.309653 kernel: SELinux: policy capability always_check_network=0
Sep 12 22:11:45.309663 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 12 22:11:45.309684 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 12 22:11:45.309696 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 12 22:11:45.309709 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 12 22:11:45.309719 kernel: SELinux: policy capability userspace_initial_context=0
Sep 12 22:11:45.309730 kernel: audit: type=1403 audit(1757715104.744:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 12 22:11:45.309745 systemd[1]: Successfully loaded SELinux policy in 57.728ms.
Sep 12 22:11:45.309761 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.665ms.
Sep 12 22:11:45.309772 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 22:11:45.309783 systemd[1]: Detected virtualization kvm.
Sep 12 22:11:45.309794 systemd[1]: Detected architecture arm64.
Sep 12 22:11:45.309806 systemd[1]: Detected first boot.
Sep 12 22:11:45.309816 systemd[1]: Initializing machine ID from VM UUID.
Sep 12 22:11:45.309826 kernel: NET: Registered PF_VSOCK protocol family
Sep 12 22:11:45.309836 zram_generator::config[1084]: No configuration found.
Sep 12 22:11:45.309846 systemd[1]: Populated /etc with preset unit settings.
Sep 12 22:11:45.309857 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 12 22:11:45.309867 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 12 22:11:45.309876 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 12 22:11:45.309886 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 12 22:11:45.309897 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 12 22:11:45.309908 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 12 22:11:45.309918 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 12 22:11:45.309927 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 12 22:11:45.309937 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 12 22:11:45.309947 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 12 22:11:45.309961 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 12 22:11:45.309971 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 12 22:11:45.309982 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 22:11:45.309993 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 22:11:45.310003 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 12 22:11:45.310013 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 12 22:11:45.310024 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 12 22:11:45.310034 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 22:11:45.310044 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 12 22:11:45.310054 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 22:11:45.310066 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 22:11:45.310077 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 12 22:11:45.310087 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 12 22:11:45.310097 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 12 22:11:45.310107 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 12 22:11:45.310116 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 22:11:45.310126 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 22:11:45.310137 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 22:11:45.310146 systemd[1]: Reached target swap.target - Swaps.
Sep 12 22:11:45.310157 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 12 22:11:45.310168 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 12 22:11:45.310178 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 12 22:11:45.311354 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 22:11:45.311380 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 22:11:45.311392 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 22:11:45.311403 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 12 22:11:45.311414 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 12 22:11:45.311689 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 12 22:11:45.311708 systemd[1]: Mounting media.mount - External Media Directory...
Sep 12 22:11:45.311719 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 12 22:11:45.314244 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 12 22:11:45.314263 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 12 22:11:45.314275 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 12 22:11:45.314286 systemd[1]: Reached target machines.target - Containers.
Sep 12 22:11:45.314298 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 12 22:11:45.314309 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 22:11:45.314326 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 22:11:45.314339 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 12 22:11:45.314350 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 22:11:45.314362 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 22:11:45.314372 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 22:11:45.314383 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 12 22:11:45.314393 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 22:11:45.314403 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 12 22:11:45.314413 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 12 22:11:45.314426 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 12 22:11:45.314436 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 12 22:11:45.314446 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 12 22:11:45.314457 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 22:11:45.314467 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 22:11:45.314484 kernel: loop: module loaded
Sep 12 22:11:45.314495 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 22:11:45.314505 kernel: fuse: init (API version 7.41)
Sep 12 22:11:45.314516 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 22:11:45.314528 kernel: ACPI: bus type drm_connector registered
Sep 12 22:11:45.314539 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 12 22:11:45.314552 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 12 22:11:45.314563 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 22:11:45.314574 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 12 22:11:45.314586 systemd[1]: Stopped verity-setup.service.
Sep 12 22:11:45.314597 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 12 22:11:45.314607 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 12 22:11:45.314652 systemd-journald[1148]: Collecting audit messages is disabled.
Sep 12 22:11:45.314700 systemd[1]: Mounted media.mount - External Media Directory.
Sep 12 22:11:45.314713 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 12 22:11:45.314724 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 12 22:11:45.314739 systemd-journald[1148]: Journal started
Sep 12 22:11:45.314761 systemd-journald[1148]: Runtime Journal (/run/log/journal/f8795c7bd0ee493aa7a60a2efb74b167) is 6M, max 48.5M, 42.4M free.
Sep 12 22:11:45.111795 systemd[1]: Queued start job for default target multi-user.target.
Sep 12 22:11:45.134395 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 12 22:11:45.134790 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 12 22:11:45.317302 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 22:11:45.317862 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 12 22:11:45.320209 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 12 22:11:45.321358 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 22:11:45.322694 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 12 22:11:45.322896 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 12 22:11:45.324118 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 22:11:45.324326 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 22:11:45.325387 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 22:11:45.325536 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 22:11:45.326612 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 22:11:45.326785 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 22:11:45.328029 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 12 22:11:45.328208 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 12 22:11:45.329303 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 22:11:45.329465 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 22:11:45.330597 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 22:11:45.332003 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 22:11:45.333393 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 12 22:11:45.334587 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 12 22:11:45.346633 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 22:11:45.348734 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 12 22:11:45.350624 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 12 22:11:45.351690 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 12 22:11:45.351740 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 22:11:45.353507 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 12 22:11:45.357419 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 12 22:11:45.358390 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 22:11:45.359531 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 12 22:11:45.361433 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 12 22:11:45.362396 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 22:11:45.363477 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 12 22:11:45.364537 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 22:11:45.365677 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 22:11:45.370512 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 12 22:11:45.374221 systemd-journald[1148]: Time spent on flushing to /var/log/journal/f8795c7bd0ee493aa7a60a2efb74b167 is 15.523ms for 885 entries.
Sep 12 22:11:45.374221 systemd-journald[1148]: System Journal (/var/log/journal/f8795c7bd0ee493aa7a60a2efb74b167) is 8M, max 195.6M, 187.6M free.
Sep 12 22:11:45.394910 systemd-journald[1148]: Received client request to flush runtime journal.
Sep 12 22:11:45.394945 kernel: loop0: detected capacity change from 0 to 100632
Sep 12 22:11:45.374422 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 12 22:11:45.383568 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 22:11:45.388231 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 12 22:11:45.393330 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 12 22:11:45.397233 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 12 22:11:45.398914 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 12 22:11:45.400418 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 22:11:45.403897 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 12 22:11:45.408392 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 12 22:11:45.411209 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 12 22:11:45.421689 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 12 22:11:45.424069 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 22:11:45.430205 kernel: loop1: detected capacity change from 0 to 211168
Sep 12 22:11:45.443228 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 12 22:11:45.454203 kernel: loop2: detected capacity change from 0 to 119368
Sep 12 22:11:45.457329 systemd-tmpfiles[1216]: ACLs are not supported, ignoring.
Sep 12 22:11:45.457349 systemd-tmpfiles[1216]: ACLs are not supported, ignoring.
Sep 12 22:11:45.460868 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 22:11:45.482204 kernel: loop3: detected capacity change from 0 to 100632
Sep 12 22:11:45.487367 kernel: loop4: detected capacity change from 0 to 211168
Sep 12 22:11:45.493232 kernel: loop5: detected capacity change from 0 to 119368
Sep 12 22:11:45.496803 (sd-merge)[1222]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Sep 12 22:11:45.497169 (sd-merge)[1222]: Merged extensions into '/usr'.
Sep 12 22:11:45.501407 systemd[1]: Reload requested from client PID 1200 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 12 22:11:45.502232 systemd[1]: Reloading...
Sep 12 22:11:45.556239 zram_generator::config[1244]: No configuration found.
Sep 12 22:11:45.643945 ldconfig[1194]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 12 22:11:45.707574 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 12 22:11:45.707913 systemd[1]: Reloading finished in 205 ms.
Sep 12 22:11:45.729799 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 12 22:11:45.732342 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 12 22:11:45.749344 systemd[1]: Starting ensure-sysext.service...
Sep 12 22:11:45.750992 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 22:11:45.760031 systemd[1]: Reload requested from client PID 1284 ('systemctl') (unit ensure-sysext.service)...
Sep 12 22:11:45.760045 systemd[1]: Reloading...
Sep 12 22:11:45.764354 systemd-tmpfiles[1285]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 12 22:11:45.764385 systemd-tmpfiles[1285]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 12 22:11:45.764617 systemd-tmpfiles[1285]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 12 22:11:45.764828 systemd-tmpfiles[1285]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 12 22:11:45.765456 systemd-tmpfiles[1285]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 12 22:11:45.765664 systemd-tmpfiles[1285]: ACLs are not supported, ignoring.
Sep 12 22:11:45.765726 systemd-tmpfiles[1285]: ACLs are not supported, ignoring.
Sep 12 22:11:45.768577 systemd-tmpfiles[1285]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 22:11:45.768592 systemd-tmpfiles[1285]: Skipping /boot
Sep 12 22:11:45.774483 systemd-tmpfiles[1285]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 22:11:45.774495 systemd-tmpfiles[1285]: Skipping /boot
Sep 12 22:11:45.808219 zram_generator::config[1315]: No configuration found.
Sep 12 22:11:45.936776 systemd[1]: Reloading finished in 176 ms.
Sep 12 22:11:45.958641 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 12 22:11:45.980465 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 22:11:45.987815 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 22:11:45.990428 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 12 22:11:46.004816 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 12 22:11:46.010961 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 22:11:46.016473 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 22:11:46.018657 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 12 22:11:46.027150 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 22:11:46.030020 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 22:11:46.032337 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 22:11:46.034160 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 22:11:46.035716 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 22:11:46.035847 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 22:11:46.038857 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 12 22:11:46.041148 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 12 22:11:46.042786 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 22:11:46.042939 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 22:11:46.044437 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 22:11:46.044594 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 22:11:46.052030 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 22:11:46.055579 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 22:11:46.057765 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 22:11:46.058813 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 22:11:46.058983 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 22:11:46.059600 systemd-udevd[1353]: Using default interface naming scheme 'v255'.
Sep 12 22:11:46.061324 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 12 22:11:46.063377 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 22:11:46.063570 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 22:11:46.063998 augenrules[1383]: No rules
Sep 12 22:11:46.065274 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 22:11:46.065459 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 22:11:46.068707 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 12 22:11:46.072623 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 12 22:11:46.076558 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 22:11:46.078088 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 22:11:46.084341 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 12 22:11:46.087802 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 22:11:46.089340 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 22:11:46.089553 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 22:11:46.092557 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 12 22:11:46.101101 systemd[1]: Finished ensure-sysext.service.
Sep 12 22:11:46.106731 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 22:11:46.107636 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 22:11:46.115524 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 22:11:46.124353 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 22:11:46.132491 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 22:11:46.137515 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 22:11:46.139469 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 22:11:46.139521 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 22:11:46.141703 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 22:11:46.146306 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 12 22:11:46.147764 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 12 22:11:46.164595 augenrules[1427]: /sbin/augenrules: No change
Sep 12 22:11:46.166554 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 22:11:46.166761 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 22:11:46.168307 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 22:11:46.168474 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 22:11:46.169854 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 22:11:46.169995 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 22:11:46.171494 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 22:11:46.171887 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 22:11:46.181605 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 12 22:11:46.182058 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 22:11:46.182115 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 22:11:46.182211 augenrules[1456]: No rules
Sep 12 22:11:46.184631 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 22:11:46.186289 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 22:11:46.233880 systemd-resolved[1351]: Positive Trust Anchors:
Sep 12 22:11:46.234263 systemd-resolved[1351]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 22:11:46.234349 systemd-resolved[1351]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 22:11:46.241754 systemd-resolved[1351]: Defaulting to hostname 'linux'.
Sep 12 22:11:46.243501 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 22:11:46.244174 systemd-networkd[1433]: lo: Link UP
Sep 12 22:11:46.244215 systemd-networkd[1433]: lo: Gained carrier
Sep 12 22:11:46.244787 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 22:11:46.244880 systemd-networkd[1433]: Enumeration completed
Sep 12 22:11:46.245917 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 22:11:46.247247 systemd[1]: Reached target network.target - Network.
Sep 12 22:11:46.249516 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 12 22:11:46.251812 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 12 22:11:46.252882 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 12 22:11:46.256038 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 12 22:11:46.257778 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 22:11:46.259019 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 22:11:46.259641 systemd-networkd[1433]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:11:46.259656 systemd-networkd[1433]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 22:11:46.260346 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 22:11:46.260696 systemd-networkd[1433]: eth0: Link UP Sep 12 22:11:46.260832 systemd-networkd[1433]: eth0: Gained carrier Sep 12 22:11:46.260848 systemd-networkd[1433]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:11:46.261884 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 22:11:46.263091 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 22:11:46.263123 systemd[1]: Reached target paths.target - Path Units. Sep 12 22:11:46.264165 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 22:11:46.265407 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 22:11:46.266601 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 22:11:46.267855 systemd[1]: Reached target timers.target - Timer Units. Sep 12 22:11:46.269774 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 22:11:46.272019 systemd[1]: Starting docker.socket - Docker Socket for the API... 
Sep 12 22:11:46.274481 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 12 22:11:46.275788 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 12 22:11:46.276801 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 12 22:11:46.279263 systemd-networkd[1433]: eth0: DHCPv4 address 10.0.0.68/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 12 22:11:46.279832 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 22:11:46.280296 systemd-timesyncd[1434]: Network configuration changed, trying to establish connection. Sep 12 22:11:46.281247 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 22:11:46.281326 systemd-timesyncd[1434]: Contacted time server 10.0.0.1:123 (10.0.0.1). Sep 12 22:11:46.281382 systemd-timesyncd[1434]: Initial clock synchronization to Fri 2025-09-12 22:11:46.339972 UTC. Sep 12 22:11:46.283623 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 22:11:46.287290 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 22:11:46.288414 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 22:11:46.292344 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 22:11:46.293096 systemd[1]: Reached target basic.target - Basic System. Sep 12 22:11:46.293971 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 22:11:46.293999 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 22:11:46.295290 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 22:11:46.296976 systemd[1]: Starting dbus.service - D-Bus System Message Bus... 
Sep 12 22:11:46.300375 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 22:11:46.303626 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 22:11:46.305820 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 22:11:46.306588 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 22:11:46.308415 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 22:11:46.312295 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 22:11:46.316496 jq[1481]: false Sep 12 22:11:46.316319 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 22:11:46.322326 extend-filesystems[1482]: Found /dev/vda6 Sep 12 22:11:46.322266 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 22:11:46.325891 extend-filesystems[1482]: Found /dev/vda9 Sep 12 22:11:46.328255 extend-filesystems[1482]: Checking size of /dev/vda9 Sep 12 22:11:46.327564 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 22:11:46.329853 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 22:11:46.330275 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 22:11:46.331859 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 22:11:46.334356 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 22:11:46.336187 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. 
Sep 12 22:11:46.338153 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 22:11:46.339392 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 22:11:46.339570 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 22:11:46.340963 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 22:11:46.341133 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 22:11:46.342888 jq[1506]: true Sep 12 22:11:46.344137 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 22:11:46.344338 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 22:11:46.354336 extend-filesystems[1482]: Resized partition /dev/vda9 Sep 12 22:11:46.359807 extend-filesystems[1528]: resize2fs 1.47.3 (8-Jul-2025) Sep 12 22:11:46.364243 tar[1510]: linux-arm64/LICENSE Sep 12 22:11:46.364243 tar[1510]: linux-arm64/helm Sep 12 22:11:46.364470 jq[1512]: true Sep 12 22:11:46.365425 (ntainerd)[1529]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 22:11:46.372203 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 12 22:11:46.377237 update_engine[1502]: I20250912 22:11:46.376535 1502 main.cc:92] Flatcar Update Engine starting Sep 12 22:11:46.396335 dbus-daemon[1478]: [system] SELinux support is enabled Sep 12 22:11:46.396584 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 22:11:46.401390 update_engine[1502]: I20250912 22:11:46.401335 1502 update_check_scheduler.cc:74] Next update check in 2m25s Sep 12 22:11:46.401959 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). 
Sep 12 22:11:46.401984 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 22:11:46.404094 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 22:11:46.404119 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 22:11:46.410373 systemd[1]: Started update-engine.service - Update Engine. Sep 12 22:11:46.416333 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:11:46.420252 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 22:11:46.434272 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 12 22:11:46.454333 extend-filesystems[1528]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 12 22:11:46.454333 extend-filesystems[1528]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 12 22:11:46.454333 extend-filesystems[1528]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 12 22:11:46.464406 extend-filesystems[1482]: Resized filesystem in /dev/vda9 Sep 12 22:11:46.456935 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 22:11:46.465877 bash[1552]: Updated "/home/core/.ssh/authorized_keys" Sep 12 22:11:46.458632 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 22:11:46.460146 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 22:11:46.465583 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 12 22:11:46.540216 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 12 22:11:46.566133 containerd[1529]: time="2025-09-12T22:11:46Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 22:11:46.569208 containerd[1529]: time="2025-09-12T22:11:46.569071080Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 22:11:46.584653 locksmithd[1553]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 22:11:46.587685 systemd-logind[1501]: Watching system buttons on /dev/input/event0 (Power Button) Sep 12 22:11:46.587902 systemd-logind[1501]: New seat seat0. Sep 12 22:11:46.589214 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 22:11:46.598178 containerd[1529]: time="2025-09-12T22:11:46.598112440Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.52µs" Sep 12 22:11:46.598178 containerd[1529]: time="2025-09-12T22:11:46.598165640Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 22:11:46.598308 containerd[1529]: time="2025-09-12T22:11:46.598209120Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 22:11:46.598420 containerd[1529]: time="2025-09-12T22:11:46.598393440Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 22:11:46.598452 containerd[1529]: time="2025-09-12T22:11:46.598421760Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 22:11:46.598472 containerd[1529]: time="2025-09-12T22:11:46.598462840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 22:11:46.598548 containerd[1529]: 
time="2025-09-12T22:11:46.598529120Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 22:11:46.598548 containerd[1529]: time="2025-09-12T22:11:46.598547080Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 22:11:46.598858 containerd[1529]: time="2025-09-12T22:11:46.598822920Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 22:11:46.598858 containerd[1529]: time="2025-09-12T22:11:46.598855720Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 22:11:46.598905 containerd[1529]: time="2025-09-12T22:11:46.598868000Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 22:11:46.598905 containerd[1529]: time="2025-09-12T22:11:46.598876320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 22:11:46.598993 containerd[1529]: time="2025-09-12T22:11:46.598969000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 22:11:46.599260 containerd[1529]: time="2025-09-12T22:11:46.599236040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 22:11:46.599296 containerd[1529]: time="2025-09-12T22:11:46.599284160Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 
22:11:46.599317 containerd[1529]: time="2025-09-12T22:11:46.599296040Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 22:11:46.599348 containerd[1529]: time="2025-09-12T22:11:46.599331280Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 22:11:46.599624 containerd[1529]: time="2025-09-12T22:11:46.599600920Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 22:11:46.599714 containerd[1529]: time="2025-09-12T22:11:46.599693680Z" level=info msg="metadata content store policy set" policy=shared Sep 12 22:11:46.603649 containerd[1529]: time="2025-09-12T22:11:46.603605200Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 22:11:46.603759 containerd[1529]: time="2025-09-12T22:11:46.603696200Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 22:11:46.603759 containerd[1529]: time="2025-09-12T22:11:46.603713480Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 22:11:46.603759 containerd[1529]: time="2025-09-12T22:11:46.603728880Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 22:11:46.603817 containerd[1529]: time="2025-09-12T22:11:46.603774280Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 22:11:46.603817 containerd[1529]: time="2025-09-12T22:11:46.603790280Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 12 22:11:46.603817 containerd[1529]: time="2025-09-12T22:11:46.603803400Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 22:11:46.603869 containerd[1529]: 
time="2025-09-12T22:11:46.603828120Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 22:11:46.603869 containerd[1529]: time="2025-09-12T22:11:46.603840960Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 22:11:46.603869 containerd[1529]: time="2025-09-12T22:11:46.603851840Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 22:11:46.603869 containerd[1529]: time="2025-09-12T22:11:46.603861400Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 22:11:46.603930 containerd[1529]: time="2025-09-12T22:11:46.603878080Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 22:11:46.604201 containerd[1529]: time="2025-09-12T22:11:46.604010640Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 22:11:46.604201 containerd[1529]: time="2025-09-12T22:11:46.604040120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 22:11:46.604201 containerd[1529]: time="2025-09-12T22:11:46.604057000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 22:11:46.604201 containerd[1529]: time="2025-09-12T22:11:46.604067880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 22:11:46.604201 containerd[1529]: time="2025-09-12T22:11:46.604077280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 22:11:46.604201 containerd[1529]: time="2025-09-12T22:11:46.604087240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 22:11:46.604201 containerd[1529]: time="2025-09-12T22:11:46.604098640Z" level=info 
msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 22:11:46.604201 containerd[1529]: time="2025-09-12T22:11:46.604109160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 22:11:46.604201 containerd[1529]: time="2025-09-12T22:11:46.604120520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 22:11:46.604201 containerd[1529]: time="2025-09-12T22:11:46.604132480Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 22:11:46.604201 containerd[1529]: time="2025-09-12T22:11:46.604142840Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 22:11:46.604405 containerd[1529]: time="2025-09-12T22:11:46.604354920Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 22:11:46.604405 containerd[1529]: time="2025-09-12T22:11:46.604377000Z" level=info msg="Start snapshots syncer" Sep 12 22:11:46.604442 containerd[1529]: time="2025-09-12T22:11:46.604406640Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 22:11:46.604703 containerd[1529]: time="2025-09-12T22:11:46.604650200Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 22:11:46.604913 containerd[1529]: time="2025-09-12T22:11:46.604717880Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 22:11:46.604913 containerd[1529]: time="2025-09-12T22:11:46.604814560Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.604962680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.605002880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.605015560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.605026880Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.605038760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.605048640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.605060320Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.605086680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.605104440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.605114360Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.605153160Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.605167040Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 22:11:46.605203 containerd[1529]: time="2025-09-12T22:11:46.605175800Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 22:11:46.605433 containerd[1529]: time="2025-09-12T22:11:46.605209440Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 22:11:46.605433 containerd[1529]: time="2025-09-12T22:11:46.605217880Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 22:11:46.605433 containerd[1529]: time="2025-09-12T22:11:46.605229000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 22:11:46.605433 containerd[1529]: time="2025-09-12T22:11:46.605254160Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 22:11:46.605433 containerd[1529]: time="2025-09-12T22:11:46.605332200Z" level=info msg="runtime interface created" Sep 12 22:11:46.605433 containerd[1529]: time="2025-09-12T22:11:46.605337280Z" level=info msg="created NRI interface" Sep 12 22:11:46.605433 containerd[1529]: time="2025-09-12T22:11:46.605345440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 22:11:46.605433 containerd[1529]: time="2025-09-12T22:11:46.605358640Z" level=info msg="Connect containerd service" Sep 12 22:11:46.605433 containerd[1529]: time="2025-09-12T22:11:46.605387640Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 22:11:46.606283 
containerd[1529]: time="2025-09-12T22:11:46.606250400Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 22:11:46.670834 containerd[1529]: time="2025-09-12T22:11:46.670476880Z" level=info msg="Start subscribing containerd event" Sep 12 22:11:46.670834 containerd[1529]: time="2025-09-12T22:11:46.670558480Z" level=info msg="Start recovering state" Sep 12 22:11:46.670834 containerd[1529]: time="2025-09-12T22:11:46.670655120Z" level=info msg="Start event monitor" Sep 12 22:11:46.670834 containerd[1529]: time="2025-09-12T22:11:46.670678680Z" level=info msg="Start cni network conf syncer for default" Sep 12 22:11:46.670834 containerd[1529]: time="2025-09-12T22:11:46.670687520Z" level=info msg="Start streaming server" Sep 12 22:11:46.670834 containerd[1529]: time="2025-09-12T22:11:46.670696360Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 22:11:46.670834 containerd[1529]: time="2025-09-12T22:11:46.670703000Z" level=info msg="runtime interface starting up..." Sep 12 22:11:46.670834 containerd[1529]: time="2025-09-12T22:11:46.670708120Z" level=info msg="starting plugins..." Sep 12 22:11:46.670834 containerd[1529]: time="2025-09-12T22:11:46.670722400Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 22:11:46.670834 containerd[1529]: time="2025-09-12T22:11:46.670806280Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 22:11:46.671087 containerd[1529]: time="2025-09-12T22:11:46.670857120Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 22:11:46.671087 containerd[1529]: time="2025-09-12T22:11:46.670912720Z" level=info msg="containerd successfully booted in 0.105466s" Sep 12 22:11:46.671050 systemd[1]: Started containerd.service - containerd container runtime. 
Sep 12 22:11:46.758565 tar[1510]: linux-arm64/README.md Sep 12 22:11:46.785235 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 22:11:47.635333 systemd-networkd[1433]: eth0: Gained IPv6LL Sep 12 22:11:47.637796 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 22:11:47.639480 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 22:11:47.641986 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 12 22:11:47.644510 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:11:47.647610 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 22:11:47.667645 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 12 22:11:47.667862 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 12 22:11:47.669248 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 22:11:47.670826 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 22:11:47.686301 sshd_keygen[1530]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 22:11:47.705262 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 22:11:47.707785 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 22:11:47.727119 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 22:11:47.728346 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 22:11:47.732444 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 22:11:47.752077 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 22:11:47.754917 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 22:11:47.758450 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. 
Sep 12 22:11:47.759726 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 22:11:48.249674 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:11:48.251489 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 22:11:48.254701 (kubelet)[1631]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 22:11:48.256286 systemd[1]: Startup finished in 1.992s (kernel) + 5.137s (initrd) + 3.570s (userspace) = 10.700s. Sep 12 22:11:48.631048 kubelet[1631]: E0912 22:11:48.630924 1631 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 22:11:48.633578 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 22:11:48.633708 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 22:11:48.634087 systemd[1]: kubelet.service: Consumed 759ms CPU time, 257.8M memory peak. Sep 12 22:11:52.933526 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 22:11:52.934548 systemd[1]: Started sshd@0-10.0.0.68:22-10.0.0.1:50344.service - OpenSSH per-connection server daemon (10.0.0.1:50344). Sep 12 22:11:53.018274 sshd[1644]: Accepted publickey for core from 10.0.0.1 port 50344 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:11:53.019930 sshd-session[1644]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:11:53.030292 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 22:11:53.031366 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... 
Sep 12 22:11:53.033165 systemd-logind[1501]: New session 1 of user core. Sep 12 22:11:53.058919 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 22:11:53.062205 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 22:11:53.078432 (systemd)[1649]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 22:11:53.082291 systemd-logind[1501]: New session c1 of user core. Sep 12 22:11:53.215929 systemd[1649]: Queued start job for default target default.target. Sep 12 22:11:53.226129 systemd[1649]: Created slice app.slice - User Application Slice. Sep 12 22:11:53.226161 systemd[1649]: Reached target paths.target - Paths. Sep 12 22:11:53.226219 systemd[1649]: Reached target timers.target - Timers. Sep 12 22:11:53.227387 systemd[1649]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 22:11:53.237949 systemd[1649]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 22:11:53.238057 systemd[1649]: Reached target sockets.target - Sockets. Sep 12 22:11:53.238104 systemd[1649]: Reached target basic.target - Basic System. Sep 12 22:11:53.238139 systemd[1649]: Reached target default.target - Main User Target. Sep 12 22:11:53.238166 systemd[1649]: Startup finished in 149ms. Sep 12 22:11:53.238301 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 22:11:53.240108 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 22:11:53.310582 systemd[1]: Started sshd@1-10.0.0.68:22-10.0.0.1:50350.service - OpenSSH per-connection server daemon (10.0.0.1:50350). Sep 12 22:11:53.378505 sshd[1660]: Accepted publickey for core from 10.0.0.1 port 50350 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:11:53.379649 sshd-session[1660]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:11:53.383366 systemd-logind[1501]: New session 2 of user core. 
Sep 12 22:11:53.398368 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 12 22:11:53.449093 sshd[1663]: Connection closed by 10.0.0.1 port 50350
Sep 12 22:11:53.449584 sshd-session[1660]: pam_unix(sshd:session): session closed for user core
Sep 12 22:11:53.459054 systemd[1]: sshd@1-10.0.0.68:22-10.0.0.1:50350.service: Deactivated successfully.
Sep 12 22:11:53.460522 systemd[1]: session-2.scope: Deactivated successfully.
Sep 12 22:11:53.461136 systemd-logind[1501]: Session 2 logged out. Waiting for processes to exit.
Sep 12 22:11:53.462935 systemd[1]: Started sshd@2-10.0.0.68:22-10.0.0.1:50362.service - OpenSSH per-connection server daemon (10.0.0.1:50362).
Sep 12 22:11:53.463827 systemd-logind[1501]: Removed session 2.
Sep 12 22:11:53.523262 sshd[1669]: Accepted publickey for core from 10.0.0.1 port 50362 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI
Sep 12 22:11:53.524642 sshd-session[1669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:11:53.528213 systemd-logind[1501]: New session 3 of user core.
Sep 12 22:11:53.537357 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 12 22:11:53.587873 sshd[1672]: Connection closed by 10.0.0.1 port 50362
Sep 12 22:11:53.588206 sshd-session[1669]: pam_unix(sshd:session): session closed for user core
Sep 12 22:11:53.597052 systemd[1]: sshd@2-10.0.0.68:22-10.0.0.1:50362.service: Deactivated successfully.
Sep 12 22:11:53.598478 systemd[1]: session-3.scope: Deactivated successfully.
Sep 12 22:11:53.599218 systemd-logind[1501]: Session 3 logged out. Waiting for processes to exit.
Sep 12 22:11:53.601007 systemd[1]: Started sshd@3-10.0.0.68:22-10.0.0.1:50376.service - OpenSSH per-connection server daemon (10.0.0.1:50376).
Sep 12 22:11:53.603948 systemd-logind[1501]: Removed session 3.
Sep 12 22:11:53.664639 sshd[1678]: Accepted publickey for core from 10.0.0.1 port 50376 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI
Sep 12 22:11:53.665881 sshd-session[1678]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:11:53.670242 systemd-logind[1501]: New session 4 of user core.
Sep 12 22:11:53.680361 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 12 22:11:53.733229 sshd[1681]: Connection closed by 10.0.0.1 port 50376
Sep 12 22:11:53.733086 sshd-session[1678]: pam_unix(sshd:session): session closed for user core
Sep 12 22:11:53.753163 systemd[1]: sshd@3-10.0.0.68:22-10.0.0.1:50376.service: Deactivated successfully.
Sep 12 22:11:53.754617 systemd[1]: session-4.scope: Deactivated successfully.
Sep 12 22:11:53.755318 systemd-logind[1501]: Session 4 logged out. Waiting for processes to exit.
Sep 12 22:11:53.757175 systemd[1]: Started sshd@4-10.0.0.68:22-10.0.0.1:50388.service - OpenSSH per-connection server daemon (10.0.0.1:50388).
Sep 12 22:11:53.758419 systemd-logind[1501]: Removed session 4.
Sep 12 22:11:53.816813 sshd[1687]: Accepted publickey for core from 10.0.0.1 port 50388 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI
Sep 12 22:11:53.818038 sshd-session[1687]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:11:53.824464 systemd-logind[1501]: New session 5 of user core.
Sep 12 22:11:53.839357 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 12 22:11:53.900392 sudo[1691]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 12 22:11:53.900663 sudo[1691]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 22:11:53.914125 sudo[1691]: pam_unix(sudo:session): session closed for user root
Sep 12 22:11:53.915655 sshd[1690]: Connection closed by 10.0.0.1 port 50388
Sep 12 22:11:53.916178 sshd-session[1687]: pam_unix(sshd:session): session closed for user core
Sep 12 22:11:53.926359 systemd[1]: sshd@4-10.0.0.68:22-10.0.0.1:50388.service: Deactivated successfully.
Sep 12 22:11:53.929617 systemd[1]: session-5.scope: Deactivated successfully.
Sep 12 22:11:53.930366 systemd-logind[1501]: Session 5 logged out. Waiting for processes to exit.
Sep 12 22:11:53.932817 systemd[1]: Started sshd@5-10.0.0.68:22-10.0.0.1:50396.service - OpenSSH per-connection server daemon (10.0.0.1:50396).
Sep 12 22:11:53.933421 systemd-logind[1501]: Removed session 5.
Sep 12 22:11:53.984979 sshd[1697]: Accepted publickey for core from 10.0.0.1 port 50396 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI
Sep 12 22:11:53.986391 sshd-session[1697]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:11:53.990115 systemd-logind[1501]: New session 6 of user core.
Sep 12 22:11:53.998361 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 12 22:11:54.049797 sudo[1702]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 12 22:11:54.050056 sudo[1702]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 22:11:54.135220 sudo[1702]: pam_unix(sudo:session): session closed for user root
Sep 12 22:11:54.140381 sudo[1701]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 12 22:11:54.140654 sudo[1701]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 22:11:54.149802 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 22:11:54.198553 augenrules[1724]: No rules
Sep 12 22:11:54.199678 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 22:11:54.199894 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 22:11:54.200866 sudo[1701]: pam_unix(sudo:session): session closed for user root
Sep 12 22:11:54.203521 sshd[1700]: Connection closed by 10.0.0.1 port 50396
Sep 12 22:11:54.202644 sshd-session[1697]: pam_unix(sshd:session): session closed for user core
Sep 12 22:11:54.221356 systemd[1]: sshd@5-10.0.0.68:22-10.0.0.1:50396.service: Deactivated successfully.
Sep 12 22:11:54.223668 systemd[1]: session-6.scope: Deactivated successfully.
Sep 12 22:11:54.224582 systemd-logind[1501]: Session 6 logged out. Waiting for processes to exit.
Sep 12 22:11:54.229492 systemd[1]: Started sshd@6-10.0.0.68:22-10.0.0.1:50398.service - OpenSSH per-connection server daemon (10.0.0.1:50398).
Sep 12 22:11:54.229937 systemd-logind[1501]: Removed session 6.
Sep 12 22:11:54.286721 sshd[1733]: Accepted publickey for core from 10.0.0.1 port 50398 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI
Sep 12 22:11:54.287968 sshd-session[1733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:11:54.292569 systemd-logind[1501]: New session 7 of user core.
Sep 12 22:11:54.309378 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 12 22:11:54.360404 sudo[1737]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 12 22:11:54.360670 sudo[1737]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 22:11:54.624873 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 12 22:11:54.636547 (dockerd)[1757]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 12 22:11:54.831751 dockerd[1757]: time="2025-09-12T22:11:54.831682318Z" level=info msg="Starting up"
Sep 12 22:11:54.832522 dockerd[1757]: time="2025-09-12T22:11:54.832502154Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 12 22:11:54.842800 dockerd[1757]: time="2025-09-12T22:11:54.842762081Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 12 22:11:54.877037 dockerd[1757]: time="2025-09-12T22:11:54.876604581Z" level=info msg="Loading containers: start."
Sep 12 22:11:54.884225 kernel: Initializing XFRM netlink socket
Sep 12 22:11:55.074146 systemd-networkd[1433]: docker0: Link UP
Sep 12 22:11:55.077181 dockerd[1757]: time="2025-09-12T22:11:55.077111292Z" level=info msg="Loading containers: done."
Sep 12 22:11:55.089005 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck857879221-merged.mount: Deactivated successfully.
Sep 12 22:11:55.095060 dockerd[1757]: time="2025-09-12T22:11:55.095001096Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 12 22:11:55.095176 dockerd[1757]: time="2025-09-12T22:11:55.095091603Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 12 22:11:55.095219 dockerd[1757]: time="2025-09-12T22:11:55.095180186Z" level=info msg="Initializing buildkit"
Sep 12 22:11:55.123022 dockerd[1757]: time="2025-09-12T22:11:55.122973542Z" level=info msg="Completed buildkit initialization"
Sep 12 22:11:55.131118 dockerd[1757]: time="2025-09-12T22:11:55.130582082Z" level=info msg="Daemon has completed initialization"
Sep 12 22:11:55.131118 dockerd[1757]: time="2025-09-12T22:11:55.130719027Z" level=info msg="API listen on /run/docker.sock"
Sep 12 22:11:55.130839 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 12 22:11:55.738668 containerd[1529]: time="2025-09-12T22:11:55.738318570Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\""
Sep 12 22:11:56.301576 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount949880113.mount: Deactivated successfully.
Sep 12 22:11:57.340276 containerd[1529]: time="2025-09-12T22:11:57.340209922Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:11:57.342018 containerd[1529]: time="2025-09-12T22:11:57.341986666Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.5: active requests=0, bytes read=27390230"
Sep 12 22:11:57.343982 containerd[1529]: time="2025-09-12T22:11:57.343946643Z" level=info msg="ImageCreate event name:\"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:11:57.349263 containerd[1529]: time="2025-09-12T22:11:57.349221769Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:11:57.350339 containerd[1529]: time="2025-09-12T22:11:57.350313112Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.5\" with image id \"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\", size \"27386827\" in 1.611953284s"
Sep 12 22:11:57.350397 containerd[1529]: time="2025-09-12T22:11:57.350342609Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\" returns image reference \"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\""
Sep 12 22:11:57.351617 containerd[1529]: time="2025-09-12T22:11:57.351593300Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\""
Sep 12 22:11:58.572162 containerd[1529]: time="2025-09-12T22:11:58.571640487Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:11:58.572162 containerd[1529]: time="2025-09-12T22:11:58.572131114Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.5: active requests=0, bytes read=23547919"
Sep 12 22:11:58.573413 containerd[1529]: time="2025-09-12T22:11:58.573375813Z" level=info msg="ImageCreate event name:\"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:11:58.576376 containerd[1529]: time="2025-09-12T22:11:58.576337328Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:11:58.578002 containerd[1529]: time="2025-09-12T22:11:58.577851723Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.5\" with image id \"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\", size \"25135832\" in 1.226227365s"
Sep 12 22:11:58.578002 containerd[1529]: time="2025-09-12T22:11:58.577884017Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\" returns image reference \"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\""
Sep 12 22:11:58.578375 containerd[1529]: time="2025-09-12T22:11:58.578348721Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\""
Sep 12 22:11:58.718882 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 12 22:11:58.720628 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:11:58.853427 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:11:58.856838 (kubelet)[2045]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:11:58.955359 kubelet[2045]: E0912 22:11:58.955286 2045 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:11:58.958426 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:11:58.958557 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:11:58.958839 systemd[1]: kubelet.service: Consumed 142ms CPU time, 106.4M memory peak.
Sep 12 22:11:59.744068 containerd[1529]: time="2025-09-12T22:11:59.744005490Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:11:59.745207 containerd[1529]: time="2025-09-12T22:11:59.745065254Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.5: active requests=0, bytes read=18295979"
Sep 12 22:11:59.746088 containerd[1529]: time="2025-09-12T22:11:59.746065611Z" level=info msg="ImageCreate event name:\"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:11:59.748796 containerd[1529]: time="2025-09-12T22:11:59.748769001Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:11:59.750572 containerd[1529]: time="2025-09-12T22:11:59.750444515Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.5\" with image id \"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\", size \"19883910\" in 1.172061778s"
Sep 12 22:11:59.750572 containerd[1529]: time="2025-09-12T22:11:59.750487017Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\" returns image reference \"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\""
Sep 12 22:11:59.750962 containerd[1529]: time="2025-09-12T22:11:59.750914448Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\""
Sep 12 22:12:00.652337 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2877466421.mount: Deactivated successfully.
Sep 12 22:12:00.874979 containerd[1529]: time="2025-09-12T22:12:00.874916941Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:00.876100 containerd[1529]: time="2025-09-12T22:12:00.876072233Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.5: active requests=0, bytes read=28240108"
Sep 12 22:12:00.877207 containerd[1529]: time="2025-09-12T22:12:00.877040364Z" level=info msg="ImageCreate event name:\"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:00.879082 containerd[1529]: time="2025-09-12T22:12:00.879051522Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:00.879982 containerd[1529]: time="2025-09-12T22:12:00.879946478Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.5\" with image id \"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\", repo tag \"registry.k8s.io/kube-proxy:v1.33.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\", size \"28239125\" in 1.129007916s"
Sep 12 22:12:00.880167 containerd[1529]: time="2025-09-12T22:12:00.880069597Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\" returns image reference \"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\""
Sep 12 22:12:00.880717 containerd[1529]: time="2025-09-12T22:12:00.880665847Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Sep 12 22:12:01.424882 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2645099308.mount: Deactivated successfully.
Sep 12 22:12:02.339231 containerd[1529]: time="2025-09-12T22:12:02.339089261Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:02.339729 containerd[1529]: time="2025-09-12T22:12:02.339682088Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152119"
Sep 12 22:12:02.340600 containerd[1529]: time="2025-09-12T22:12:02.340534812Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:02.343787 containerd[1529]: time="2025-09-12T22:12:02.343730694Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:02.344807 containerd[1529]: time="2025-09-12T22:12:02.344356793Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.46365378s"
Sep 12 22:12:02.344807 containerd[1529]: time="2025-09-12T22:12:02.344391948Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\""
Sep 12 22:12:02.345165 containerd[1529]: time="2025-09-12T22:12:02.345143131Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 12 22:12:02.777630 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3153936846.mount: Deactivated successfully.
Sep 12 22:12:02.781791 containerd[1529]: time="2025-09-12T22:12:02.781746933Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 22:12:02.782448 containerd[1529]: time="2025-09-12T22:12:02.782238539Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705"
Sep 12 22:12:02.783108 containerd[1529]: time="2025-09-12T22:12:02.783073005Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 22:12:02.784914 containerd[1529]: time="2025-09-12T22:12:02.784881274Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 22:12:02.785777 containerd[1529]: time="2025-09-12T22:12:02.785703167Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 440.425743ms"
Sep 12 22:12:02.785777 containerd[1529]: time="2025-09-12T22:12:02.785733397Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 12 22:12:02.786440 containerd[1529]: time="2025-09-12T22:12:02.786415952Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
Sep 12 22:12:03.211009 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3782416529.mount: Deactivated successfully.
Sep 12 22:12:04.728587 containerd[1529]: time="2025-09-12T22:12:04.728538670Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:04.729309 containerd[1529]: time="2025-09-12T22:12:04.729015831Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69465859"
Sep 12 22:12:04.730329 containerd[1529]: time="2025-09-12T22:12:04.730295721Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:04.733055 containerd[1529]: time="2025-09-12T22:12:04.733026790Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:04.734228 containerd[1529]: time="2025-09-12T22:12:04.734196036Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 1.947751057s"
Sep 12 22:12:04.734228 containerd[1529]: time="2025-09-12T22:12:04.734228821Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\""
Sep 12 22:12:08.968890 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 12 22:12:08.970843 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:12:09.135343 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:12:09.150494 (kubelet)[2207]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:12:09.183008 kubelet[2207]: E0912 22:12:09.182946 2207 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:12:09.185750 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:12:09.185896 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:12:09.187330 systemd[1]: kubelet.service: Consumed 132ms CPU time, 107.5M memory peak.
Sep 12 22:12:09.361308 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:12:09.361446 systemd[1]: kubelet.service: Consumed 132ms CPU time, 107.5M memory peak.
Sep 12 22:12:09.363330 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:12:09.381267 systemd[1]: Reload requested from client PID 2222 ('systemctl') (unit session-7.scope)...
Sep 12 22:12:09.381280 systemd[1]: Reloading...
Sep 12 22:12:09.445277 zram_generator::config[2262]: No configuration found.
Sep 12 22:12:09.671639 systemd[1]: Reloading finished in 290 ms.
Sep 12 22:12:09.741654 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 12 22:12:09.741728 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 12 22:12:09.741959 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:12:09.742003 systemd[1]: kubelet.service: Consumed 87ms CPU time, 95M memory peak.
Sep 12 22:12:09.744443 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:12:09.902570 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:12:09.906663 (kubelet)[2310]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 22:12:09.937858 kubelet[2310]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 22:12:09.937858 kubelet[2310]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 12 22:12:09.937858 kubelet[2310]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 22:12:09.937858 kubelet[2310]: I0912 22:12:09.937826 2310 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 22:12:10.762195 kubelet[2310]: I0912 22:12:10.762157 2310 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Sep 12 22:12:10.762308 kubelet[2310]: I0912 22:12:10.762215 2310 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 22:12:10.762427 kubelet[2310]: I0912 22:12:10.762414 2310 server.go:956] "Client rotation is on, will bootstrap in background"
Sep 12 22:12:10.785589 kubelet[2310]: E0912 22:12:10.785510 2310 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.68:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.68:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Sep 12 22:12:10.786630 kubelet[2310]: I0912 22:12:10.786603 2310 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 22:12:10.793563 kubelet[2310]: I0912 22:12:10.793530 2310 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 12 22:12:10.796363 kubelet[2310]: I0912 22:12:10.796337 2310 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 22:12:10.797391 kubelet[2310]: I0912 22:12:10.797349 2310 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 22:12:10.797556 kubelet[2310]: I0912 22:12:10.797389 2310 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 22:12:10.797650 kubelet[2310]: I0912 22:12:10.797623 2310 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 22:12:10.797650 kubelet[2310]: I0912 22:12:10.797633 2310 container_manager_linux.go:303] "Creating device plugin manager"
Sep 12 22:12:10.797856 kubelet[2310]: I0912 22:12:10.797830 2310 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 22:12:10.800287 kubelet[2310]: I0912 22:12:10.800263 2310 kubelet.go:480] "Attempting to sync node with API server"
Sep 12 22:12:10.800287 kubelet[2310]: I0912 22:12:10.800287 2310 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 22:12:10.800382 kubelet[2310]: I0912 22:12:10.800315 2310 kubelet.go:386] "Adding apiserver pod source"
Sep 12 22:12:10.801434 kubelet[2310]: I0912 22:12:10.801292 2310 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 22:12:10.802278 kubelet[2310]: I0912 22:12:10.802226 2310 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 12 22:12:10.803227 kubelet[2310]: I0912 22:12:10.802977 2310 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Sep 12 22:12:10.803227 kubelet[2310]: W0912 22:12:10.803095 2310 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 12 22:12:10.803817 kubelet[2310]: E0912 22:12:10.803777 2310 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.68:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.68:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Sep 12 22:12:10.803817 kubelet[2310]: E0912 22:12:10.803778 2310 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.68:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.68:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Sep 12 22:12:10.805642 kubelet[2310]: I0912 22:12:10.805619 2310 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 12 22:12:10.805724 kubelet[2310]: I0912 22:12:10.805664 2310 server.go:1289] "Started kubelet"
Sep 12 22:12:10.805819 kubelet[2310]: I0912 22:12:10.805790 2310 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 22:12:10.807476 kubelet[2310]: I0912 22:12:10.807449 2310 server.go:317] "Adding debug handlers to kubelet server"
Sep 12 22:12:10.809721 kubelet[2310]: I0912 22:12:10.809653 2310 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 22:12:10.809943 kubelet[2310]: I0912 22:12:10.809919 2310 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 22:12:10.811042 kubelet[2310]: E0912 22:12:10.810108 2310 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.68:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.68:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1864a89943ffcd8d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-12 22:12:10.805636493 +0000 UTC m=+0.896028417,LastTimestamp:2025-09-12 22:12:10.805636493 +0000 UTC m=+0.896028417,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 12 22:12:10.812015 kubelet[2310]: I0912 22:12:10.811991 2310 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 22:12:10.812536 kubelet[2310]: I0912 22:12:10.812468 2310 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 12 22:12:10.812606 kubelet[2310]: I0912 22:12:10.812586 2310 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 22:12:10.812991 kubelet[2310]: E0912 22:12:10.812961 2310 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 12 22:12:10.813608 kubelet[2310]: I0912 22:12:10.813339 2310 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 12 22:12:10.813608 kubelet[2310]: I0912 22:12:10.813399 2310 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 22:12:10.813927 kubelet[2310]: E0912 22:12:10.813891 2310 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.68:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.68:6443: connect: connection refused" interval="200ms"
Sep 12 22:12:10.814147 kubelet[2310]: E0912 22:12:10.813915 2310 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.68:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.68:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Sep 12 22:12:10.815164 kubelet[2310]: I0912 22:12:10.815139 2310 factory.go:223] Registration of the containerd container factory successfully
Sep 12 22:12:10.815164 kubelet[2310]: I0912 22:12:10.815157 2310 factory.go:223] Registration of the systemd container factory successfully
Sep 12 22:12:10.815273 kubelet[2310]: I0912 22:12:10.815239 2310 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 22:12:10.816373 kubelet[2310]: E0912 22:12:10.816351 2310 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 12 22:12:10.824319 kubelet[2310]: I0912 22:12:10.824301 2310 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 12 22:12:10.824319 kubelet[2310]: I0912 22:12:10.824316 2310 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 12 22:12:10.824415 kubelet[2310]: I0912 22:12:10.824332 2310 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 22:12:10.913857 kubelet[2310]: E0912 22:12:10.913783 2310 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 12 22:12:11.009959 kubelet[2310]: I0912 22:12:11.009894 2310 policy_none.go:49] "None policy: Start"
Sep 12 22:12:11.009959 kubelet[2310]: I0912 22:12:11.009936 2310 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 12 22:12:11.009959 kubelet[2310]: I0912 22:12:11.009965 2310 state_mem.go:35] "Initializing new in-memory state store"
Sep 12 22:12:11.014001 kubelet[2310]: E0912 22:12:11.013918 2310 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 12 22:12:11.014535 kubelet[2310]: E0912 22:12:11.014504
2310 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.68:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.68:6443: connect: connection refused" interval="400ms" Sep 12 22:12:11.014827 kubelet[2310]: I0912 22:12:11.014794 2310 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 12 22:12:11.015881 kubelet[2310]: I0912 22:12:11.015851 2310 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 12 22:12:11.015881 kubelet[2310]: I0912 22:12:11.015873 2310 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 12 22:12:11.015948 kubelet[2310]: I0912 22:12:11.015890 2310 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 12 22:12:11.015948 kubelet[2310]: I0912 22:12:11.015897 2310 kubelet.go:2436] "Starting kubelet main sync loop" Sep 12 22:12:11.016884 kubelet[2310]: E0912 22:12:11.016731 2310 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.68:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.68:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 12 22:12:11.017055 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 22:12:11.017316 kubelet[2310]: E0912 22:12:11.016328 2310 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 22:12:11.025757 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 22:12:11.029020 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Sep 12 22:12:11.046143 kubelet[2310]: E0912 22:12:11.046051 2310 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 12 22:12:11.046477 kubelet[2310]: I0912 22:12:11.046302 2310 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 22:12:11.046477 kubelet[2310]: I0912 22:12:11.046315 2310 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 22:12:11.046664 kubelet[2310]: I0912 22:12:11.046592 2310 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 22:12:11.047830 kubelet[2310]: E0912 22:12:11.047588 2310 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 12 22:12:11.047830 kubelet[2310]: E0912 22:12:11.047731 2310 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 12 22:12:11.128541 systemd[1]: Created slice kubepods-burstable-pod399211a1f5532b3485bd9a297c9b4958.slice - libcontainer container kubepods-burstable-pod399211a1f5532b3485bd9a297c9b4958.slice. 
Sep 12 22:12:11.149162 kubelet[2310]: I0912 22:12:11.149129 2310 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 22:12:11.149668 kubelet[2310]: E0912 22:12:11.149638 2310 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.68:6443/api/v1/nodes\": dial tcp 10.0.0.68:6443: connect: connection refused" node="localhost" Sep 12 22:12:11.153019 kubelet[2310]: E0912 22:12:11.152989 2310 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:12:11.156575 systemd[1]: Created slice kubepods-burstable-podb678d5c6713e936e66aa5bb73166297e.slice - libcontainer container kubepods-burstable-podb678d5c6713e936e66aa5bb73166297e.slice. Sep 12 22:12:11.175551 kubelet[2310]: E0912 22:12:11.175507 2310 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:12:11.178490 systemd[1]: Created slice kubepods-burstable-pod7b968cf906b2d9d713a362c43868bef2.slice - libcontainer container kubepods-burstable-pod7b968cf906b2d9d713a362c43868bef2.slice. 
Sep 12 22:12:11.180083 kubelet[2310]: E0912 22:12:11.180050 2310 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:12:11.215460 kubelet[2310]: I0912 22:12:11.215419 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:11.215460 kubelet[2310]: I0912 22:12:11.215462 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:11.215594 kubelet[2310]: I0912 22:12:11.215483 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:11.215594 kubelet[2310]: I0912 22:12:11.215500 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/399211a1f5532b3485bd9a297c9b4958-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"399211a1f5532b3485bd9a297c9b4958\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:12:11.215594 kubelet[2310]: I0912 22:12:11.215557 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/399211a1f5532b3485bd9a297c9b4958-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"399211a1f5532b3485bd9a297c9b4958\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:12:11.215657 kubelet[2310]: I0912 22:12:11.215602 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/399211a1f5532b3485bd9a297c9b4958-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"399211a1f5532b3485bd9a297c9b4958\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:12:11.215657 kubelet[2310]: I0912 22:12:11.215638 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:11.215695 kubelet[2310]: I0912 22:12:11.215656 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:11.215695 kubelet[2310]: I0912 22:12:11.215673 2310 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7b968cf906b2d9d713a362c43868bef2-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"7b968cf906b2d9d713a362c43868bef2\") " pod="kube-system/kube-scheduler-localhost" Sep 12 22:12:11.351241 kubelet[2310]: I0912 22:12:11.351126 2310 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 22:12:11.351626 kubelet[2310]: 
E0912 22:12:11.351503 2310 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.68:6443/api/v1/nodes\": dial tcp 10.0.0.68:6443: connect: connection refused" node="localhost" Sep 12 22:12:11.415273 kubelet[2310]: E0912 22:12:11.415224 2310 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.68:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.68:6443: connect: connection refused" interval="800ms" Sep 12 22:12:11.454191 containerd[1529]: time="2025-09-12T22:12:11.454139091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:399211a1f5532b3485bd9a297c9b4958,Namespace:kube-system,Attempt:0,}" Sep 12 22:12:11.469116 containerd[1529]: time="2025-09-12T22:12:11.469079458Z" level=info msg="connecting to shim 60cb690d9a10c6112253682068914c6ad98dda8778228ac696d065b161b809bf" address="unix:///run/containerd/s/7cecfa41c555eeb787d79667fbebc9af3b3b7b32a5b15b6c513b035a14829280" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:12:11.477384 containerd[1529]: time="2025-09-12T22:12:11.477348400Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:b678d5c6713e936e66aa5bb73166297e,Namespace:kube-system,Attempt:0,}" Sep 12 22:12:11.481075 containerd[1529]: time="2025-09-12T22:12:11.481039739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:7b968cf906b2d9d713a362c43868bef2,Namespace:kube-system,Attempt:0,}" Sep 12 22:12:11.499407 systemd[1]: Started cri-containerd-60cb690d9a10c6112253682068914c6ad98dda8778228ac696d065b161b809bf.scope - libcontainer container 60cb690d9a10c6112253682068914c6ad98dda8778228ac696d065b161b809bf. 
Sep 12 22:12:11.505475 containerd[1529]: time="2025-09-12T22:12:11.505413514Z" level=info msg="connecting to shim 3920f559ef672aa525f9410d6ccf5777e8591f45306180313fc983ef567e1458" address="unix:///run/containerd/s/bcb9d76c371525a589b91d57061480bc8f37f614b8d8b58e28a37b9310a8e190" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:12:11.509779 containerd[1529]: time="2025-09-12T22:12:11.509678183Z" level=info msg="connecting to shim fe89e04f1bc88cb3b7198a3ce5758d2490454a808ce30323b6f79f2f885b50f7" address="unix:///run/containerd/s/1ce4c22d47b118888a0a3c2b61e32e10a07d2e94360357ada30d030ee38fb767" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:12:11.540598 systemd[1]: Started cri-containerd-3920f559ef672aa525f9410d6ccf5777e8591f45306180313fc983ef567e1458.scope - libcontainer container 3920f559ef672aa525f9410d6ccf5777e8591f45306180313fc983ef567e1458. Sep 12 22:12:11.542175 systemd[1]: Started cri-containerd-fe89e04f1bc88cb3b7198a3ce5758d2490454a808ce30323b6f79f2f885b50f7.scope - libcontainer container fe89e04f1bc88cb3b7198a3ce5758d2490454a808ce30323b6f79f2f885b50f7. 
Sep 12 22:12:11.552566 containerd[1529]: time="2025-09-12T22:12:11.552515975Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:399211a1f5532b3485bd9a297c9b4958,Namespace:kube-system,Attempt:0,} returns sandbox id \"60cb690d9a10c6112253682068914c6ad98dda8778228ac696d065b161b809bf\"" Sep 12 22:12:11.559339 containerd[1529]: time="2025-09-12T22:12:11.559296073Z" level=info msg="CreateContainer within sandbox \"60cb690d9a10c6112253682068914c6ad98dda8778228ac696d065b161b809bf\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 22:12:11.567863 containerd[1529]: time="2025-09-12T22:12:11.567822691Z" level=info msg="Container 4c9c08b85f5e5acd3cc44d586677072a8885f7cafdd83cdc0c239889cb92ec8c: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:11.576156 containerd[1529]: time="2025-09-12T22:12:11.576080549Z" level=info msg="CreateContainer within sandbox \"60cb690d9a10c6112253682068914c6ad98dda8778228ac696d065b161b809bf\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4c9c08b85f5e5acd3cc44d586677072a8885f7cafdd83cdc0c239889cb92ec8c\"" Sep 12 22:12:11.577685 containerd[1529]: time="2025-09-12T22:12:11.577628490Z" level=info msg="StartContainer for \"4c9c08b85f5e5acd3cc44d586677072a8885f7cafdd83cdc0c239889cb92ec8c\"" Sep 12 22:12:11.579855 containerd[1529]: time="2025-09-12T22:12:11.579763325Z" level=info msg="connecting to shim 4c9c08b85f5e5acd3cc44d586677072a8885f7cafdd83cdc0c239889cb92ec8c" address="unix:///run/containerd/s/7cecfa41c555eeb787d79667fbebc9af3b3b7b32a5b15b6c513b035a14829280" protocol=ttrpc version=3 Sep 12 22:12:11.588984 containerd[1529]: time="2025-09-12T22:12:11.588927093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:7b968cf906b2d9d713a362c43868bef2,Namespace:kube-system,Attempt:0,} returns sandbox id \"fe89e04f1bc88cb3b7198a3ce5758d2490454a808ce30323b6f79f2f885b50f7\"" Sep 12 22:12:11.590815 containerd[1529]: 
time="2025-09-12T22:12:11.590768321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:b678d5c6713e936e66aa5bb73166297e,Namespace:kube-system,Attempt:0,} returns sandbox id \"3920f559ef672aa525f9410d6ccf5777e8591f45306180313fc983ef567e1458\"" Sep 12 22:12:11.593565 containerd[1529]: time="2025-09-12T22:12:11.593537025Z" level=info msg="CreateContainer within sandbox \"fe89e04f1bc88cb3b7198a3ce5758d2490454a808ce30323b6f79f2f885b50f7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 22:12:11.595645 containerd[1529]: time="2025-09-12T22:12:11.595612403Z" level=info msg="CreateContainer within sandbox \"3920f559ef672aa525f9410d6ccf5777e8591f45306180313fc983ef567e1458\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 22:12:11.602282 containerd[1529]: time="2025-09-12T22:12:11.602199444Z" level=info msg="Container 1e996e14f1534f6abdbf6ef97534d5b41777e7f5483cd6c5e7f4e7170c9e76c4: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:11.606395 systemd[1]: Started cri-containerd-4c9c08b85f5e5acd3cc44d586677072a8885f7cafdd83cdc0c239889cb92ec8c.scope - libcontainer container 4c9c08b85f5e5acd3cc44d586677072a8885f7cafdd83cdc0c239889cb92ec8c. 
Sep 12 22:12:11.615212 containerd[1529]: time="2025-09-12T22:12:11.615128292Z" level=info msg="Container 509f7e4b9b93066a6fdda6286cb4b70208e290695613e266eb6151026c53eb25: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:11.616695 containerd[1529]: time="2025-09-12T22:12:11.616660629Z" level=info msg="CreateContainer within sandbox \"fe89e04f1bc88cb3b7198a3ce5758d2490454a808ce30323b6f79f2f885b50f7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"1e996e14f1534f6abdbf6ef97534d5b41777e7f5483cd6c5e7f4e7170c9e76c4\"" Sep 12 22:12:11.618260 containerd[1529]: time="2025-09-12T22:12:11.617346273Z" level=info msg="StartContainer for \"1e996e14f1534f6abdbf6ef97534d5b41777e7f5483cd6c5e7f4e7170c9e76c4\"" Sep 12 22:12:11.618712 containerd[1529]: time="2025-09-12T22:12:11.618682190Z" level=info msg="connecting to shim 1e996e14f1534f6abdbf6ef97534d5b41777e7f5483cd6c5e7f4e7170c9e76c4" address="unix:///run/containerd/s/1ce4c22d47b118888a0a3c2b61e32e10a07d2e94360357ada30d030ee38fb767" protocol=ttrpc version=3 Sep 12 22:12:11.623844 containerd[1529]: time="2025-09-12T22:12:11.623805795Z" level=info msg="CreateContainer within sandbox \"3920f559ef672aa525f9410d6ccf5777e8591f45306180313fc983ef567e1458\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"509f7e4b9b93066a6fdda6286cb4b70208e290695613e266eb6151026c53eb25\"" Sep 12 22:12:11.624437 containerd[1529]: time="2025-09-12T22:12:11.624409295Z" level=info msg="StartContainer for \"509f7e4b9b93066a6fdda6286cb4b70208e290695613e266eb6151026c53eb25\"" Sep 12 22:12:11.625671 containerd[1529]: time="2025-09-12T22:12:11.625645063Z" level=info msg="connecting to shim 509f7e4b9b93066a6fdda6286cb4b70208e290695613e266eb6151026c53eb25" address="unix:///run/containerd/s/bcb9d76c371525a589b91d57061480bc8f37f614b8d8b58e28a37b9310a8e190" protocol=ttrpc version=3 Sep 12 22:12:11.642353 systemd[1]: Started 
cri-containerd-1e996e14f1534f6abdbf6ef97534d5b41777e7f5483cd6c5e7f4e7170c9e76c4.scope - libcontainer container 1e996e14f1534f6abdbf6ef97534d5b41777e7f5483cd6c5e7f4e7170c9e76c4. Sep 12 22:12:11.645761 systemd[1]: Started cri-containerd-509f7e4b9b93066a6fdda6286cb4b70208e290695613e266eb6151026c53eb25.scope - libcontainer container 509f7e4b9b93066a6fdda6286cb4b70208e290695613e266eb6151026c53eb25. Sep 12 22:12:11.653223 containerd[1529]: time="2025-09-12T22:12:11.653158853Z" level=info msg="StartContainer for \"4c9c08b85f5e5acd3cc44d586677072a8885f7cafdd83cdc0c239889cb92ec8c\" returns successfully" Sep 12 22:12:11.690405 containerd[1529]: time="2025-09-12T22:12:11.690294947Z" level=info msg="StartContainer for \"1e996e14f1534f6abdbf6ef97534d5b41777e7f5483cd6c5e7f4e7170c9e76c4\" returns successfully" Sep 12 22:12:11.696069 containerd[1529]: time="2025-09-12T22:12:11.696036576Z" level=info msg="StartContainer for \"509f7e4b9b93066a6fdda6286cb4b70208e290695613e266eb6151026c53eb25\" returns successfully" Sep 12 22:12:11.709136 kubelet[2310]: E0912 22:12:11.709095 2310 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.68:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.68:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 12 22:12:11.754425 kubelet[2310]: I0912 22:12:11.754386 2310 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 22:12:12.023373 kubelet[2310]: E0912 22:12:12.023343 2310 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:12:12.028433 kubelet[2310]: E0912 22:12:12.028407 2310 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:12:12.029877 
kubelet[2310]: E0912 22:12:12.029858 2310 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:12:13.032743 kubelet[2310]: E0912 22:12:13.032698 2310 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:12:13.033064 kubelet[2310]: E0912 22:12:13.032999 2310 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 22:12:13.273510 kubelet[2310]: E0912 22:12:13.273462 2310 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 12 22:12:13.367514 kubelet[2310]: I0912 22:12:13.367393 2310 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 22:12:13.414163 kubelet[2310]: I0912 22:12:13.414120 2310 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 22:12:13.419446 kubelet[2310]: E0912 22:12:13.419406 2310 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 12 22:12:13.419446 kubelet[2310]: I0912 22:12:13.419440 2310 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:13.422029 kubelet[2310]: E0912 22:12:13.421995 2310 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:13.422192 kubelet[2310]: I0912 22:12:13.422108 2310 kubelet.go:3309] "Creating a mirror pod for static pod" 
pod="kube-system/kube-scheduler-localhost" Sep 12 22:12:13.426459 kubelet[2310]: E0912 22:12:13.426404 2310 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 12 22:12:13.803638 kubelet[2310]: I0912 22:12:13.803359 2310 apiserver.go:52] "Watching apiserver" Sep 12 22:12:13.813986 kubelet[2310]: I0912 22:12:13.813965 2310 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 22:12:14.935867 kubelet[2310]: I0912 22:12:14.935831 2310 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 22:12:15.350363 systemd[1]: Reload requested from client PID 2596 ('systemctl') (unit session-7.scope)... Sep 12 22:12:15.350382 systemd[1]: Reloading... Sep 12 22:12:15.426223 zram_generator::config[2642]: No configuration found. Sep 12 22:12:15.608322 systemd[1]: Reloading finished in 257 ms. Sep 12 22:12:15.640763 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:12:15.650498 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 22:12:15.650701 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:12:15.650746 systemd[1]: kubelet.service: Consumed 1.267s CPU time, 128M memory peak. Sep 12 22:12:15.652897 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:12:15.791299 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:12:15.794854 (kubelet)[2681]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 22:12:15.827766 kubelet[2681]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:12:15.827766 kubelet[2681]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 22:12:15.827766 kubelet[2681]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:12:15.828094 kubelet[2681]: I0912 22:12:15.827816 2681 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 22:12:15.834091 kubelet[2681]: I0912 22:12:15.833796 2681 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 12 22:12:15.834091 kubelet[2681]: I0912 22:12:15.833824 2681 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 22:12:15.834091 kubelet[2681]: I0912 22:12:15.834027 2681 server.go:956] "Client rotation is on, will bootstrap in background" Sep 12 22:12:15.835346 kubelet[2681]: I0912 22:12:15.835301 2681 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 12 22:12:15.839775 kubelet[2681]: I0912 22:12:15.839744 2681 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 22:12:15.842959 kubelet[2681]: I0912 22:12:15.842938 2681 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 22:12:15.845612 kubelet[2681]: I0912 22:12:15.845594 2681 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 22:12:15.845835 kubelet[2681]: I0912 22:12:15.845815 2681 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 22:12:15.846044 kubelet[2681]: I0912 22:12:15.845839 2681 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 22:12:15.846044 kubelet[2681]: I0912 22:12:15.845997 2681 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 22:12:15.846044 
kubelet[2681]: I0912 22:12:15.846005 2681 container_manager_linux.go:303] "Creating device plugin manager" Sep 12 22:12:15.846044 kubelet[2681]: I0912 22:12:15.846045 2681 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:12:15.846227 kubelet[2681]: I0912 22:12:15.846179 2681 kubelet.go:480] "Attempting to sync node with API server" Sep 12 22:12:15.846227 kubelet[2681]: I0912 22:12:15.846208 2681 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 22:12:15.846268 kubelet[2681]: I0912 22:12:15.846234 2681 kubelet.go:386] "Adding apiserver pod source" Sep 12 22:12:15.846268 kubelet[2681]: I0912 22:12:15.846249 2681 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 22:12:15.846919 kubelet[2681]: I0912 22:12:15.846870 2681 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 22:12:15.849283 kubelet[2681]: I0912 22:12:15.847397 2681 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 12 22:12:15.854234 kubelet[2681]: I0912 22:12:15.854164 2681 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 22:12:15.854394 kubelet[2681]: I0912 22:12:15.854366 2681 server.go:1289] "Started kubelet" Sep 12 22:12:15.854828 kubelet[2681]: I0912 22:12:15.854691 2681 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 22:12:15.854972 kubelet[2681]: I0912 22:12:15.854930 2681 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 22:12:15.858482 kubelet[2681]: I0912 22:12:15.858411 2681 server.go:317] "Adding debug handlers to kubelet server" Sep 12 22:12:15.859557 kubelet[2681]: I0912 22:12:15.859375 2681 dynamic_serving_content.go:135] "Starting controller" 
name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 22:12:15.860333 kubelet[2681]: I0912 22:12:15.859300 2681 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 22:12:15.861201 kubelet[2681]: I0912 22:12:15.859283 2681 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 22:12:15.861407 kubelet[2681]: I0912 22:12:15.861386 2681 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 22:12:15.862295 kubelet[2681]: E0912 22:12:15.862079 2681 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 22:12:15.862295 kubelet[2681]: I0912 22:12:15.862113 2681 reconciler.go:26] "Reconciler: start to sync state" Sep 12 22:12:15.862409 kubelet[2681]: I0912 22:12:15.862384 2681 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 22:12:15.863102 kubelet[2681]: I0912 22:12:15.863078 2681 factory.go:223] Registration of the systemd container factory successfully Sep 12 22:12:15.863338 kubelet[2681]: I0912 22:12:15.863312 2681 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 22:12:15.865432 kubelet[2681]: E0912 22:12:15.865404 2681 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 22:12:15.865584 kubelet[2681]: I0912 22:12:15.865547 2681 factory.go:223] Registration of the containerd container factory successfully Sep 12 22:12:15.882146 kubelet[2681]: I0912 22:12:15.882091 2681 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 12 22:12:15.884100 kubelet[2681]: I0912 22:12:15.884016 2681 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Sep 12 22:12:15.884378 kubelet[2681]: I0912 22:12:15.884122 2681 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 12 22:12:15.884378 kubelet[2681]: I0912 22:12:15.884236 2681 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 12 22:12:15.884378 kubelet[2681]: I0912 22:12:15.884246 2681 kubelet.go:2436] "Starting kubelet main sync loop" Sep 12 22:12:15.884378 kubelet[2681]: E0912 22:12:15.884285 2681 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 22:12:15.906361 kubelet[2681]: I0912 22:12:15.906334 2681 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 22:12:15.906361 kubelet[2681]: I0912 22:12:15.906353 2681 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 22:12:15.906465 kubelet[2681]: I0912 22:12:15.906374 2681 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:12:15.906523 kubelet[2681]: I0912 22:12:15.906506 2681 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 22:12:15.906569 kubelet[2681]: I0912 22:12:15.906523 2681 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 22:12:15.906569 kubelet[2681]: I0912 22:12:15.906541 2681 policy_none.go:49] "None policy: Start" Sep 12 22:12:15.906569 kubelet[2681]: I0912 22:12:15.906549 2681 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 22:12:15.906569 kubelet[2681]: I0912 22:12:15.906558 2681 state_mem.go:35] "Initializing new in-memory state store" Sep 12 22:12:15.906644 kubelet[2681]: I0912 22:12:15.906636 2681 state_mem.go:75] "Updated machine memory state" Sep 12 22:12:15.910279 kubelet[2681]: E0912 22:12:15.910256 2681 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 12 22:12:15.910875 kubelet[2681]: I0912 
22:12:15.910772 2681 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 22:12:15.911050 kubelet[2681]: I0912 22:12:15.910791 2681 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 22:12:15.911377 kubelet[2681]: I0912 22:12:15.911262 2681 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 22:12:15.912004 kubelet[2681]: E0912 22:12:15.911770 2681 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 12 22:12:15.985812 kubelet[2681]: I0912 22:12:15.985774 2681 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 22:12:15.986068 kubelet[2681]: I0912 22:12:15.985828 2681 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 22:12:15.986145 kubelet[2681]: I0912 22:12:15.985923 2681 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:15.993573 kubelet[2681]: E0912 22:12:15.993543 2681 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 12 22:12:16.013936 kubelet[2681]: I0912 22:12:16.013896 2681 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 22:12:16.021748 kubelet[2681]: I0912 22:12:16.021582 2681 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 12 22:12:16.021748 kubelet[2681]: I0912 22:12:16.021684 2681 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 22:12:16.164427 kubelet[2681]: I0912 22:12:16.164318 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/399211a1f5532b3485bd9a297c9b4958-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"399211a1f5532b3485bd9a297c9b4958\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:12:16.164689 kubelet[2681]: I0912 22:12:16.164556 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:16.164689 kubelet[2681]: I0912 22:12:16.164582 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:16.164689 kubelet[2681]: I0912 22:12:16.164623 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:16.164689 kubelet[2681]: I0912 22:12:16.164647 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:16.164689 kubelet[2681]: I0912 22:12:16.164667 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-certs\" (UniqueName: \"kubernetes.io/host-path/399211a1f5532b3485bd9a297c9b4958-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"399211a1f5532b3485bd9a297c9b4958\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:12:16.164961 kubelet[2681]: I0912 22:12:16.164846 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:16.164961 kubelet[2681]: I0912 22:12:16.164869 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7b968cf906b2d9d713a362c43868bef2-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"7b968cf906b2d9d713a362c43868bef2\") " pod="kube-system/kube-scheduler-localhost" Sep 12 22:12:16.164961 kubelet[2681]: I0912 22:12:16.164883 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/399211a1f5532b3485bd9a297c9b4958-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"399211a1f5532b3485bd9a297c9b4958\") " pod="kube-system/kube-apiserver-localhost" Sep 12 22:12:16.847381 kubelet[2681]: I0912 22:12:16.847339 2681 apiserver.go:52] "Watching apiserver" Sep 12 22:12:16.862790 kubelet[2681]: I0912 22:12:16.862752 2681 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 22:12:16.899370 kubelet[2681]: I0912 22:12:16.899333 2681 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:16.904654 kubelet[2681]: E0912 22:12:16.904577 2681 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already 
exists" pod="kube-system/kube-controller-manager-localhost" Sep 12 22:12:16.929233 kubelet[2681]: I0912 22:12:16.927615 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=2.9275999820000003 podStartE2EDuration="2.927599982s" podCreationTimestamp="2025-09-12 22:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:12:16.917060252 +0000 UTC m=+1.119054398" watchObservedRunningTime="2025-09-12 22:12:16.927599982 +0000 UTC m=+1.129594168" Sep 12 22:12:16.929233 kubelet[2681]: I0912 22:12:16.927749 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.9277445640000002 podStartE2EDuration="1.927744564s" podCreationTimestamp="2025-09-12 22:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:12:16.925467056 +0000 UTC m=+1.127461282" watchObservedRunningTime="2025-09-12 22:12:16.927744564 +0000 UTC m=+1.129738750" Sep 12 22:12:16.937126 kubelet[2681]: I0912 22:12:16.936495 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.936479658 podStartE2EDuration="1.936479658s" podCreationTimestamp="2025-09-12 22:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:12:16.936436131 +0000 UTC m=+1.138430317" watchObservedRunningTime="2025-09-12 22:12:16.936479658 +0000 UTC m=+1.138473844" Sep 12 22:12:21.096285 kubelet[2681]: I0912 22:12:21.096254 2681 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 22:12:21.096677 containerd[1529]: 
time="2025-09-12T22:12:21.096555735Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 22:12:21.096861 kubelet[2681]: I0912 22:12:21.096729 2681 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 22:12:22.046295 systemd[1]: Created slice kubepods-besteffort-pod21385d3a_4dd4_4a7d_8637_22deb2152ab3.slice - libcontainer container kubepods-besteffort-pod21385d3a_4dd4_4a7d_8637_22deb2152ab3.slice. Sep 12 22:12:22.101740 kubelet[2681]: I0912 22:12:22.101666 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/21385d3a-4dd4-4a7d-8637-22deb2152ab3-kube-proxy\") pod \"kube-proxy-csqks\" (UID: \"21385d3a-4dd4-4a7d-8637-22deb2152ab3\") " pod="kube-system/kube-proxy-csqks" Sep 12 22:12:22.101740 kubelet[2681]: I0912 22:12:22.101717 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/21385d3a-4dd4-4a7d-8637-22deb2152ab3-xtables-lock\") pod \"kube-proxy-csqks\" (UID: \"21385d3a-4dd4-4a7d-8637-22deb2152ab3\") " pod="kube-system/kube-proxy-csqks" Sep 12 22:12:22.101740 kubelet[2681]: I0912 22:12:22.101734 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21385d3a-4dd4-4a7d-8637-22deb2152ab3-lib-modules\") pod \"kube-proxy-csqks\" (UID: \"21385d3a-4dd4-4a7d-8637-22deb2152ab3\") " pod="kube-system/kube-proxy-csqks" Sep 12 22:12:22.101740 kubelet[2681]: I0912 22:12:22.101753 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhcj5\" (UniqueName: \"kubernetes.io/projected/21385d3a-4dd4-4a7d-8637-22deb2152ab3-kube-api-access-zhcj5\") pod \"kube-proxy-csqks\" (UID: \"21385d3a-4dd4-4a7d-8637-22deb2152ab3\") " 
pod="kube-system/kube-proxy-csqks" Sep 12 22:12:22.261630 systemd[1]: Created slice kubepods-besteffort-poddc142fcd_8dc3_4547_974e_bb29f3e6961c.slice - libcontainer container kubepods-besteffort-poddc142fcd_8dc3_4547_974e_bb29f3e6961c.slice. Sep 12 22:12:22.303299 kubelet[2681]: I0912 22:12:22.303157 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hcm5\" (UniqueName: \"kubernetes.io/projected/dc142fcd-8dc3-4547-974e-bb29f3e6961c-kube-api-access-6hcm5\") pod \"tigera-operator-755d956888-57r9c\" (UID: \"dc142fcd-8dc3-4547-974e-bb29f3e6961c\") " pod="tigera-operator/tigera-operator-755d956888-57r9c" Sep 12 22:12:22.303299 kubelet[2681]: I0912 22:12:22.303216 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/dc142fcd-8dc3-4547-974e-bb29f3e6961c-var-lib-calico\") pod \"tigera-operator-755d956888-57r9c\" (UID: \"dc142fcd-8dc3-4547-974e-bb29f3e6961c\") " pod="tigera-operator/tigera-operator-755d956888-57r9c" Sep 12 22:12:22.362304 containerd[1529]: time="2025-09-12T22:12:22.362250040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-csqks,Uid:21385d3a-4dd4-4a7d-8637-22deb2152ab3,Namespace:kube-system,Attempt:0,}" Sep 12 22:12:22.377445 containerd[1529]: time="2025-09-12T22:12:22.377405297Z" level=info msg="connecting to shim 05c7fa776fa43c3cb55ab794cfcbdb21c1916d86164063c82560f82935a3245a" address="unix:///run/containerd/s/fb1acef043799aa29f0200d35d9ee3647ddc23fbb0e1fb12672e7cf85a2d0d54" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:12:22.402382 systemd[1]: Started cri-containerd-05c7fa776fa43c3cb55ab794cfcbdb21c1916d86164063c82560f82935a3245a.scope - libcontainer container 05c7fa776fa43c3cb55ab794cfcbdb21c1916d86164063c82560f82935a3245a. 
Sep 12 22:12:22.427326 containerd[1529]: time="2025-09-12T22:12:22.427288434Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-csqks,Uid:21385d3a-4dd4-4a7d-8637-22deb2152ab3,Namespace:kube-system,Attempt:0,} returns sandbox id \"05c7fa776fa43c3cb55ab794cfcbdb21c1916d86164063c82560f82935a3245a\"" Sep 12 22:12:22.432264 containerd[1529]: time="2025-09-12T22:12:22.431867129Z" level=info msg="CreateContainer within sandbox \"05c7fa776fa43c3cb55ab794cfcbdb21c1916d86164063c82560f82935a3245a\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 22:12:22.441785 containerd[1529]: time="2025-09-12T22:12:22.440894471Z" level=info msg="Container 9418069be38fae3ce3198b700d1fae26c64a79cf7c7cd88be4989ec441979778: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:22.447687 containerd[1529]: time="2025-09-12T22:12:22.447646226Z" level=info msg="CreateContainer within sandbox \"05c7fa776fa43c3cb55ab794cfcbdb21c1916d86164063c82560f82935a3245a\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9418069be38fae3ce3198b700d1fae26c64a79cf7c7cd88be4989ec441979778\"" Sep 12 22:12:22.448243 containerd[1529]: time="2025-09-12T22:12:22.448214623Z" level=info msg="StartContainer for \"9418069be38fae3ce3198b700d1fae26c64a79cf7c7cd88be4989ec441979778\"" Sep 12 22:12:22.449675 containerd[1529]: time="2025-09-12T22:12:22.449638315Z" level=info msg="connecting to shim 9418069be38fae3ce3198b700d1fae26c64a79cf7c7cd88be4989ec441979778" address="unix:///run/containerd/s/fb1acef043799aa29f0200d35d9ee3647ddc23fbb0e1fb12672e7cf85a2d0d54" protocol=ttrpc version=3 Sep 12 22:12:22.473429 systemd[1]: Started cri-containerd-9418069be38fae3ce3198b700d1fae26c64a79cf7c7cd88be4989ec441979778.scope - libcontainer container 9418069be38fae3ce3198b700d1fae26c64a79cf7c7cd88be4989ec441979778. 
Sep 12 22:12:22.507317 containerd[1529]: time="2025-09-12T22:12:22.507272791Z" level=info msg="StartContainer for \"9418069be38fae3ce3198b700d1fae26c64a79cf7c7cd88be4989ec441979778\" returns successfully" Sep 12 22:12:22.565396 containerd[1529]: time="2025-09-12T22:12:22.565257330Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-57r9c,Uid:dc142fcd-8dc3-4547-974e-bb29f3e6961c,Namespace:tigera-operator,Attempt:0,}" Sep 12 22:12:22.580685 containerd[1529]: time="2025-09-12T22:12:22.580642802Z" level=info msg="connecting to shim 868140fc32e9b0d6e0bcc754e0b76004cd10164e68ad258eb9154b397eb41a8d" address="unix:///run/containerd/s/30adce9444bc2638562e7091588b4c3f225cf9427504ea4a0b9f335878e7b0cc" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:12:22.604361 systemd[1]: Started cri-containerd-868140fc32e9b0d6e0bcc754e0b76004cd10164e68ad258eb9154b397eb41a8d.scope - libcontainer container 868140fc32e9b0d6e0bcc754e0b76004cd10164e68ad258eb9154b397eb41a8d. Sep 12 22:12:22.643624 containerd[1529]: time="2025-09-12T22:12:22.643568779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-57r9c,Uid:dc142fcd-8dc3-4547-974e-bb29f3e6961c,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"868140fc32e9b0d6e0bcc754e0b76004cd10164e68ad258eb9154b397eb41a8d\"" Sep 12 22:12:22.645492 containerd[1529]: time="2025-09-12T22:12:22.645460861Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 22:12:22.925379 kubelet[2681]: I0912 22:12:22.925106 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-csqks" podStartSLOduration=0.925088131 podStartE2EDuration="925.088131ms" podCreationTimestamp="2025-09-12 22:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:12:22.924434769 +0000 UTC m=+7.126428955" watchObservedRunningTime="2025-09-12 
22:12:22.925088131 +0000 UTC m=+7.127082317" Sep 12 22:12:24.036482 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount14060463.mount: Deactivated successfully. Sep 12 22:12:24.345175 containerd[1529]: time="2025-09-12T22:12:24.345055734Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:24.346583 containerd[1529]: time="2025-09-12T22:12:24.346538620Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 12 22:12:24.347156 containerd[1529]: time="2025-09-12T22:12:24.347134254Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:24.349626 containerd[1529]: time="2025-09-12T22:12:24.349572715Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:24.350382 containerd[1529]: time="2025-09-12T22:12:24.350271195Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 1.704773011s" Sep 12 22:12:24.350382 containerd[1529]: time="2025-09-12T22:12:24.350300317Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 12 22:12:24.356460 containerd[1529]: time="2025-09-12T22:12:24.356422550Z" level=info msg="CreateContainer within sandbox \"868140fc32e9b0d6e0bcc754e0b76004cd10164e68ad258eb9154b397eb41a8d\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 22:12:24.363294 containerd[1529]: time="2025-09-12T22:12:24.362704032Z" level=info msg="Container 1c2a35a2767d7381f23062fe70987d303cbba3735a7c40c1ea1335c59a007c3c: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:24.364756 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2057892630.mount: Deactivated successfully. Sep 12 22:12:24.369791 containerd[1529]: time="2025-09-12T22:12:24.369740918Z" level=info msg="CreateContainer within sandbox \"868140fc32e9b0d6e0bcc754e0b76004cd10164e68ad258eb9154b397eb41a8d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"1c2a35a2767d7381f23062fe70987d303cbba3735a7c40c1ea1335c59a007c3c\"" Sep 12 22:12:24.370573 containerd[1529]: time="2025-09-12T22:12:24.370482601Z" level=info msg="StartContainer for \"1c2a35a2767d7381f23062fe70987d303cbba3735a7c40c1ea1335c59a007c3c\"" Sep 12 22:12:24.371632 containerd[1529]: time="2025-09-12T22:12:24.371588145Z" level=info msg="connecting to shim 1c2a35a2767d7381f23062fe70987d303cbba3735a7c40c1ea1335c59a007c3c" address="unix:///run/containerd/s/30adce9444bc2638562e7091588b4c3f225cf9427504ea4a0b9f335878e7b0cc" protocol=ttrpc version=3 Sep 12 22:12:24.393333 systemd[1]: Started cri-containerd-1c2a35a2767d7381f23062fe70987d303cbba3735a7c40c1ea1335c59a007c3c.scope - libcontainer container 1c2a35a2767d7381f23062fe70987d303cbba3735a7c40c1ea1335c59a007c3c. 
Sep 12 22:12:24.423089 containerd[1529]: time="2025-09-12T22:12:24.422968829Z" level=info msg="StartContainer for \"1c2a35a2767d7381f23062fe70987d303cbba3735a7c40c1ea1335c59a007c3c\" returns successfully" Sep 12 22:12:26.564969 kubelet[2681]: I0912 22:12:26.564901 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-57r9c" podStartSLOduration=2.856645523 podStartE2EDuration="4.564886899s" podCreationTimestamp="2025-09-12 22:12:22 +0000 UTC" firstStartedPulling="2025-09-12 22:12:22.644773577 +0000 UTC m=+6.846767723" lastFinishedPulling="2025-09-12 22:12:24.353014913 +0000 UTC m=+8.555009099" observedRunningTime="2025-09-12 22:12:24.929119074 +0000 UTC m=+9.131113260" watchObservedRunningTime="2025-09-12 22:12:26.564886899 +0000 UTC m=+10.766881085" Sep 12 22:12:29.640545 sudo[1737]: pam_unix(sudo:session): session closed for user root Sep 12 22:12:29.642519 sshd[1736]: Connection closed by 10.0.0.1 port 50398 Sep 12 22:12:29.643066 sshd-session[1733]: pam_unix(sshd:session): session closed for user core Sep 12 22:12:29.648758 systemd[1]: sshd@6-10.0.0.68:22-10.0.0.1:50398.service: Deactivated successfully. Sep 12 22:12:29.652924 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 22:12:29.654721 systemd[1]: session-7.scope: Consumed 6.361s CPU time, 220.6M memory peak. Sep 12 22:12:29.658084 systemd-logind[1501]: Session 7 logged out. Waiting for processes to exit. Sep 12 22:12:29.660274 systemd-logind[1501]: Removed session 7. Sep 12 22:12:31.558222 update_engine[1502]: I20250912 22:12:31.557837 1502 update_attempter.cc:509] Updating boot flags... Sep 12 22:12:37.348511 systemd[1]: Created slice kubepods-besteffort-podf49a2fb9_b710_4c6f_a57c_d254f510ef2d.slice - libcontainer container kubepods-besteffort-podf49a2fb9_b710_4c6f_a57c_d254f510ef2d.slice. 
Sep 12 22:12:37.401326 kubelet[2681]: I0912 22:12:37.401277 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49a2fb9-b710-4c6f-a57c-d254f510ef2d-tigera-ca-bundle\") pod \"calico-typha-5cfc6cf79b-sz9w8\" (UID: \"f49a2fb9-b710-4c6f-a57c-d254f510ef2d\") " pod="calico-system/calico-typha-5cfc6cf79b-sz9w8" Sep 12 22:12:37.401666 kubelet[2681]: I0912 22:12:37.401332 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cldrh\" (UniqueName: \"kubernetes.io/projected/f49a2fb9-b710-4c6f-a57c-d254f510ef2d-kube-api-access-cldrh\") pod \"calico-typha-5cfc6cf79b-sz9w8\" (UID: \"f49a2fb9-b710-4c6f-a57c-d254f510ef2d\") " pod="calico-system/calico-typha-5cfc6cf79b-sz9w8" Sep 12 22:12:37.401666 kubelet[2681]: I0912 22:12:37.401368 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f49a2fb9-b710-4c6f-a57c-d254f510ef2d-typha-certs\") pod \"calico-typha-5cfc6cf79b-sz9w8\" (UID: \"f49a2fb9-b710-4c6f-a57c-d254f510ef2d\") " pod="calico-system/calico-typha-5cfc6cf79b-sz9w8" Sep 12 22:12:37.626747 systemd[1]: Created slice kubepods-besteffort-pod64e5f232_1073_49cd_8d62_2b5772e4d469.slice - libcontainer container kubepods-besteffort-pod64e5f232_1073_49cd_8d62_2b5772e4d469.slice. 
Sep 12 22:12:37.655409 containerd[1529]: time="2025-09-12T22:12:37.655357843Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5cfc6cf79b-sz9w8,Uid:f49a2fb9-b710-4c6f-a57c-d254f510ef2d,Namespace:calico-system,Attempt:0,}" Sep 12 22:12:37.704782 kubelet[2681]: I0912 22:12:37.704714 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/64e5f232-1073-49cd-8d62-2b5772e4d469-xtables-lock\") pod \"calico-node-75m29\" (UID: \"64e5f232-1073-49cd-8d62-2b5772e4d469\") " pod="calico-system/calico-node-75m29" Sep 12 22:12:37.704922 containerd[1529]: time="2025-09-12T22:12:37.704718712Z" level=info msg="connecting to shim c8a36cba6d33d9b9ebc437fa5f4ecb4a96ef10b8ff60cfabc6f54550c79fe337" address="unix:///run/containerd/s/57c69fcdbb8b2e36dfaa79ea1251ae5bf72e8834e0b0d949c321cb09b5c5e2b8" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:12:37.705271 kubelet[2681]: I0912 22:12:37.705242 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5rzz\" (UniqueName: \"kubernetes.io/projected/64e5f232-1073-49cd-8d62-2b5772e4d469-kube-api-access-m5rzz\") pod \"calico-node-75m29\" (UID: \"64e5f232-1073-49cd-8d62-2b5772e4d469\") " pod="calico-system/calico-node-75m29" Sep 12 22:12:37.705321 kubelet[2681]: I0912 22:12:37.705295 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/64e5f232-1073-49cd-8d62-2b5772e4d469-var-lib-calico\") pod \"calico-node-75m29\" (UID: \"64e5f232-1073-49cd-8d62-2b5772e4d469\") " pod="calico-system/calico-node-75m29" Sep 12 22:12:37.705344 kubelet[2681]: I0912 22:12:37.705320 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: 
\"kubernetes.io/host-path/64e5f232-1073-49cd-8d62-2b5772e4d469-cni-log-dir\") pod \"calico-node-75m29\" (UID: \"64e5f232-1073-49cd-8d62-2b5772e4d469\") " pod="calico-system/calico-node-75m29" Sep 12 22:12:37.705344 kubelet[2681]: I0912 22:12:37.705337 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64e5f232-1073-49cd-8d62-2b5772e4d469-tigera-ca-bundle\") pod \"calico-node-75m29\" (UID: \"64e5f232-1073-49cd-8d62-2b5772e4d469\") " pod="calico-system/calico-node-75m29" Sep 12 22:12:37.705392 kubelet[2681]: I0912 22:12:37.705373 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/64e5f232-1073-49cd-8d62-2b5772e4d469-cni-net-dir\") pod \"calico-node-75m29\" (UID: \"64e5f232-1073-49cd-8d62-2b5772e4d469\") " pod="calico-system/calico-node-75m29" Sep 12 22:12:37.705412 kubelet[2681]: I0912 22:12:37.705401 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/64e5f232-1073-49cd-8d62-2b5772e4d469-lib-modules\") pod \"calico-node-75m29\" (UID: \"64e5f232-1073-49cd-8d62-2b5772e4d469\") " pod="calico-system/calico-node-75m29" Sep 12 22:12:37.705436 kubelet[2681]: I0912 22:12:37.705418 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/64e5f232-1073-49cd-8d62-2b5772e4d469-policysync\") pod \"calico-node-75m29\" (UID: \"64e5f232-1073-49cd-8d62-2b5772e4d469\") " pod="calico-system/calico-node-75m29" Sep 12 22:12:37.705457 kubelet[2681]: I0912 22:12:37.705440 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/64e5f232-1073-49cd-8d62-2b5772e4d469-var-run-calico\") 
pod \"calico-node-75m29\" (UID: \"64e5f232-1073-49cd-8d62-2b5772e4d469\") " pod="calico-system/calico-node-75m29"
Sep 12 22:12:37.705479 kubelet[2681]: I0912 22:12:37.705460 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/64e5f232-1073-49cd-8d62-2b5772e4d469-cni-bin-dir\") pod \"calico-node-75m29\" (UID: \"64e5f232-1073-49cd-8d62-2b5772e4d469\") " pod="calico-system/calico-node-75m29"
Sep 12 22:12:37.705506 kubelet[2681]: I0912 22:12:37.705477 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/64e5f232-1073-49cd-8d62-2b5772e4d469-flexvol-driver-host\") pod \"calico-node-75m29\" (UID: \"64e5f232-1073-49cd-8d62-2b5772e4d469\") " pod="calico-system/calico-node-75m29"
Sep 12 22:12:37.705506 kubelet[2681]: I0912 22:12:37.705494 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/64e5f232-1073-49cd-8d62-2b5772e4d469-node-certs\") pod \"calico-node-75m29\" (UID: \"64e5f232-1073-49cd-8d62-2b5772e4d469\") " pod="calico-system/calico-node-75m29"
Sep 12 22:12:37.766409 systemd[1]: Started cri-containerd-c8a36cba6d33d9b9ebc437fa5f4ecb4a96ef10b8ff60cfabc6f54550c79fe337.scope - libcontainer container c8a36cba6d33d9b9ebc437fa5f4ecb4a96ef10b8ff60cfabc6f54550c79fe337.
Sep 12 22:12:37.831881 kubelet[2681]: E0912 22:12:37.831837 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.831881 kubelet[2681]: W0912 22:12:37.831868 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.835791 kubelet[2681]: E0912 22:12:37.835743 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.841133 kubelet[2681]: E0912 22:12:37.841020 2681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpk44" podUID="f1eff131-ed0a-4062-a13b-42b0f931fef5"
Sep 12 22:12:37.887223 containerd[1529]: time="2025-09-12T22:12:37.886584244Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5cfc6cf79b-sz9w8,Uid:f49a2fb9-b710-4c6f-a57c-d254f510ef2d,Namespace:calico-system,Attempt:0,} returns sandbox id \"c8a36cba6d33d9b9ebc437fa5f4ecb4a96ef10b8ff60cfabc6f54550c79fe337\""
Sep 12 22:12:37.888675 containerd[1529]: time="2025-09-12T22:12:37.888613065Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 12 22:12:37.894024 kubelet[2681]: E0912 22:12:37.893810 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.894024 kubelet[2681]: W0912 22:12:37.893841 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.894024 kubelet[2681]: E0912 22:12:37.893864 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.894835 kubelet[2681]: E0912 22:12:37.894173 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.902211 kubelet[2681]: W0912 22:12:37.894214 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.902211 kubelet[2681]: E0912 22:12:37.899784 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.902211 kubelet[2681]: E0912 22:12:37.900059 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.902211 kubelet[2681]: W0912 22:12:37.900069 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.902211 kubelet[2681]: E0912 22:12:37.900080 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.902631 kubelet[2681]: E0912 22:12:37.902524 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.902631 kubelet[2681]: W0912 22:12:37.902542 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.902631 kubelet[2681]: E0912 22:12:37.902561 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.903012 kubelet[2681]: E0912 22:12:37.902945 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.903012 kubelet[2681]: W0912 22:12:37.902960 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.903012 kubelet[2681]: E0912 22:12:37.902971 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.903327 kubelet[2681]: E0912 22:12:37.903312 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.903475 kubelet[2681]: W0912 22:12:37.903405 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.903475 kubelet[2681]: E0912 22:12:37.903423 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.903741 kubelet[2681]: E0912 22:12:37.903728 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.903808 kubelet[2681]: W0912 22:12:37.903796 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.903859 kubelet[2681]: E0912 22:12:37.903849 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.904525 kubelet[2681]: E0912 22:12:37.904201 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.904525 kubelet[2681]: W0912 22:12:37.904425 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.904525 kubelet[2681]: E0912 22:12:37.904446 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.906007 kubelet[2681]: E0912 22:12:37.905987 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.906103 kubelet[2681]: W0912 22:12:37.906089 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.906153 kubelet[2681]: E0912 22:12:37.906143 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.907037 kubelet[2681]: E0912 22:12:37.906872 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.907037 kubelet[2681]: W0912 22:12:37.906891 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.907037 kubelet[2681]: E0912 22:12:37.906903 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.907937 kubelet[2681]: E0912 22:12:37.907610 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.907937 kubelet[2681]: W0912 22:12:37.907626 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.907937 kubelet[2681]: E0912 22:12:37.907638 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.908550 kubelet[2681]: E0912 22:12:37.908495 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.908550 kubelet[2681]: W0912 22:12:37.908512 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.908550 kubelet[2681]: E0912 22:12:37.908524 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.909377 kubelet[2681]: E0912 22:12:37.909347 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.909377 kubelet[2681]: W0912 22:12:37.909362 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.909582 kubelet[2681]: E0912 22:12:37.909526 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.910624 kubelet[2681]: E0912 22:12:37.910553 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.910624 kubelet[2681]: W0912 22:12:37.910570 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.910624 kubelet[2681]: E0912 22:12:37.910582 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.911722 kubelet[2681]: E0912 22:12:37.911651 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.911722 kubelet[2681]: W0912 22:12:37.911667 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.911722 kubelet[2681]: E0912 22:12:37.911678 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.911999 kubelet[2681]: E0912 22:12:37.911986 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.912197 kubelet[2681]: W0912 22:12:37.912058 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.912197 kubelet[2681]: E0912 22:12:37.912075 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.913128 kubelet[2681]: E0912 22:12:37.913104 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.915294 kubelet[2681]: W0912 22:12:37.915270 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.915385 kubelet[2681]: E0912 22:12:37.915373 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.915849 kubelet[2681]: E0912 22:12:37.915833 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.915943 kubelet[2681]: W0912 22:12:37.915930 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.915995 kubelet[2681]: E0912 22:12:37.915984 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.916420 kubelet[2681]: E0912 22:12:37.916401 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.916547 kubelet[2681]: W0912 22:12:37.916485 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.916547 kubelet[2681]: E0912 22:12:37.916505 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.917357 kubelet[2681]: E0912 22:12:37.917342 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.917649 kubelet[2681]: W0912 22:12:37.917423 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.917649 kubelet[2681]: E0912 22:12:37.917440 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.917790 kubelet[2681]: E0912 22:12:37.917779 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.917845 kubelet[2681]: W0912 22:12:37.917835 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.917902 kubelet[2681]: E0912 22:12:37.917891 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.917988 kubelet[2681]: I0912 22:12:37.917969 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1eff131-ed0a-4062-a13b-42b0f931fef5-kubelet-dir\") pod \"csi-node-driver-gpk44\" (UID: \"f1eff131-ed0a-4062-a13b-42b0f931fef5\") " pod="calico-system/csi-node-driver-gpk44"
Sep 12 22:12:37.918379 kubelet[2681]: E0912 22:12:37.918359 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.918379 kubelet[2681]: W0912 22:12:37.918376 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.918474 kubelet[2681]: E0912 22:12:37.918389 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.918547 kubelet[2681]: E0912 22:12:37.918534 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.918547 kubelet[2681]: W0912 22:12:37.918546 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.918620 kubelet[2681]: E0912 22:12:37.918554 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.918727 kubelet[2681]: E0912 22:12:37.918716 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.918755 kubelet[2681]: W0912 22:12:37.918727 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.918755 kubelet[2681]: E0912 22:12:37.918744 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.918804 kubelet[2681]: I0912 22:12:37.918766 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdl2f\" (UniqueName: \"kubernetes.io/projected/f1eff131-ed0a-4062-a13b-42b0f931fef5-kube-api-access-sdl2f\") pod \"csi-node-driver-gpk44\" (UID: \"f1eff131-ed0a-4062-a13b-42b0f931fef5\") " pod="calico-system/csi-node-driver-gpk44"
Sep 12 22:12:37.919271 kubelet[2681]: E0912 22:12:37.918920 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.919271 kubelet[2681]: W0912 22:12:37.918929 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.919271 kubelet[2681]: E0912 22:12:37.918936 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.919271 kubelet[2681]: I0912 22:12:37.918949 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f1eff131-ed0a-4062-a13b-42b0f931fef5-registration-dir\") pod \"csi-node-driver-gpk44\" (UID: \"f1eff131-ed0a-4062-a13b-42b0f931fef5\") " pod="calico-system/csi-node-driver-gpk44"
Sep 12 22:12:37.919271 kubelet[2681]: E0912 22:12:37.919076 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.919271 kubelet[2681]: W0912 22:12:37.919085 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.919271 kubelet[2681]: E0912 22:12:37.919093 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.919271 kubelet[2681]: I0912 22:12:37.919105 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/f1eff131-ed0a-4062-a13b-42b0f931fef5-varrun\") pod \"csi-node-driver-gpk44\" (UID: \"f1eff131-ed0a-4062-a13b-42b0f931fef5\") " pod="calico-system/csi-node-driver-gpk44"
Sep 12 22:12:37.919609 kubelet[2681]: E0912 22:12:37.919375 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.919609 kubelet[2681]: W0912 22:12:37.919386 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.919609 kubelet[2681]: E0912 22:12:37.919455 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.919609 kubelet[2681]: I0912 22:12:37.919478 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f1eff131-ed0a-4062-a13b-42b0f931fef5-socket-dir\") pod \"csi-node-driver-gpk44\" (UID: \"f1eff131-ed0a-4062-a13b-42b0f931fef5\") " pod="calico-system/csi-node-driver-gpk44"
Sep 12 22:12:37.919716 kubelet[2681]: E0912 22:12:37.919699 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.919716 kubelet[2681]: W0912 22:12:37.919713 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.919801 kubelet[2681]: E0912 22:12:37.919723 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.919884 kubelet[2681]: E0912 22:12:37.919872 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.919884 kubelet[2681]: W0912 22:12:37.919882 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.919967 kubelet[2681]: E0912 22:12:37.919890 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.920239 kubelet[2681]: E0912 22:12:37.920061 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.920239 kubelet[2681]: W0912 22:12:37.920221 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.920239 kubelet[2681]: E0912 22:12:37.920232 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.922344 kubelet[2681]: E0912 22:12:37.921042 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.922344 kubelet[2681]: W0912 22:12:37.921053 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.922344 kubelet[2681]: E0912 22:12:37.921082 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.922344 kubelet[2681]: E0912 22:12:37.921411 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.922344 kubelet[2681]: W0912 22:12:37.921422 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.922344 kubelet[2681]: E0912 22:12:37.921595 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.922720 kubelet[2681]: E0912 22:12:37.922698 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.922720 kubelet[2681]: W0912 22:12:37.922718 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.922829 kubelet[2681]: E0912 22:12:37.922735 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.923360 kubelet[2681]: E0912 22:12:37.923339 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.923578 kubelet[2681]: W0912 22:12:37.923355 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.923621 kubelet[2681]: E0912 22:12:37.923578 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.924105 kubelet[2681]: E0912 22:12:37.924087 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:37.924177 kubelet[2681]: W0912 22:12:37.924103 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:37.924249 kubelet[2681]: E0912 22:12:37.924224 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:37.931889 containerd[1529]: time="2025-09-12T22:12:37.931821870Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-75m29,Uid:64e5f232-1073-49cd-8d62-2b5772e4d469,Namespace:calico-system,Attempt:0,}"
Sep 12 22:12:37.959469 containerd[1529]: time="2025-09-12T22:12:37.959358170Z" level=info msg="connecting to shim 28320f41bf00bac5409ea7dac77cd2ac3632c6a166a7585d9530141ef33ea993" address="unix:///run/containerd/s/dcabd6f0b5b5c2bfde47fd87de7d4e4fb43edeb135ee86c2fe04add55b7f235a" namespace=k8s.io protocol=ttrpc version=3
Sep 12 22:12:37.999398 systemd[1]: Started cri-containerd-28320f41bf00bac5409ea7dac77cd2ac3632c6a166a7585d9530141ef33ea993.scope - libcontainer container 28320f41bf00bac5409ea7dac77cd2ac3632c6a166a7585d9530141ef33ea993.
Sep 12 22:12:38.020641 kubelet[2681]: E0912 22:12:38.020552 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.020641 kubelet[2681]: W0912 22:12:38.020577 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.020641 kubelet[2681]: E0912 22:12:38.020599 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:38.021088 kubelet[2681]: E0912 22:12:38.021048 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.021088 kubelet[2681]: W0912 22:12:38.021061 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.021088 kubelet[2681]: E0912 22:12:38.021073 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:38.021498 kubelet[2681]: E0912 22:12:38.021444 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.021498 kubelet[2681]: W0912 22:12:38.021458 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.021498 kubelet[2681]: E0912 22:12:38.021468 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:38.021706 kubelet[2681]: E0912 22:12:38.021687 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.021706 kubelet[2681]: W0912 22:12:38.021706 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.021786 kubelet[2681]: E0912 22:12:38.021720 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:38.021875 kubelet[2681]: E0912 22:12:38.021865 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.021875 kubelet[2681]: W0912 22:12:38.021874 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.021948 kubelet[2681]: E0912 22:12:38.021884 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:38.022039 kubelet[2681]: E0912 22:12:38.022029 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.022039 kubelet[2681]: W0912 22:12:38.022038 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.022086 kubelet[2681]: E0912 22:12:38.022045 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:38.022338 kubelet[2681]: E0912 22:12:38.022325 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.022338 kubelet[2681]: W0912 22:12:38.022337 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.022403 kubelet[2681]: E0912 22:12:38.022345 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:38.022536 kubelet[2681]: E0912 22:12:38.022525 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.022536 kubelet[2681]: W0912 22:12:38.022535 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.022615 kubelet[2681]: E0912 22:12:38.022543 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:38.022709 kubelet[2681]: E0912 22:12:38.022698 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.022709 kubelet[2681]: W0912 22:12:38.022708 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.022881 kubelet[2681]: E0912 22:12:38.022718 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:38.022906 kubelet[2681]: E0912 22:12:38.022885 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.022906 kubelet[2681]: W0912 22:12:38.022893 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.022906 kubelet[2681]: E0912 22:12:38.022902 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:38.023294 kubelet[2681]: E0912 22:12:38.023031 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.023294 kubelet[2681]: W0912 22:12:38.023043 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.023294 kubelet[2681]: E0912 22:12:38.023050 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:38.023294 kubelet[2681]: E0912 22:12:38.023238 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.023294 kubelet[2681]: W0912 22:12:38.023247 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.023294 kubelet[2681]: E0912 22:12:38.023257 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:38.023954 kubelet[2681]: E0912 22:12:38.023935 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.023954 kubelet[2681]: W0912 22:12:38.023952 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.024020 kubelet[2681]: E0912 22:12:38.023964 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 12 22:12:38.024227 kubelet[2681]: E0912 22:12:38.024210 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:38.024227 kubelet[2681]: W0912 22:12:38.024224 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:38.024301 kubelet[2681]: E0912 22:12:38.024234 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:38.024597 kubelet[2681]: E0912 22:12:38.024579 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:38.024597 kubelet[2681]: W0912 22:12:38.024594 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:38.024693 kubelet[2681]: E0912 22:12:38.024610 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:38.024799 kubelet[2681]: E0912 22:12:38.024782 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:38.024799 kubelet[2681]: W0912 22:12:38.024793 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:38.024929 kubelet[2681]: E0912 22:12:38.024803 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:38.025228 kubelet[2681]: E0912 22:12:38.025206 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:38.025228 kubelet[2681]: W0912 22:12:38.025224 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:38.025393 kubelet[2681]: E0912 22:12:38.025237 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:38.025540 kubelet[2681]: E0912 22:12:38.025526 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:38.025577 kubelet[2681]: W0912 22:12:38.025552 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:38.025577 kubelet[2681]: E0912 22:12:38.025570 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:38.025824 kubelet[2681]: E0912 22:12:38.025807 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:38.025824 kubelet[2681]: W0912 22:12:38.025820 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:38.025882 kubelet[2681]: E0912 22:12:38.025829 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:38.026047 kubelet[2681]: E0912 22:12:38.026034 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:38.026047 kubelet[2681]: W0912 22:12:38.026046 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:38.026047 kubelet[2681]: E0912 22:12:38.026057 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:38.026378 kubelet[2681]: E0912 22:12:38.026364 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:38.026378 kubelet[2681]: W0912 22:12:38.026377 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:38.026470 kubelet[2681]: E0912 22:12:38.026387 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:38.026696 kubelet[2681]: E0912 22:12:38.026679 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:38.026696 kubelet[2681]: W0912 22:12:38.026693 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:38.026749 kubelet[2681]: E0912 22:12:38.026703 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:38.026952 kubelet[2681]: E0912 22:12:38.026938 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:38.026952 kubelet[2681]: W0912 22:12:38.026951 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:38.027019 kubelet[2681]: E0912 22:12:38.026962 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:38.027318 kubelet[2681]: E0912 22:12:38.027300 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:38.027380 kubelet[2681]: W0912 22:12:38.027333 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:38.027380 kubelet[2681]: E0912 22:12:38.027349 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:38.027934 kubelet[2681]: E0912 22:12:38.027912 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:38.027934 kubelet[2681]: W0912 22:12:38.027930 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:38.028018 kubelet[2681]: E0912 22:12:38.027944 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 12 22:12:38.029570 containerd[1529]: time="2025-09-12T22:12:38.029508780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-75m29,Uid:64e5f232-1073-49cd-8d62-2b5772e4d469,Namespace:calico-system,Attempt:0,} returns sandbox id \"28320f41bf00bac5409ea7dac77cd2ac3632c6a166a7585d9530141ef33ea993\""
Sep 12 22:12:38.042260 kubelet[2681]: E0912 22:12:38.041933 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:38.042260 kubelet[2681]: W0912 22:12:38.042107 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:38.042260 kubelet[2681]: E0912 22:12:38.042131 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:39.491242 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1384743154.mount: Deactivated successfully.
Sep 12 22:12:39.885205 kubelet[2681]: E0912 22:12:39.884880 2681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpk44" podUID="f1eff131-ed0a-4062-a13b-42b0f931fef5"
Sep 12 22:12:40.034413 containerd[1529]: time="2025-09-12T22:12:40.033948839Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:40.035335 containerd[1529]: time="2025-09-12T22:12:40.035294274Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775"
Sep 12 22:12:40.035475 containerd[1529]: time="2025-09-12T22:12:40.035419797Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:40.037354 containerd[1529]: time="2025-09-12T22:12:40.037305486Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:40.037912 containerd[1529]: time="2025-09-12T22:12:40.037797339Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.149149394s"
Sep 12 22:12:40.037912 containerd[1529]: time="2025-09-12T22:12:40.037827820Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\""
Sep 12 22:12:40.039089 containerd[1529]: time="2025-09-12T22:12:40.039063052Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 12 22:12:40.052798 containerd[1529]: time="2025-09-12T22:12:40.052739007Z" level=info msg="CreateContainer within sandbox \"c8a36cba6d33d9b9ebc437fa5f4ecb4a96ef10b8ff60cfabc6f54550c79fe337\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 12 22:12:40.058563 containerd[1529]: time="2025-09-12T22:12:40.058490757Z" level=info msg="Container 3189192a6407a2564a654180a1af93546a6e7783e7a8ca0beccd182f1703ca7b: CDI devices from CRI Config.CDIDevices: []"
Sep 12 22:12:40.066920 containerd[1529]: time="2025-09-12T22:12:40.066872615Z" level=info msg="CreateContainer within sandbox \"c8a36cba6d33d9b9ebc437fa5f4ecb4a96ef10b8ff60cfabc6f54550c79fe337\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"3189192a6407a2564a654180a1af93546a6e7783e7a8ca0beccd182f1703ca7b\""
Sep 12 22:12:40.067797 containerd[1529]: time="2025-09-12T22:12:40.067500871Z" level=info msg="StartContainer for \"3189192a6407a2564a654180a1af93546a6e7783e7a8ca0beccd182f1703ca7b\""
Sep 12 22:12:40.069297 containerd[1529]: time="2025-09-12T22:12:40.069255637Z" level=info msg="connecting to shim 3189192a6407a2564a654180a1af93546a6e7783e7a8ca0beccd182f1703ca7b" address="unix:///run/containerd/s/57c69fcdbb8b2e36dfaa79ea1251ae5bf72e8834e0b0d949c321cb09b5c5e2b8" protocol=ttrpc version=3
Sep 12 22:12:40.101360 systemd[1]: Started cri-containerd-3189192a6407a2564a654180a1af93546a6e7783e7a8ca0beccd182f1703ca7b.scope - libcontainer container 3189192a6407a2564a654180a1af93546a6e7783e7a8ca0beccd182f1703ca7b.
Sep 12 22:12:40.140976 containerd[1529]: time="2025-09-12T22:12:40.140867020Z" level=info msg="StartContainer for \"3189192a6407a2564a654180a1af93546a6e7783e7a8ca0beccd182f1703ca7b\" returns successfully"
Sep 12 22:12:41.041035 kubelet[2681]: E0912 22:12:41.040999 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:41.041035 kubelet[2681]: W0912 22:12:41.041024 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:41.041035 kubelet[2681]: E0912 22:12:41.041046 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 22:12:41.041644 kubelet[2681]: E0912 22:12:41.041373 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 22:12:41.041644 kubelet[2681]: W0912 22:12:41.041383 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 22:12:41.041644 kubelet[2681]: E0912 22:12:41.041428 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 12 22:12:41.041644 kubelet[2681]: E0912 22:12:41.041592 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.041644 kubelet[2681]: W0912 22:12:41.041602 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.041644 kubelet[2681]: E0912 22:12:41.041611 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.041993 kubelet[2681]: E0912 22:12:41.041973 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.041993 kubelet[2681]: W0912 22:12:41.041987 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.042065 kubelet[2681]: E0912 22:12:41.041997 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.042192 kubelet[2681]: E0912 22:12:41.042163 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.042234 kubelet[2681]: W0912 22:12:41.042176 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.042234 kubelet[2681]: E0912 22:12:41.042205 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.042465 kubelet[2681]: E0912 22:12:41.042445 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.042465 kubelet[2681]: W0912 22:12:41.042459 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.042465 kubelet[2681]: E0912 22:12:41.042469 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.042648 kubelet[2681]: E0912 22:12:41.042636 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.042648 kubelet[2681]: W0912 22:12:41.042647 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.042707 kubelet[2681]: E0912 22:12:41.042655 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.042805 kubelet[2681]: E0912 22:12:41.042791 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.042805 kubelet[2681]: W0912 22:12:41.042802 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.042869 kubelet[2681]: E0912 22:12:41.042809 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.042967 kubelet[2681]: E0912 22:12:41.042956 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.042967 kubelet[2681]: W0912 22:12:41.042966 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.043018 kubelet[2681]: E0912 22:12:41.042974 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.043125 kubelet[2681]: E0912 22:12:41.043107 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.043125 kubelet[2681]: W0912 22:12:41.043117 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.043125 kubelet[2681]: E0912 22:12:41.043124 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.043314 kubelet[2681]: E0912 22:12:41.043301 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.043314 kubelet[2681]: W0912 22:12:41.043312 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.043367 kubelet[2681]: E0912 22:12:41.043320 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.043656 kubelet[2681]: E0912 22:12:41.043642 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.043694 kubelet[2681]: W0912 22:12:41.043656 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.043694 kubelet[2681]: E0912 22:12:41.043666 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.044942 kubelet[2681]: E0912 22:12:41.044568 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.044992 kubelet[2681]: W0912 22:12:41.044944 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.044992 kubelet[2681]: E0912 22:12:41.044960 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.045210 kubelet[2681]: E0912 22:12:41.045169 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.045210 kubelet[2681]: W0912 22:12:41.045204 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.045299 kubelet[2681]: E0912 22:12:41.045217 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.046316 kubelet[2681]: E0912 22:12:41.046270 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.046349 kubelet[2681]: W0912 22:12:41.046317 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.046349 kubelet[2681]: E0912 22:12:41.046331 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.046743 kubelet[2681]: E0912 22:12:41.046694 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.046779 kubelet[2681]: W0912 22:12:41.046745 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.046779 kubelet[2681]: E0912 22:12:41.046759 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.046991 kubelet[2681]: E0912 22:12:41.046977 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.047023 kubelet[2681]: W0912 22:12:41.046990 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.047023 kubelet[2681]: E0912 22:12:41.046999 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.047441 kubelet[2681]: E0912 22:12:41.047423 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.047441 kubelet[2681]: W0912 22:12:41.047439 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.047502 kubelet[2681]: E0912 22:12:41.047449 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.047639 kubelet[2681]: E0912 22:12:41.047627 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.047639 kubelet[2681]: W0912 22:12:41.047637 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.047685 kubelet[2681]: E0912 22:12:41.047645 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.047784 kubelet[2681]: E0912 22:12:41.047772 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.047784 kubelet[2681]: W0912 22:12:41.047782 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.047836 kubelet[2681]: E0912 22:12:41.047790 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.047979 kubelet[2681]: E0912 22:12:41.047967 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.047979 kubelet[2681]: W0912 22:12:41.047978 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.048023 kubelet[2681]: E0912 22:12:41.047986 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.048328 kubelet[2681]: E0912 22:12:41.048314 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.048358 kubelet[2681]: W0912 22:12:41.048329 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.048358 kubelet[2681]: E0912 22:12:41.048339 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.048518 kubelet[2681]: E0912 22:12:41.048506 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.048546 kubelet[2681]: W0912 22:12:41.048517 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.048546 kubelet[2681]: E0912 22:12:41.048525 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.048684 kubelet[2681]: E0912 22:12:41.048673 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.048708 kubelet[2681]: W0912 22:12:41.048684 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.048708 kubelet[2681]: E0912 22:12:41.048691 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.048821 kubelet[2681]: E0912 22:12:41.048811 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.048841 kubelet[2681]: W0912 22:12:41.048820 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.048841 kubelet[2681]: E0912 22:12:41.048827 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.048981 kubelet[2681]: E0912 22:12:41.048971 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.049006 kubelet[2681]: W0912 22:12:41.048980 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.049006 kubelet[2681]: E0912 22:12:41.048988 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.049131 kubelet[2681]: E0912 22:12:41.049118 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.049131 kubelet[2681]: W0912 22:12:41.049130 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.049187 kubelet[2681]: E0912 22:12:41.049137 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.049342 kubelet[2681]: E0912 22:12:41.049329 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.049342 kubelet[2681]: W0912 22:12:41.049341 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.049390 kubelet[2681]: E0912 22:12:41.049349 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.049612 kubelet[2681]: E0912 22:12:41.049598 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.049634 kubelet[2681]: W0912 22:12:41.049612 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.049634 kubelet[2681]: E0912 22:12:41.049622 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.049980 kubelet[2681]: E0912 22:12:41.049961 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.049980 kubelet[2681]: W0912 22:12:41.049978 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.050038 kubelet[2681]: E0912 22:12:41.049988 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.050378 kubelet[2681]: E0912 22:12:41.050361 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.050378 kubelet[2681]: W0912 22:12:41.050375 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.050426 kubelet[2681]: E0912 22:12:41.050386 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.050737 kubelet[2681]: E0912 22:12:41.050721 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.050771 kubelet[2681]: W0912 22:12:41.050735 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.050771 kubelet[2681]: E0912 22:12:41.050746 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:12:41.050938 kubelet[2681]: E0912 22:12:41.050924 2681 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:12:41.050938 kubelet[2681]: W0912 22:12:41.050937 2681 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:12:41.050989 kubelet[2681]: E0912 22:12:41.050945 2681 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:12:41.125555 containerd[1529]: time="2025-09-12T22:12:41.125482699Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:41.126310 containerd[1529]: time="2025-09-12T22:12:41.126284639Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 12 22:12:41.127362 containerd[1529]: time="2025-09-12T22:12:41.127337025Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:41.129351 containerd[1529]: time="2025-09-12T22:12:41.129316514Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:41.130416 containerd[1529]: time="2025-09-12T22:12:41.130388301Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.091291369s" Sep 12 22:12:41.130484 containerd[1529]: time="2025-09-12T22:12:41.130421142Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 12 22:12:41.138216 containerd[1529]: time="2025-09-12T22:12:41.137668322Z" level=info msg="CreateContainer within sandbox \"28320f41bf00bac5409ea7dac77cd2ac3632c6a166a7585d9530141ef33ea993\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 22:12:41.159211 containerd[1529]: time="2025-09-12T22:12:41.159150738Z" level=info msg="Container 52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:41.160152 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3203774821.mount: Deactivated successfully. 
Sep 12 22:12:41.171510 containerd[1529]: time="2025-09-12T22:12:41.171465965Z" level=info msg="CreateContainer within sandbox \"28320f41bf00bac5409ea7dac77cd2ac3632c6a166a7585d9530141ef33ea993\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a\"" Sep 12 22:12:41.172957 containerd[1529]: time="2025-09-12T22:12:41.172891520Z" level=info msg="StartContainer for \"52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a\"" Sep 12 22:12:41.174700 containerd[1529]: time="2025-09-12T22:12:41.174585762Z" level=info msg="connecting to shim 52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a" address="unix:///run/containerd/s/dcabd6f0b5b5c2bfde47fd87de7d4e4fb43edeb135ee86c2fe04add55b7f235a" protocol=ttrpc version=3 Sep 12 22:12:41.211417 systemd[1]: Started cri-containerd-52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a.scope - libcontainer container 52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a. Sep 12 22:12:41.252573 containerd[1529]: time="2025-09-12T22:12:41.252533425Z" level=info msg="StartContainer for \"52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a\" returns successfully" Sep 12 22:12:41.267376 systemd[1]: cri-containerd-52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a.scope: Deactivated successfully. Sep 12 22:12:41.267685 systemd[1]: cri-containerd-52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a.scope: Consumed 30ms CPU time, 6.4M memory peak, 4.5M written to disk. 
Sep 12 22:12:41.305306 containerd[1529]: time="2025-09-12T22:12:41.304976452Z" level=info msg="received exit event container_id:\"52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a\" id:\"52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a\" pid:3384 exited_at:{seconds:1757715161 nanos:296640604}" Sep 12 22:12:41.306324 containerd[1529]: time="2025-09-12T22:12:41.306289284Z" level=info msg="TaskExit event in podsandbox handler container_id:\"52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a\" id:\"52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a\" pid:3384 exited_at:{seconds:1757715161 nanos:296640604}" Sep 12 22:12:41.334502 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-52ac30e787b2601f62f1fbffae92d7e12166796d0f4a577f6a68cf35e3a1de0a-rootfs.mount: Deactivated successfully. Sep 12 22:12:41.885511 kubelet[2681]: E0912 22:12:41.885427 2681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpk44" podUID="f1eff131-ed0a-4062-a13b-42b0f931fef5" Sep 12 22:12:41.966919 kubelet[2681]: I0912 22:12:41.966876 2681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:12:41.968908 containerd[1529]: time="2025-09-12T22:12:41.968870036Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 22:12:41.990761 kubelet[2681]: I0912 22:12:41.990695 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5cfc6cf79b-sz9w8" podStartSLOduration=2.839883021 podStartE2EDuration="4.990680099s" podCreationTimestamp="2025-09-12 22:12:37 +0000 UTC" firstStartedPulling="2025-09-12 22:12:37.888157291 +0000 UTC m=+22.090151477" lastFinishedPulling="2025-09-12 22:12:40.038954369 +0000 UTC m=+24.240948555" 
observedRunningTime="2025-09-12 22:12:40.986433377 +0000 UTC m=+25.188427563" watchObservedRunningTime="2025-09-12 22:12:41.990680099 +0000 UTC m=+26.192674285" Sep 12 22:12:43.885485 kubelet[2681]: E0912 22:12:43.885428 2681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpk44" podUID="f1eff131-ed0a-4062-a13b-42b0f931fef5" Sep 12 22:12:45.646740 containerd[1529]: time="2025-09-12T22:12:45.646390240Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:45.647164 containerd[1529]: time="2025-09-12T22:12:45.646766648Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 12 22:12:45.647414 containerd[1529]: time="2025-09-12T22:12:45.647377821Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:45.660231 containerd[1529]: time="2025-09-12T22:12:45.660150812Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:45.660806 containerd[1529]: time="2025-09-12T22:12:45.660774745Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.691868509s" Sep 12 22:12:45.660853 containerd[1529]: 
time="2025-09-12T22:12:45.660807586Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 12 22:12:45.670213 containerd[1529]: time="2025-09-12T22:12:45.669982900Z" level=info msg="CreateContainer within sandbox \"28320f41bf00bac5409ea7dac77cd2ac3632c6a166a7585d9530141ef33ea993\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 22:12:45.677439 containerd[1529]: time="2025-09-12T22:12:45.677392297Z" level=info msg="Container f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:45.687728 containerd[1529]: time="2025-09-12T22:12:45.687684955Z" level=info msg="CreateContainer within sandbox \"28320f41bf00bac5409ea7dac77cd2ac3632c6a166a7585d9530141ef33ea993\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e\"" Sep 12 22:12:45.688430 containerd[1529]: time="2025-09-12T22:12:45.688400210Z" level=info msg="StartContainer for \"f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e\"" Sep 12 22:12:45.689823 containerd[1529]: time="2025-09-12T22:12:45.689796240Z" level=info msg="connecting to shim f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e" address="unix:///run/containerd/s/dcabd6f0b5b5c2bfde47fd87de7d4e4fb43edeb135ee86c2fe04add55b7f235a" protocol=ttrpc version=3 Sep 12 22:12:45.714410 systemd[1]: Started cri-containerd-f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e.scope - libcontainer container f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e. 
Sep 12 22:12:45.765830 containerd[1529]: time="2025-09-12T22:12:45.765760169Z" level=info msg="StartContainer for \"f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e\" returns successfully" Sep 12 22:12:45.885042 kubelet[2681]: E0912 22:12:45.885004 2681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpk44" podUID="f1eff131-ed0a-4062-a13b-42b0f931fef5" Sep 12 22:12:46.311603 systemd[1]: cri-containerd-f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e.scope: Deactivated successfully. Sep 12 22:12:46.311880 systemd[1]: cri-containerd-f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e.scope: Consumed 467ms CPU time, 177.4M memory peak, 3M read from disk, 165.8M written to disk. Sep 12 22:12:46.313281 containerd[1529]: time="2025-09-12T22:12:46.313242159Z" level=info msg="received exit event container_id:\"f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e\" id:\"f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e\" pid:3444 exited_at:{seconds:1757715166 nanos:312946673}" Sep 12 22:12:46.313496 containerd[1529]: time="2025-09-12T22:12:46.313250439Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e\" id:\"f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e\" pid:3444 exited_at:{seconds:1757715166 nanos:312946673}" Sep 12 22:12:46.340762 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f2025a20e3da383224f5a1bbf11f128640f5e8b39af4faa096eced78b85d996e-rootfs.mount: Deactivated successfully. 
Sep 12 22:12:46.398734 kubelet[2681]: I0912 22:12:46.398704 2681 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 22:12:46.448221 systemd[1]: Created slice kubepods-besteffort-pod730c4fd2_47b1_4635_a97a_a8a4c9177989.slice - libcontainer container kubepods-besteffort-pod730c4fd2_47b1_4635_a97a_a8a4c9177989.slice. Sep 12 22:12:46.458224 systemd[1]: Created slice kubepods-besteffort-pod952d4533_ce65_4672_99d2_3f86186944d5.slice - libcontainer container kubepods-besteffort-pod952d4533_ce65_4672_99d2_3f86186944d5.slice. Sep 12 22:12:46.464927 systemd[1]: Created slice kubepods-burstable-podcd03710e_4ae7_41a3_ab06_c94a974ca060.slice - libcontainer container kubepods-burstable-podcd03710e_4ae7_41a3_ab06_c94a974ca060.slice. Sep 12 22:12:46.470475 systemd[1]: Created slice kubepods-besteffort-pod99a1e64b_8e24_4447_9a05_0d3e0d17c8ef.slice - libcontainer container kubepods-besteffort-pod99a1e64b_8e24_4447_9a05_0d3e0d17c8ef.slice. Sep 12 22:12:46.477655 systemd[1]: Created slice kubepods-burstable-pod59fcba95_a65f_4224_a6f7_a1c834aaad44.slice - libcontainer container kubepods-burstable-pod59fcba95_a65f_4224_a6f7_a1c834aaad44.slice. Sep 12 22:12:46.483316 systemd[1]: Created slice kubepods-besteffort-pod8b901d26_47da_4593_92fe_7d9e945b1e0d.slice - libcontainer container kubepods-besteffort-pod8b901d26_47da_4593_92fe_7d9e945b1e0d.slice. 
Sep 12 22:12:46.487469 kubelet[2681]: I0912 22:12:46.487373 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/952d4533-ce65-4672-99d2-3f86186944d5-whisker-ca-bundle\") pod \"whisker-5789559cdc-cc7gm\" (UID: \"952d4533-ce65-4672-99d2-3f86186944d5\") " pod="calico-system/whisker-5789559cdc-cc7gm" Sep 12 22:12:46.487469 kubelet[2681]: I0912 22:12:46.487417 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b901d26-47da-4593-92fe-7d9e945b1e0d-config\") pod \"goldmane-54d579b49d-7n7zg\" (UID: \"8b901d26-47da-4593-92fe-7d9e945b1e0d\") " pod="calico-system/goldmane-54d579b49d-7n7zg" Sep 12 22:12:46.487469 kubelet[2681]: I0912 22:12:46.487440 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9fgz\" (UniqueName: \"kubernetes.io/projected/99a1e64b-8e24-4447-9a05-0d3e0d17c8ef-kube-api-access-j9fgz\") pod \"calico-apiserver-6845f6bc46-gx8vx\" (UID: \"99a1e64b-8e24-4447-9a05-0d3e0d17c8ef\") " pod="calico-apiserver/calico-apiserver-6845f6bc46-gx8vx" Sep 12 22:12:46.487469 kubelet[2681]: I0912 22:12:46.487459 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s8xl\" (UniqueName: \"kubernetes.io/projected/cd03710e-4ae7-41a3-ab06-c94a974ca060-kube-api-access-5s8xl\") pod \"coredns-674b8bbfcf-4wr7d\" (UID: \"cd03710e-4ae7-41a3-ab06-c94a974ca060\") " pod="kube-system/coredns-674b8bbfcf-4wr7d" Sep 12 22:12:46.487469 kubelet[2681]: I0912 22:12:46.487474 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/730c4fd2-47b1-4635-a97a-a8a4c9177989-tigera-ca-bundle\") pod \"calico-kube-controllers-86846c449f-frxv5\" (UID: 
\"730c4fd2-47b1-4635-a97a-a8a4c9177989\") " pod="calico-system/calico-kube-controllers-86846c449f-frxv5" Sep 12 22:12:46.487725 kubelet[2681]: I0912 22:12:46.487490 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b901d26-47da-4593-92fe-7d9e945b1e0d-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-7n7zg\" (UID: \"8b901d26-47da-4593-92fe-7d9e945b1e0d\") " pod="calico-system/goldmane-54d579b49d-7n7zg" Sep 12 22:12:46.487725 kubelet[2681]: I0912 22:12:46.487509 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8b901d26-47da-4593-92fe-7d9e945b1e0d-goldmane-key-pair\") pod \"goldmane-54d579b49d-7n7zg\" (UID: \"8b901d26-47da-4593-92fe-7d9e945b1e0d\") " pod="calico-system/goldmane-54d579b49d-7n7zg" Sep 12 22:12:46.487725 kubelet[2681]: I0912 22:12:46.487526 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/952d4533-ce65-4672-99d2-3f86186944d5-whisker-backend-key-pair\") pod \"whisker-5789559cdc-cc7gm\" (UID: \"952d4533-ce65-4672-99d2-3f86186944d5\") " pod="calico-system/whisker-5789559cdc-cc7gm" Sep 12 22:12:46.487725 kubelet[2681]: I0912 22:12:46.487539 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngw64\" (UniqueName: \"kubernetes.io/projected/952d4533-ce65-4672-99d2-3f86186944d5-kube-api-access-ngw64\") pod \"whisker-5789559cdc-cc7gm\" (UID: \"952d4533-ce65-4672-99d2-3f86186944d5\") " pod="calico-system/whisker-5789559cdc-cc7gm" Sep 12 22:12:46.487725 kubelet[2681]: I0912 22:12:46.487556 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htvcb\" (UniqueName: 
\"kubernetes.io/projected/730c4fd2-47b1-4635-a97a-a8a4c9177989-kube-api-access-htvcb\") pod \"calico-kube-controllers-86846c449f-frxv5\" (UID: \"730c4fd2-47b1-4635-a97a-a8a4c9177989\") " pod="calico-system/calico-kube-controllers-86846c449f-frxv5" Sep 12 22:12:46.487870 kubelet[2681]: I0912 22:12:46.487576 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkvcs\" (UniqueName: \"kubernetes.io/projected/7fda3ea1-d06d-446e-a30d-ee95f130f24f-kube-api-access-gkvcs\") pod \"calico-apiserver-6845f6bc46-r2w9z\" (UID: \"7fda3ea1-d06d-446e-a30d-ee95f130f24f\") " pod="calico-apiserver/calico-apiserver-6845f6bc46-r2w9z" Sep 12 22:12:46.487870 kubelet[2681]: I0912 22:12:46.487590 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd03710e-4ae7-41a3-ab06-c94a974ca060-config-volume\") pod \"coredns-674b8bbfcf-4wr7d\" (UID: \"cd03710e-4ae7-41a3-ab06-c94a974ca060\") " pod="kube-system/coredns-674b8bbfcf-4wr7d" Sep 12 22:12:46.487870 kubelet[2681]: I0912 22:12:46.487604 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59fcba95-a65f-4224-a6f7-a1c834aaad44-config-volume\") pod \"coredns-674b8bbfcf-mjzjr\" (UID: \"59fcba95-a65f-4224-a6f7-a1c834aaad44\") " pod="kube-system/coredns-674b8bbfcf-mjzjr" Sep 12 22:12:46.487870 kubelet[2681]: I0912 22:12:46.487619 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vw2c\" (UniqueName: \"kubernetes.io/projected/59fcba95-a65f-4224-a6f7-a1c834aaad44-kube-api-access-7vw2c\") pod \"coredns-674b8bbfcf-mjzjr\" (UID: \"59fcba95-a65f-4224-a6f7-a1c834aaad44\") " pod="kube-system/coredns-674b8bbfcf-mjzjr" Sep 12 22:12:46.487870 kubelet[2681]: I0912 22:12:46.487644 2681 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmnvw\" (UniqueName: \"kubernetes.io/projected/8b901d26-47da-4593-92fe-7d9e945b1e0d-kube-api-access-mmnvw\") pod \"goldmane-54d579b49d-7n7zg\" (UID: \"8b901d26-47da-4593-92fe-7d9e945b1e0d\") " pod="calico-system/goldmane-54d579b49d-7n7zg" Sep 12 22:12:46.488138 kubelet[2681]: I0912 22:12:46.487658 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7fda3ea1-d06d-446e-a30d-ee95f130f24f-calico-apiserver-certs\") pod \"calico-apiserver-6845f6bc46-r2w9z\" (UID: \"7fda3ea1-d06d-446e-a30d-ee95f130f24f\") " pod="calico-apiserver/calico-apiserver-6845f6bc46-r2w9z" Sep 12 22:12:46.488138 kubelet[2681]: I0912 22:12:46.487673 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/99a1e64b-8e24-4447-9a05-0d3e0d17c8ef-calico-apiserver-certs\") pod \"calico-apiserver-6845f6bc46-gx8vx\" (UID: \"99a1e64b-8e24-4447-9a05-0d3e0d17c8ef\") " pod="calico-apiserver/calico-apiserver-6845f6bc46-gx8vx" Sep 12 22:12:46.491736 systemd[1]: Created slice kubepods-besteffort-pod7fda3ea1_d06d_446e_a30d_ee95f130f24f.slice - libcontainer container kubepods-besteffort-pod7fda3ea1_d06d_446e_a30d_ee95f130f24f.slice. 
Sep 12 22:12:46.755146 containerd[1529]: time="2025-09-12T22:12:46.755108688Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86846c449f-frxv5,Uid:730c4fd2-47b1-4635-a97a-a8a4c9177989,Namespace:calico-system,Attempt:0,}" Sep 12 22:12:46.761610 containerd[1529]: time="2025-09-12T22:12:46.761579820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5789559cdc-cc7gm,Uid:952d4533-ce65-4672-99d2-3f86186944d5,Namespace:calico-system,Attempt:0,}" Sep 12 22:12:46.769233 containerd[1529]: time="2025-09-12T22:12:46.769169335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4wr7d,Uid:cd03710e-4ae7-41a3-ab06-c94a974ca060,Namespace:kube-system,Attempt:0,}" Sep 12 22:12:46.773939 containerd[1529]: time="2025-09-12T22:12:46.773866750Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6845f6bc46-gx8vx,Uid:99a1e64b-8e24-4447-9a05-0d3e0d17c8ef,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:12:46.780879 containerd[1529]: time="2025-09-12T22:12:46.780822292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mjzjr,Uid:59fcba95-a65f-4224-a6f7-a1c834aaad44,Namespace:kube-system,Attempt:0,}" Sep 12 22:12:46.788072 containerd[1529]: time="2025-09-12T22:12:46.788001879Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-7n7zg,Uid:8b901d26-47da-4593-92fe-7d9e945b1e0d,Namespace:calico-system,Attempt:0,}" Sep 12 22:12:46.795120 containerd[1529]: time="2025-09-12T22:12:46.794982661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6845f6bc46-r2w9z,Uid:7fda3ea1-d06d-446e-a30d-ee95f130f24f,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:12:46.874423 containerd[1529]: time="2025-09-12T22:12:46.874313639Z" level=error msg="Failed to destroy network for sandbox \"0c04788b1dc542780d08824205a4ed0edab4f446cdb8ddbdd6b219621c088733\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.876158 containerd[1529]: time="2025-09-12T22:12:46.876114515Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6845f6bc46-r2w9z,Uid:7fda3ea1-d06d-446e-a30d-ee95f130f24f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c04788b1dc542780d08824205a4ed0edab4f446cdb8ddbdd6b219621c088733\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.876913 kubelet[2681]: E0912 22:12:46.876833 2681 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c04788b1dc542780d08824205a4ed0edab4f446cdb8ddbdd6b219621c088733\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.876987 kubelet[2681]: E0912 22:12:46.876927 2681 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c04788b1dc542780d08824205a4ed0edab4f446cdb8ddbdd6b219621c088733\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6845f6bc46-r2w9z" Sep 12 22:12:46.876987 kubelet[2681]: E0912 22:12:46.876954 2681 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0c04788b1dc542780d08824205a4ed0edab4f446cdb8ddbdd6b219621c088733\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6845f6bc46-r2w9z" Sep 12 22:12:46.877043 kubelet[2681]: E0912 22:12:46.877001 2681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6845f6bc46-r2w9z_calico-apiserver(7fda3ea1-d06d-446e-a30d-ee95f130f24f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6845f6bc46-r2w9z_calico-apiserver(7fda3ea1-d06d-446e-a30d-ee95f130f24f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0c04788b1dc542780d08824205a4ed0edab4f446cdb8ddbdd6b219621c088733\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6845f6bc46-r2w9z" podUID="7fda3ea1-d06d-446e-a30d-ee95f130f24f" Sep 12 22:12:46.886126 containerd[1529]: time="2025-09-12T22:12:46.886085879Z" level=error msg="Failed to destroy network for sandbox \"7246d2f17d6c739e99f46e8fa9704a5479b2d93b1fafee9e2ca33474974835f4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.887303 containerd[1529]: time="2025-09-12T22:12:46.887262543Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86846c449f-frxv5,Uid:730c4fd2-47b1-4635-a97a-a8a4c9177989,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7246d2f17d6c739e99f46e8fa9704a5479b2d93b1fafee9e2ca33474974835f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 12 22:12:46.887416 containerd[1529]: time="2025-09-12T22:12:46.887278743Z" level=error msg="Failed to destroy network for sandbox \"fb08af8c2063026169f3bae43ef3ddd2b3708c1c5285e59f594807ac83b57670\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.887514 kubelet[2681]: E0912 22:12:46.887476 2681 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7246d2f17d6c739e99f46e8fa9704a5479b2d93b1fafee9e2ca33474974835f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.887766 kubelet[2681]: E0912 22:12:46.887530 2681 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7246d2f17d6c739e99f46e8fa9704a5479b2d93b1fafee9e2ca33474974835f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86846c449f-frxv5" Sep 12 22:12:46.887766 kubelet[2681]: E0912 22:12:46.887549 2681 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7246d2f17d6c739e99f46e8fa9704a5479b2d93b1fafee9e2ca33474974835f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-86846c449f-frxv5" Sep 12 22:12:46.887766 kubelet[2681]: E0912 22:12:46.887590 2681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"calico-kube-controllers-86846c449f-frxv5_calico-system(730c4fd2-47b1-4635-a97a-a8a4c9177989)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-86846c449f-frxv5_calico-system(730c4fd2-47b1-4635-a97a-a8a4c9177989)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7246d2f17d6c739e99f46e8fa9704a5479b2d93b1fafee9e2ca33474974835f4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-86846c449f-frxv5" podUID="730c4fd2-47b1-4635-a97a-a8a4c9177989" Sep 12 22:12:46.891443 containerd[1529]: time="2025-09-12T22:12:46.891410867Z" level=error msg="Failed to destroy network for sandbox \"0d9506f45c5e288707ddeea6acf71b326e5db1507cbdcc1fcd7848d6cc43baff\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.891777 containerd[1529]: time="2025-09-12T22:12:46.891740394Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mjzjr,Uid:59fcba95-a65f-4224-a6f7-a1c834aaad44,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb08af8c2063026169f3bae43ef3ddd2b3708c1c5285e59f594807ac83b57670\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.892175 kubelet[2681]: E0912 22:12:46.892121 2681 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb08af8c2063026169f3bae43ef3ddd2b3708c1c5285e59f594807ac83b57670\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.892300 kubelet[2681]: E0912 22:12:46.892225 2681 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb08af8c2063026169f3bae43ef3ddd2b3708c1c5285e59f594807ac83b57670\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-mjzjr" Sep 12 22:12:46.892300 kubelet[2681]: E0912 22:12:46.892262 2681 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb08af8c2063026169f3bae43ef3ddd2b3708c1c5285e59f594807ac83b57670\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-mjzjr" Sep 12 22:12:46.892791 containerd[1529]: time="2025-09-12T22:12:46.892744094Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4wr7d,Uid:cd03710e-4ae7-41a3-ab06-c94a974ca060,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d9506f45c5e288707ddeea6acf71b326e5db1507cbdcc1fcd7848d6cc43baff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.893049 kubelet[2681]: E0912 22:12:46.892392 2681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-mjzjr_kube-system(59fcba95-a65f-4224-a6f7-a1c834aaad44)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-674b8bbfcf-mjzjr_kube-system(59fcba95-a65f-4224-a6f7-a1c834aaad44)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fb08af8c2063026169f3bae43ef3ddd2b3708c1c5285e59f594807ac83b57670\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-mjzjr" podUID="59fcba95-a65f-4224-a6f7-a1c834aaad44" Sep 12 22:12:46.893144 kubelet[2681]: E0912 22:12:46.893081 2681 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d9506f45c5e288707ddeea6acf71b326e5db1507cbdcc1fcd7848d6cc43baff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.893144 kubelet[2681]: E0912 22:12:46.893110 2681 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d9506f45c5e288707ddeea6acf71b326e5db1507cbdcc1fcd7848d6cc43baff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-4wr7d" Sep 12 22:12:46.893144 kubelet[2681]: E0912 22:12:46.893125 2681 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d9506f45c5e288707ddeea6acf71b326e5db1507cbdcc1fcd7848d6cc43baff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-4wr7d" Sep 12 22:12:46.893301 kubelet[2681]: E0912 22:12:46.893271 2681 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-4wr7d_kube-system(cd03710e-4ae7-41a3-ab06-c94a974ca060)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-4wr7d_kube-system(cd03710e-4ae7-41a3-ab06-c94a974ca060)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0d9506f45c5e288707ddeea6acf71b326e5db1507cbdcc1fcd7848d6cc43baff\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-4wr7d" podUID="cd03710e-4ae7-41a3-ab06-c94a974ca060" Sep 12 22:12:46.894359 containerd[1529]: time="2025-09-12T22:12:46.894318406Z" level=error msg="Failed to destroy network for sandbox \"5253b2e98655156530cc62c6bc3115516fc3801474cd2827e0a43e72a5d18bc8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.896281 containerd[1529]: time="2025-09-12T22:12:46.896236246Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6845f6bc46-gx8vx,Uid:99a1e64b-8e24-4447-9a05-0d3e0d17c8ef,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5253b2e98655156530cc62c6bc3115516fc3801474cd2827e0a43e72a5d18bc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.896786 kubelet[2681]: E0912 22:12:46.896414 2681 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5253b2e98655156530cc62c6bc3115516fc3801474cd2827e0a43e72a5d18bc8\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.896786 kubelet[2681]: E0912 22:12:46.896473 2681 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5253b2e98655156530cc62c6bc3115516fc3801474cd2827e0a43e72a5d18bc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6845f6bc46-gx8vx" Sep 12 22:12:46.896786 kubelet[2681]: E0912 22:12:46.896488 2681 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5253b2e98655156530cc62c6bc3115516fc3801474cd2827e0a43e72a5d18bc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6845f6bc46-gx8vx" Sep 12 22:12:46.896871 kubelet[2681]: E0912 22:12:46.896549 2681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6845f6bc46-gx8vx_calico-apiserver(99a1e64b-8e24-4447-9a05-0d3e0d17c8ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6845f6bc46-gx8vx_calico-apiserver(99a1e64b-8e24-4447-9a05-0d3e0d17c8ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5253b2e98655156530cc62c6bc3115516fc3801474cd2827e0a43e72a5d18bc8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6845f6bc46-gx8vx" 
podUID="99a1e64b-8e24-4447-9a05-0d3e0d17c8ef" Sep 12 22:12:46.898746 containerd[1529]: time="2025-09-12T22:12:46.898467131Z" level=error msg="Failed to destroy network for sandbox \"d9f9977e85c2fbd6790b6eff8f6f512c694c75f201a6bc08419f26b8829a8c84\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.900840 containerd[1529]: time="2025-09-12T22:12:46.900781618Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5789559cdc-cc7gm,Uid:952d4533-ce65-4672-99d2-3f86186944d5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9f9977e85c2fbd6790b6eff8f6f512c694c75f201a6bc08419f26b8829a8c84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.901106 kubelet[2681]: E0912 22:12:46.901076 2681 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9f9977e85c2fbd6790b6eff8f6f512c694c75f201a6bc08419f26b8829a8c84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.901211 kubelet[2681]: E0912 22:12:46.901127 2681 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9f9977e85c2fbd6790b6eff8f6f512c694c75f201a6bc08419f26b8829a8c84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5789559cdc-cc7gm" Sep 12 22:12:46.901211 kubelet[2681]: E0912 
22:12:46.901146 2681 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d9f9977e85c2fbd6790b6eff8f6f512c694c75f201a6bc08419f26b8829a8c84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5789559cdc-cc7gm" Sep 12 22:12:46.901277 kubelet[2681]: E0912 22:12:46.901228 2681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5789559cdc-cc7gm_calico-system(952d4533-ce65-4672-99d2-3f86186944d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5789559cdc-cc7gm_calico-system(952d4533-ce65-4672-99d2-3f86186944d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d9f9977e85c2fbd6790b6eff8f6f512c694c75f201a6bc08419f26b8829a8c84\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5789559cdc-cc7gm" podUID="952d4533-ce65-4672-99d2-3f86186944d5" Sep 12 22:12:46.905243 containerd[1529]: time="2025-09-12T22:12:46.905205988Z" level=error msg="Failed to destroy network for sandbox \"d7a14e29da909512d1f4c43f953c8015255fa24b821c0afe98a065589cddeb51\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.906104 containerd[1529]: time="2025-09-12T22:12:46.906071486Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-7n7zg,Uid:8b901d26-47da-4593-92fe-7d9e945b1e0d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"d7a14e29da909512d1f4c43f953c8015255fa24b821c0afe98a065589cddeb51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.906319 kubelet[2681]: E0912 22:12:46.906264 2681 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7a14e29da909512d1f4c43f953c8015255fa24b821c0afe98a065589cddeb51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:46.906363 kubelet[2681]: E0912 22:12:46.906328 2681 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7a14e29da909512d1f4c43f953c8015255fa24b821c0afe98a065589cddeb51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-7n7zg" Sep 12 22:12:46.906402 kubelet[2681]: E0912 22:12:46.906373 2681 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7a14e29da909512d1f4c43f953c8015255fa24b821c0afe98a065589cddeb51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-7n7zg" Sep 12 22:12:46.906442 kubelet[2681]: E0912 22:12:46.906412 2681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-7n7zg_calico-system(8b901d26-47da-4593-92fe-7d9e945b1e0d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"goldmane-54d579b49d-7n7zg_calico-system(8b901d26-47da-4593-92fe-7d9e945b1e0d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d7a14e29da909512d1f4c43f953c8015255fa24b821c0afe98a065589cddeb51\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-7n7zg" podUID="8b901d26-47da-4593-92fe-7d9e945b1e0d" Sep 12 22:12:46.986879 containerd[1529]: time="2025-09-12T22:12:46.986831453Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 22:12:47.678175 systemd[1]: run-netns-cni\x2d3659f72d\x2d45ac\x2d9063\x2d7db7\x2d88a3b39b60fd.mount: Deactivated successfully. Sep 12 22:12:47.678287 systemd[1]: run-netns-cni\x2d04434e36\x2d9885\x2d8a05\x2d9de3\x2d431251b9ca06.mount: Deactivated successfully. Sep 12 22:12:47.678332 systemd[1]: run-netns-cni\x2d58cf3f31\x2dd63a\x2d442c\x2d8e9c\x2d75f6ff2d8160.mount: Deactivated successfully. Sep 12 22:12:47.678394 systemd[1]: run-netns-cni\x2d194a72c5\x2d015d\x2db497\x2d7aaa\x2d14283fed96e0.mount: Deactivated successfully. Sep 12 22:12:47.891243 systemd[1]: Created slice kubepods-besteffort-podf1eff131_ed0a_4062_a13b_42b0f931fef5.slice - libcontainer container kubepods-besteffort-podf1eff131_ed0a_4062_a13b_42b0f931fef5.slice. 
Sep 12 22:12:47.894213 containerd[1529]: time="2025-09-12T22:12:47.894163329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gpk44,Uid:f1eff131-ed0a-4062-a13b-42b0f931fef5,Namespace:calico-system,Attempt:0,}" Sep 12 22:12:47.952333 containerd[1529]: time="2025-09-12T22:12:47.951906743Z" level=error msg="Failed to destroy network for sandbox \"80c13d697a84597364058085e5fabe7c7bef2db0dc1f651a90fa7e81db2c5dde\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:47.956782 systemd[1]: run-netns-cni\x2d54e04c6c\x2d95eb\x2d21d5\x2d99b0\x2d6ae676d89c56.mount: Deactivated successfully. Sep 12 22:12:47.971213 containerd[1529]: time="2025-09-12T22:12:47.971097280Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gpk44,Uid:f1eff131-ed0a-4062-a13b-42b0f931fef5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"80c13d697a84597364058085e5fabe7c7bef2db0dc1f651a90fa7e81db2c5dde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:47.971604 kubelet[2681]: E0912 22:12:47.971555 2681 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80c13d697a84597364058085e5fabe7c7bef2db0dc1f651a90fa7e81db2c5dde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:12:47.973226 kubelet[2681]: E0912 22:12:47.971621 2681 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"80c13d697a84597364058085e5fabe7c7bef2db0dc1f651a90fa7e81db2c5dde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gpk44" Sep 12 22:12:47.973226 kubelet[2681]: E0912 22:12:47.971642 2681 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80c13d697a84597364058085e5fabe7c7bef2db0dc1f651a90fa7e81db2c5dde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gpk44" Sep 12 22:12:47.973226 kubelet[2681]: E0912 22:12:47.971689 2681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gpk44_calico-system(f1eff131-ed0a-4062-a13b-42b0f931fef5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gpk44_calico-system(f1eff131-ed0a-4062-a13b-42b0f931fef5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"80c13d697a84597364058085e5fabe7c7bef2db0dc1f651a90fa7e81db2c5dde\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gpk44" podUID="f1eff131-ed0a-4062-a13b-42b0f931fef5" Sep 12 22:12:49.882828 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1825000290.mount: Deactivated successfully. 
Sep 12 22:12:50.100357 containerd[1529]: time="2025-09-12T22:12:50.100298468Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:50.123579 containerd[1529]: time="2025-09-12T22:12:50.100724836Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 12 22:12:50.123704 containerd[1529]: time="2025-09-12T22:12:50.102459387Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:50.123957 containerd[1529]: time="2025-09-12T22:12:50.104944551Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 3.118069217s" Sep 12 22:12:50.123957 containerd[1529]: time="2025-09-12T22:12:50.123868885Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 12 22:12:50.124284 containerd[1529]: time="2025-09-12T22:12:50.124237452Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:50.141786 containerd[1529]: time="2025-09-12T22:12:50.141692000Z" level=info msg="CreateContainer within sandbox \"28320f41bf00bac5409ea7dac77cd2ac3632c6a166a7585d9530141ef33ea993\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 22:12:50.160448 containerd[1529]: time="2025-09-12T22:12:50.160397371Z" level=info msg="Container 
0f13fc7a039eb4d1090a51b61645d499c412d9168afb746a7146ca8a9091f66e: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:50.161144 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2234642410.mount: Deactivated successfully. Sep 12 22:12:50.168812 containerd[1529]: time="2025-09-12T22:12:50.168759319Z" level=info msg="CreateContainer within sandbox \"28320f41bf00bac5409ea7dac77cd2ac3632c6a166a7585d9530141ef33ea993\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"0f13fc7a039eb4d1090a51b61645d499c412d9168afb746a7146ca8a9091f66e\"" Sep 12 22:12:50.169363 containerd[1529]: time="2025-09-12T22:12:50.169319889Z" level=info msg="StartContainer for \"0f13fc7a039eb4d1090a51b61645d499c412d9168afb746a7146ca8a9091f66e\"" Sep 12 22:12:50.171265 containerd[1529]: time="2025-09-12T22:12:50.171226963Z" level=info msg="connecting to shim 0f13fc7a039eb4d1090a51b61645d499c412d9168afb746a7146ca8a9091f66e" address="unix:///run/containerd/s/dcabd6f0b5b5c2bfde47fd87de7d4e4fb43edeb135ee86c2fe04add55b7f235a" protocol=ttrpc version=3 Sep 12 22:12:50.219484 systemd[1]: Started cri-containerd-0f13fc7a039eb4d1090a51b61645d499c412d9168afb746a7146ca8a9091f66e.scope - libcontainer container 0f13fc7a039eb4d1090a51b61645d499c412d9168afb746a7146ca8a9091f66e. Sep 12 22:12:50.254487 containerd[1529]: time="2025-09-12T22:12:50.254423314Z" level=info msg="StartContainer for \"0f13fc7a039eb4d1090a51b61645d499c412d9168afb746a7146ca8a9091f66e\" returns successfully" Sep 12 22:12:50.380342 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 22:12:50.380451 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 12 22:12:50.611512 kubelet[2681]: I0912 22:12:50.610946 2681 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngw64\" (UniqueName: \"kubernetes.io/projected/952d4533-ce65-4672-99d2-3f86186944d5-kube-api-access-ngw64\") pod \"952d4533-ce65-4672-99d2-3f86186944d5\" (UID: \"952d4533-ce65-4672-99d2-3f86186944d5\") " Sep 12 22:12:50.611512 kubelet[2681]: I0912 22:12:50.610993 2681 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/952d4533-ce65-4672-99d2-3f86186944d5-whisker-ca-bundle\") pod \"952d4533-ce65-4672-99d2-3f86186944d5\" (UID: \"952d4533-ce65-4672-99d2-3f86186944d5\") " Sep 12 22:12:50.611512 kubelet[2681]: I0912 22:12:50.611019 2681 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/952d4533-ce65-4672-99d2-3f86186944d5-whisker-backend-key-pair\") pod \"952d4533-ce65-4672-99d2-3f86186944d5\" (UID: \"952d4533-ce65-4672-99d2-3f86186944d5\") " Sep 12 22:12:50.629658 kubelet[2681]: I0912 22:12:50.629364 2681 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/952d4533-ce65-4672-99d2-3f86186944d5-kube-api-access-ngw64" (OuterVolumeSpecName: "kube-api-access-ngw64") pod "952d4533-ce65-4672-99d2-3f86186944d5" (UID: "952d4533-ce65-4672-99d2-3f86186944d5"). InnerVolumeSpecName "kube-api-access-ngw64". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 22:12:50.632025 kubelet[2681]: I0912 22:12:50.631990 2681 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/952d4533-ce65-4672-99d2-3f86186944d5-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "952d4533-ce65-4672-99d2-3f86186944d5" (UID: "952d4533-ce65-4672-99d2-3f86186944d5"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 22:12:50.640939 kubelet[2681]: I0912 22:12:50.640863 2681 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/952d4533-ce65-4672-99d2-3f86186944d5-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "952d4533-ce65-4672-99d2-3f86186944d5" (UID: "952d4533-ce65-4672-99d2-3f86186944d5"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 22:12:50.711647 kubelet[2681]: I0912 22:12:50.711582 2681 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ngw64\" (UniqueName: \"kubernetes.io/projected/952d4533-ce65-4672-99d2-3f86186944d5-kube-api-access-ngw64\") on node \"localhost\" DevicePath \"\"" Sep 12 22:12:50.711647 kubelet[2681]: I0912 22:12:50.711617 2681 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/952d4533-ce65-4672-99d2-3f86186944d5-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 12 22:12:50.711647 kubelet[2681]: I0912 22:12:50.711625 2681 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/952d4533-ce65-4672-99d2-3f86186944d5-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 12 22:12:50.883602 systemd[1]: var-lib-kubelet-pods-952d4533\x2dce65\x2d4672\x2d99d2\x2d3f86186944d5-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dngw64.mount: Deactivated successfully. Sep 12 22:12:50.883688 systemd[1]: var-lib-kubelet-pods-952d4533\x2dce65\x2d4672\x2d99d2\x2d3f86186944d5-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 22:12:51.019199 systemd[1]: Removed slice kubepods-besteffort-pod952d4533_ce65_4672_99d2_3f86186944d5.slice - libcontainer container kubepods-besteffort-pod952d4533_ce65_4672_99d2_3f86186944d5.slice. 
Sep 12 22:12:51.035851 kubelet[2681]: I0912 22:12:51.035791 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-75m29" podStartSLOduration=1.945457811 podStartE2EDuration="14.03576887s" podCreationTimestamp="2025-09-12 22:12:37 +0000 UTC" firstStartedPulling="2025-09-12 22:12:38.034573284 +0000 UTC m=+22.236567470" lastFinishedPulling="2025-09-12 22:12:50.124884383 +0000 UTC m=+34.326878529" observedRunningTime="2025-09-12 22:12:51.035264182 +0000 UTC m=+35.237258368" watchObservedRunningTime="2025-09-12 22:12:51.03576887 +0000 UTC m=+35.237763056" Sep 12 22:12:51.084978 systemd[1]: Created slice kubepods-besteffort-poda5000b44_e9ef_4421_8e8f_10e23dc9be05.slice - libcontainer container kubepods-besteffort-poda5000b44_e9ef_4421_8e8f_10e23dc9be05.slice. Sep 12 22:12:51.113682 kubelet[2681]: I0912 22:12:51.113470 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5000b44-e9ef-4421-8e8f-10e23dc9be05-whisker-ca-bundle\") pod \"whisker-85fb998b6-ldppf\" (UID: \"a5000b44-e9ef-4421-8e8f-10e23dc9be05\") " pod="calico-system/whisker-85fb998b6-ldppf" Sep 12 22:12:51.113682 kubelet[2681]: I0912 22:12:51.113582 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a5000b44-e9ef-4421-8e8f-10e23dc9be05-whisker-backend-key-pair\") pod \"whisker-85fb998b6-ldppf\" (UID: \"a5000b44-e9ef-4421-8e8f-10e23dc9be05\") " pod="calico-system/whisker-85fb998b6-ldppf" Sep 12 22:12:51.113682 kubelet[2681]: I0912 22:12:51.113665 2681 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vngx\" (UniqueName: \"kubernetes.io/projected/a5000b44-e9ef-4421-8e8f-10e23dc9be05-kube-api-access-4vngx\") pod \"whisker-85fb998b6-ldppf\" (UID: 
\"a5000b44-e9ef-4421-8e8f-10e23dc9be05\") " pod="calico-system/whisker-85fb998b6-ldppf" Sep 12 22:12:51.388113 containerd[1529]: time="2025-09-12T22:12:51.388061698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85fb998b6-ldppf,Uid:a5000b44-e9ef-4421-8e8f-10e23dc9be05,Namespace:calico-system,Attempt:0,}" Sep 12 22:12:51.563774 systemd-networkd[1433]: cali43ad0db7501: Link UP Sep 12 22:12:51.564374 systemd-networkd[1433]: cali43ad0db7501: Gained carrier Sep 12 22:12:51.576075 containerd[1529]: 2025-09-12 22:12:51.409 [INFO][3819] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:12:51.576075 containerd[1529]: 2025-09-12 22:12:51.445 [INFO][3819] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--85fb998b6--ldppf-eth0 whisker-85fb998b6- calico-system a5000b44-e9ef-4421-8e8f-10e23dc9be05 869 0 2025-09-12 22:12:51 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:85fb998b6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-85fb998b6-ldppf eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali43ad0db7501 [] [] }} ContainerID="bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" Namespace="calico-system" Pod="whisker-85fb998b6-ldppf" WorkloadEndpoint="localhost-k8s-whisker--85fb998b6--ldppf-" Sep 12 22:12:51.576075 containerd[1529]: 2025-09-12 22:12:51.445 [INFO][3819] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" Namespace="calico-system" Pod="whisker-85fb998b6-ldppf" WorkloadEndpoint="localhost-k8s-whisker--85fb998b6--ldppf-eth0" Sep 12 22:12:51.576075 containerd[1529]: 2025-09-12 22:12:51.516 [INFO][3834] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" HandleID="k8s-pod-network.bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" Workload="localhost-k8s-whisker--85fb998b6--ldppf-eth0" Sep 12 22:12:51.576323 containerd[1529]: 2025-09-12 22:12:51.516 [INFO][3834] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" HandleID="k8s-pod-network.bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" Workload="localhost-k8s-whisker--85fb998b6--ldppf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003fb660), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-85fb998b6-ldppf", "timestamp":"2025-09-12 22:12:51.516583736 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:12:51.576323 containerd[1529]: 2025-09-12 22:12:51.516 [INFO][3834] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:12:51.576323 containerd[1529]: 2025-09-12 22:12:51.516 [INFO][3834] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:12:51.576323 containerd[1529]: 2025-09-12 22:12:51.516 [INFO][3834] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:12:51.576323 containerd[1529]: 2025-09-12 22:12:51.528 [INFO][3834] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" host="localhost" Sep 12 22:12:51.576323 containerd[1529]: 2025-09-12 22:12:51.534 [INFO][3834] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:12:51.576323 containerd[1529]: 2025-09-12 22:12:51.539 [INFO][3834] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:12:51.576323 containerd[1529]: 2025-09-12 22:12:51.541 [INFO][3834] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:12:51.576323 containerd[1529]: 2025-09-12 22:12:51.543 [INFO][3834] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:12:51.576323 containerd[1529]: 2025-09-12 22:12:51.543 [INFO][3834] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" host="localhost" Sep 12 22:12:51.576530 containerd[1529]: 2025-09-12 22:12:51.545 [INFO][3834] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67 Sep 12 22:12:51.576530 containerd[1529]: 2025-09-12 22:12:51.548 [INFO][3834] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" host="localhost" Sep 12 22:12:51.576530 containerd[1529]: 2025-09-12 22:12:51.553 [INFO][3834] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" host="localhost" Sep 12 22:12:51.576530 containerd[1529]: 2025-09-12 22:12:51.553 [INFO][3834] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" host="localhost" Sep 12 22:12:51.576530 containerd[1529]: 2025-09-12 22:12:51.553 [INFO][3834] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:12:51.576530 containerd[1529]: 2025-09-12 22:12:51.553 [INFO][3834] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" HandleID="k8s-pod-network.bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" Workload="localhost-k8s-whisker--85fb998b6--ldppf-eth0" Sep 12 22:12:51.576639 containerd[1529]: 2025-09-12 22:12:51.555 [INFO][3819] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" Namespace="calico-system" Pod="whisker-85fb998b6-ldppf" WorkloadEndpoint="localhost-k8s-whisker--85fb998b6--ldppf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--85fb998b6--ldppf-eth0", GenerateName:"whisker-85fb998b6-", Namespace:"calico-system", SelfLink:"", UID:"a5000b44-e9ef-4421-8e8f-10e23dc9be05", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"85fb998b6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-85fb998b6-ldppf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali43ad0db7501", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:12:51.576639 containerd[1529]: 2025-09-12 22:12:51.556 [INFO][3819] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" Namespace="calico-system" Pod="whisker-85fb998b6-ldppf" WorkloadEndpoint="localhost-k8s-whisker--85fb998b6--ldppf-eth0" Sep 12 22:12:51.576707 containerd[1529]: 2025-09-12 22:12:51.556 [INFO][3819] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali43ad0db7501 ContainerID="bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" Namespace="calico-system" Pod="whisker-85fb998b6-ldppf" WorkloadEndpoint="localhost-k8s-whisker--85fb998b6--ldppf-eth0" Sep 12 22:12:51.576707 containerd[1529]: 2025-09-12 22:12:51.564 [INFO][3819] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" Namespace="calico-system" Pod="whisker-85fb998b6-ldppf" WorkloadEndpoint="localhost-k8s-whisker--85fb998b6--ldppf-eth0" Sep 12 22:12:51.576746 containerd[1529]: 2025-09-12 22:12:51.564 [INFO][3819] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" Namespace="calico-system" Pod="whisker-85fb998b6-ldppf" 
WorkloadEndpoint="localhost-k8s-whisker--85fb998b6--ldppf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--85fb998b6--ldppf-eth0", GenerateName:"whisker-85fb998b6-", Namespace:"calico-system", SelfLink:"", UID:"a5000b44-e9ef-4421-8e8f-10e23dc9be05", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"85fb998b6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67", Pod:"whisker-85fb998b6-ldppf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali43ad0db7501", MAC:"ba:f2:bb:a1:8d:c0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:12:51.576791 containerd[1529]: 2025-09-12 22:12:51.574 [INFO][3819] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" Namespace="calico-system" Pod="whisker-85fb998b6-ldppf" WorkloadEndpoint="localhost-k8s-whisker--85fb998b6--ldppf-eth0" Sep 12 22:12:51.893215 kubelet[2681]: I0912 22:12:51.892670 2681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="952d4533-ce65-4672-99d2-3f86186944d5" path="/var/lib/kubelet/pods/952d4533-ce65-4672-99d2-3f86186944d5/volumes" Sep 12 22:12:51.915776 containerd[1529]: time="2025-09-12T22:12:51.915716725Z" level=info msg="connecting to shim bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67" address="unix:///run/containerd/s/2d2cbec3ce89e7ca6f707862b2a33219136ca138ae08650cadab9a405edc5859" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:12:51.962801 systemd[1]: Started cri-containerd-bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67.scope - libcontainer container bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67. Sep 12 22:12:51.987858 systemd-resolved[1351]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:12:52.022248 kubelet[2681]: I0912 22:12:52.022177 2681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:12:52.028545 containerd[1529]: time="2025-09-12T22:12:52.028384638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85fb998b6-ldppf,Uid:a5000b44-e9ef-4421-8e8f-10e23dc9be05,Namespace:calico-system,Attempt:0,} returns sandbox id \"bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67\"" Sep 12 22:12:52.034700 containerd[1529]: time="2025-09-12T22:12:52.034403578Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 22:12:52.981085 containerd[1529]: time="2025-09-12T22:12:52.980747859Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:52.981815 containerd[1529]: time="2025-09-12T22:12:52.981777236Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 12 22:12:52.982709 containerd[1529]: time="2025-09-12T22:12:52.982684331Z" level=info msg="ImageCreate event 
name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:52.985592 containerd[1529]: time="2025-09-12T22:12:52.984908328Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:12:52.985592 containerd[1529]: time="2025-09-12T22:12:52.985479977Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 950.621952ms" Sep 12 22:12:52.985592 containerd[1529]: time="2025-09-12T22:12:52.985506978Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 12 22:12:52.992212 containerd[1529]: time="2025-09-12T22:12:52.992126607Z" level=info msg="CreateContainer within sandbox \"bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 22:12:53.005423 containerd[1529]: time="2025-09-12T22:12:53.005373464Z" level=info msg="Container ae703f2d0f255de1602bd110e04e36c5c3ba661e54a23f719d22668b5d84f368: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:53.012667 containerd[1529]: time="2025-09-12T22:12:53.012618701Z" level=info msg="CreateContainer within sandbox \"bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"ae703f2d0f255de1602bd110e04e36c5c3ba661e54a23f719d22668b5d84f368\"" Sep 12 22:12:53.013375 containerd[1529]: 
time="2025-09-12T22:12:53.013341512Z" level=info msg="StartContainer for \"ae703f2d0f255de1602bd110e04e36c5c3ba661e54a23f719d22668b5d84f368\"" Sep 12 22:12:53.026039 containerd[1529]: time="2025-09-12T22:12:53.025995715Z" level=info msg="connecting to shim ae703f2d0f255de1602bd110e04e36c5c3ba661e54a23f719d22668b5d84f368" address="unix:///run/containerd/s/2d2cbec3ce89e7ca6f707862b2a33219136ca138ae08650cadab9a405edc5859" protocol=ttrpc version=3 Sep 12 22:12:53.043302 systemd-networkd[1433]: cali43ad0db7501: Gained IPv6LL Sep 12 22:12:53.047623 systemd[1]: Started cri-containerd-ae703f2d0f255de1602bd110e04e36c5c3ba661e54a23f719d22668b5d84f368.scope - libcontainer container ae703f2d0f255de1602bd110e04e36c5c3ba661e54a23f719d22668b5d84f368. Sep 12 22:12:53.106939 containerd[1529]: time="2025-09-12T22:12:53.106891495Z" level=info msg="StartContainer for \"ae703f2d0f255de1602bd110e04e36c5c3ba661e54a23f719d22668b5d84f368\" returns successfully" Sep 12 22:12:53.108710 containerd[1529]: time="2025-09-12T22:12:53.108648363Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 22:12:54.551662 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3712934662.mount: Deactivated successfully. 
Sep 12 22:12:54.566627 containerd[1529]: time="2025-09-12T22:12:54.566579197Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:54.567722 containerd[1529]: time="2025-09-12T22:12:54.567282088Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700"
Sep 12 22:12:54.568135 containerd[1529]: time="2025-09-12T22:12:54.568075621Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:54.570565 containerd[1529]: time="2025-09-12T22:12:54.570529219Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:12:54.571324 containerd[1529]: time="2025-09-12T22:12:54.571286911Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.462577667s"
Sep 12 22:12:54.571324 containerd[1529]: time="2025-09-12T22:12:54.571322711Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\""
Sep 12 22:12:54.576683 containerd[1529]: time="2025-09-12T22:12:54.576626914Z" level=info msg="CreateContainer within sandbox \"bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}"
Sep 12 22:12:54.590534 containerd[1529]: time="2025-09-12T22:12:54.590482930Z" level=info msg="Container 44f48c4198e97354f53a3f449a74007ef7b08ff0e90f7f1f897614e53eae1437: CDI devices from CRI Config.CDIDevices: []"
Sep 12 22:12:54.598922 containerd[1529]: time="2025-09-12T22:12:54.598845021Z" level=info msg="CreateContainer within sandbox \"bffd1a4325353d257ed5958e6b718f0bdbeb9f01f31c79f8ed99991031370d67\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"44f48c4198e97354f53a3f449a74007ef7b08ff0e90f7f1f897614e53eae1437\""
Sep 12 22:12:54.599990 containerd[1529]: time="2025-09-12T22:12:54.599945478Z" level=info msg="StartContainer for \"44f48c4198e97354f53a3f449a74007ef7b08ff0e90f7f1f897614e53eae1437\""
Sep 12 22:12:54.601097 containerd[1529]: time="2025-09-12T22:12:54.601069775Z" level=info msg="connecting to shim 44f48c4198e97354f53a3f449a74007ef7b08ff0e90f7f1f897614e53eae1437" address="unix:///run/containerd/s/2d2cbec3ce89e7ca6f707862b2a33219136ca138ae08650cadab9a405edc5859" protocol=ttrpc version=3
Sep 12 22:12:54.624440 systemd[1]: Started cri-containerd-44f48c4198e97354f53a3f449a74007ef7b08ff0e90f7f1f897614e53eae1437.scope - libcontainer container 44f48c4198e97354f53a3f449a74007ef7b08ff0e90f7f1f897614e53eae1437.
Sep 12 22:12:54.657616 containerd[1529]: time="2025-09-12T22:12:54.657574576Z" level=info msg="StartContainer for \"44f48c4198e97354f53a3f449a74007ef7b08ff0e90f7f1f897614e53eae1437\" returns successfully"
Sep 12 22:12:55.051005 kubelet[2681]: I0912 22:12:55.050944 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-85fb998b6-ldppf" podStartSLOduration=1.511740278 podStartE2EDuration="4.050927287s" podCreationTimestamp="2025-09-12 22:12:51 +0000 UTC" firstStartedPulling="2025-09-12 22:12:52.033987611 +0000 UTC m=+36.235981797" lastFinishedPulling="2025-09-12 22:12:54.57317462 +0000 UTC m=+38.775168806" observedRunningTime="2025-09-12 22:12:55.050363279 +0000 UTC m=+39.252357465" watchObservedRunningTime="2025-09-12 22:12:55.050927287 +0000 UTC m=+39.252921473"
Sep 12 22:12:56.408033 systemd[1]: Started sshd@7-10.0.0.68:22-10.0.0.1:42670.service - OpenSSH per-connection server daemon (10.0.0.1:42670).
Sep 12 22:12:56.470767 sshd[4183]: Accepted publickey for core from 10.0.0.1 port 42670 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI
Sep 12 22:12:56.472285 sshd-session[4183]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:12:56.476608 systemd-logind[1501]: New session 8 of user core.
Sep 12 22:12:56.482384 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 12 22:12:56.632466 sshd[4186]: Connection closed by 10.0.0.1 port 42670
Sep 12 22:12:56.632980 sshd-session[4183]: pam_unix(sshd:session): session closed for user core
Sep 12 22:12:56.636303 systemd[1]: sshd@7-10.0.0.68:22-10.0.0.1:42670.service: Deactivated successfully.
Sep 12 22:12:56.638068 systemd[1]: session-8.scope: Deactivated successfully.
Sep 12 22:12:56.638840 systemd-logind[1501]: Session 8 logged out. Waiting for processes to exit.
Sep 12 22:12:56.639981 systemd-logind[1501]: Removed session 8.
Sep 12 22:12:57.885728 containerd[1529]: time="2025-09-12T22:12:57.885670300Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4wr7d,Uid:cd03710e-4ae7-41a3-ab06-c94a974ca060,Namespace:kube-system,Attempt:0,}" Sep 12 22:12:57.887935 containerd[1529]: time="2025-09-12T22:12:57.885670340Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mjzjr,Uid:59fcba95-a65f-4224-a6f7-a1c834aaad44,Namespace:kube-system,Attempt:0,}" Sep 12 22:12:58.089683 systemd-networkd[1433]: calic68885d62b2: Link UP Sep 12 22:12:58.091080 systemd-networkd[1433]: calic68885d62b2: Gained carrier Sep 12 22:12:58.125892 containerd[1529]: 2025-09-12 22:12:57.954 [INFO][4237] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:12:58.125892 containerd[1529]: 2025-09-12 22:12:57.973 [INFO][4237] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0 coredns-674b8bbfcf- kube-system 59fcba95-a65f-4224-a6f7-a1c834aaad44 813 0 2025-09-12 22:12:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-mjzjr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic68885d62b2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" Namespace="kube-system" Pod="coredns-674b8bbfcf-mjzjr" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mjzjr-" Sep 12 22:12:58.125892 containerd[1529]: 2025-09-12 22:12:57.973 [INFO][4237] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" Namespace="kube-system" Pod="coredns-674b8bbfcf-mjzjr" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0" Sep 12 22:12:58.125892 containerd[1529]: 2025-09-12 22:12:58.009 [INFO][4255] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" HandleID="k8s-pod-network.cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" Workload="localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0" Sep 12 22:12:58.126117 containerd[1529]: 2025-09-12 22:12:58.009 [INFO][4255] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" HandleID="k8s-pod-network.cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" Workload="localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000140490), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-mjzjr", "timestamp":"2025-09-12 22:12:58.009832357 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:12:58.126117 containerd[1529]: 2025-09-12 22:12:58.010 [INFO][4255] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:12:58.126117 containerd[1529]: 2025-09-12 22:12:58.010 [INFO][4255] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:12:58.126117 containerd[1529]: 2025-09-12 22:12:58.010 [INFO][4255] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:12:58.126117 containerd[1529]: 2025-09-12 22:12:58.022 [INFO][4255] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" host="localhost" Sep 12 22:12:58.126117 containerd[1529]: 2025-09-12 22:12:58.027 [INFO][4255] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:12:58.126117 containerd[1529]: 2025-09-12 22:12:58.033 [INFO][4255] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:12:58.126117 containerd[1529]: 2025-09-12 22:12:58.036 [INFO][4255] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:12:58.126117 containerd[1529]: 2025-09-12 22:12:58.041 [INFO][4255] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:12:58.126117 containerd[1529]: 2025-09-12 22:12:58.041 [INFO][4255] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" host="localhost" Sep 12 22:12:58.126364 containerd[1529]: 2025-09-12 22:12:58.044 [INFO][4255] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69 Sep 12 22:12:58.126364 containerd[1529]: 2025-09-12 22:12:58.052 [INFO][4255] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" host="localhost" Sep 12 22:12:58.126364 containerd[1529]: 2025-09-12 22:12:58.074 [INFO][4255] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" host="localhost" Sep 12 22:12:58.126364 containerd[1529]: 2025-09-12 22:12:58.074 [INFO][4255] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" host="localhost" Sep 12 22:12:58.126364 containerd[1529]: 2025-09-12 22:12:58.074 [INFO][4255] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:12:58.126364 containerd[1529]: 2025-09-12 22:12:58.074 [INFO][4255] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" HandleID="k8s-pod-network.cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" Workload="localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0" Sep 12 22:12:58.126481 containerd[1529]: 2025-09-12 22:12:58.080 [INFO][4237] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" Namespace="kube-system" Pod="coredns-674b8bbfcf-mjzjr" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"59fcba95-a65f-4224-a6f7-a1c834aaad44", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-mjzjr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic68885d62b2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:12:58.126564 containerd[1529]: 2025-09-12 22:12:58.080 [INFO][4237] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" Namespace="kube-system" Pod="coredns-674b8bbfcf-mjzjr" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0" Sep 12 22:12:58.126564 containerd[1529]: 2025-09-12 22:12:58.082 [INFO][4237] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic68885d62b2 ContainerID="cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" Namespace="kube-system" Pod="coredns-674b8bbfcf-mjzjr" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0" Sep 12 22:12:58.126564 containerd[1529]: 2025-09-12 22:12:58.090 [INFO][4237] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" Namespace="kube-system" Pod="coredns-674b8bbfcf-mjzjr" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0" Sep 12 22:12:58.126683 containerd[1529]: 2025-09-12 22:12:58.091 [INFO][4237] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" Namespace="kube-system" Pod="coredns-674b8bbfcf-mjzjr" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"59fcba95-a65f-4224-a6f7-a1c834aaad44", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69", Pod:"coredns-674b8bbfcf-mjzjr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic68885d62b2", MAC:"06:81:5a:65:57:7b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:12:58.126683 containerd[1529]: 2025-09-12 22:12:58.120 [INFO][4237] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" Namespace="kube-system" Pod="coredns-674b8bbfcf-mjzjr" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--mjzjr-eth0" Sep 12 22:12:58.152291 containerd[1529]: time="2025-09-12T22:12:58.151221693Z" level=info msg="connecting to shim cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69" address="unix:///run/containerd/s/e73d7ae00bab9a271eb790cd74010abcb8833b1184392cd0c67a9abe45942ce9" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:12:58.164435 systemd-networkd[1433]: cali6ec42c56102: Link UP Sep 12 22:12:58.165696 systemd-networkd[1433]: cali6ec42c56102: Gained carrier Sep 12 22:12:58.195416 systemd[1]: Started cri-containerd-cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69.scope - libcontainer container cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69. 
Sep 12 22:12:58.205705 systemd-resolved[1351]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:12:58.255510 containerd[1529]: time="2025-09-12T22:12:58.255449470Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-mjzjr,Uid:59fcba95-a65f-4224-a6f7-a1c834aaad44,Namespace:kube-system,Attempt:0,} returns sandbox id \"cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69\"" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:57.962 [INFO][4227] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:57.982 [INFO][4227] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0 coredns-674b8bbfcf- kube-system cd03710e-4ae7-41a3-ab06-c94a974ca060 810 0 2025-09-12 22:12:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-4wr7d eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6ec42c56102 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" Namespace="kube-system" Pod="coredns-674b8bbfcf-4wr7d" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--4wr7d-" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:57.982 [INFO][4227] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" Namespace="kube-system" Pod="coredns-674b8bbfcf-4wr7d" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.010 [INFO][4262] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" HandleID="k8s-pod-network.3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" Workload="localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.010 [INFO][4262] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" HandleID="k8s-pod-network.3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" Workload="localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400051eb40), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-4wr7d", "timestamp":"2025-09-12 22:12:58.010422325 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.010 [INFO][4262] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.074 [INFO][4262] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.075 [INFO][4262] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.123 [INFO][4262] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" host="localhost" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.129 [INFO][4262] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.133 [INFO][4262] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.135 [INFO][4262] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.137 [INFO][4262] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.137 [INFO][4262] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" host="localhost" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.139 [INFO][4262] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.144 [INFO][4262] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" host="localhost" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.150 [INFO][4262] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" host="localhost" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.150 [INFO][4262] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" host="localhost" Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.150 [INFO][4262] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:12:58.265354 containerd[1529]: 2025-09-12 22:12:58.150 [INFO][4262] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" HandleID="k8s-pod-network.3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" Workload="localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0" Sep 12 22:12:58.265867 containerd[1529]: 2025-09-12 22:12:58.156 [INFO][4227] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" Namespace="kube-system" Pod="coredns-674b8bbfcf-4wr7d" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"cd03710e-4ae7-41a3-ab06-c94a974ca060", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-4wr7d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6ec42c56102", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:12:58.265867 containerd[1529]: 2025-09-12 22:12:58.156 [INFO][4227] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" Namespace="kube-system" Pod="coredns-674b8bbfcf-4wr7d" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0" Sep 12 22:12:58.265867 containerd[1529]: 2025-09-12 22:12:58.156 [INFO][4227] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6ec42c56102 ContainerID="3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" Namespace="kube-system" Pod="coredns-674b8bbfcf-4wr7d" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0" Sep 12 22:12:58.265867 containerd[1529]: 2025-09-12 22:12:58.165 [INFO][4227] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" Namespace="kube-system" Pod="coredns-674b8bbfcf-4wr7d" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0" Sep 12 22:12:58.265867 containerd[1529]: 2025-09-12 22:12:58.168 [INFO][4227] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" Namespace="kube-system" Pod="coredns-674b8bbfcf-4wr7d" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"cd03710e-4ae7-41a3-ab06-c94a974ca060", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf", Pod:"coredns-674b8bbfcf-4wr7d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6ec42c56102", MAC:"66:a3:61:a7:57:d1", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:12:58.265867 containerd[1529]: 2025-09-12 22:12:58.258 [INFO][4227] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" Namespace="kube-system" Pod="coredns-674b8bbfcf-4wr7d" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--4wr7d-eth0" Sep 12 22:12:58.271695 containerd[1529]: time="2025-09-12T22:12:58.271654336Z" level=info msg="CreateContainer within sandbox \"cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 22:12:58.284328 containerd[1529]: time="2025-09-12T22:12:58.284293193Z" level=info msg="Container 5e063775be39c99f96fcc0c2d857395e80c8996fc1bcf335b4ad6fc32bf18257: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:58.291945 containerd[1529]: time="2025-09-12T22:12:58.291909820Z" level=info msg="CreateContainer within sandbox \"cbdfbf3b32e9deae9e9ebb77467249be42de8cf5f0a0e01ddd0ec07a300b8c69\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5e063775be39c99f96fcc0c2d857395e80c8996fc1bcf335b4ad6fc32bf18257\"" Sep 12 22:12:58.292578 containerd[1529]: time="2025-09-12T22:12:58.292549029Z" level=info msg="StartContainer for \"5e063775be39c99f96fcc0c2d857395e80c8996fc1bcf335b4ad6fc32bf18257\"" Sep 12 22:12:58.293627 containerd[1529]: time="2025-09-12T22:12:58.293553163Z" level=info msg="connecting to shim 5e063775be39c99f96fcc0c2d857395e80c8996fc1bcf335b4ad6fc32bf18257" address="unix:///run/containerd/s/e73d7ae00bab9a271eb790cd74010abcb8833b1184392cd0c67a9abe45942ce9" protocol=ttrpc version=3 Sep 12 22:12:58.295253 containerd[1529]: time="2025-09-12T22:12:58.295223866Z" level=info 
msg="connecting to shim 3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf" address="unix:///run/containerd/s/e6e37b0dc0b6d2209cf74037167749042d9ca803c65d41f44ec7648904da615b" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:12:58.318395 systemd[1]: Started cri-containerd-5e063775be39c99f96fcc0c2d857395e80c8996fc1bcf335b4ad6fc32bf18257.scope - libcontainer container 5e063775be39c99f96fcc0c2d857395e80c8996fc1bcf335b4ad6fc32bf18257. Sep 12 22:12:58.321301 systemd[1]: Started cri-containerd-3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf.scope - libcontainer container 3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf. Sep 12 22:12:58.332533 systemd-resolved[1351]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:12:58.350947 containerd[1529]: time="2025-09-12T22:12:58.350822603Z" level=info msg="StartContainer for \"5e063775be39c99f96fcc0c2d857395e80c8996fc1bcf335b4ad6fc32bf18257\" returns successfully" Sep 12 22:12:58.358866 containerd[1529]: time="2025-09-12T22:12:58.358548191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4wr7d,Uid:cd03710e-4ae7-41a3-ab06-c94a974ca060,Namespace:kube-system,Attempt:0,} returns sandbox id \"3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf\"" Sep 12 22:12:58.364507 containerd[1529]: time="2025-09-12T22:12:58.364348992Z" level=info msg="CreateContainer within sandbox \"3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 22:12:58.372549 containerd[1529]: time="2025-09-12T22:12:58.371910538Z" level=info msg="Container d054d0e2ba84f036f8c458df94aae6dfc48146a84459760133a3164f80dab537: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:12:58.377546 containerd[1529]: time="2025-09-12T22:12:58.377508896Z" level=info msg="CreateContainer within sandbox 
\"3188a9defb33486b37d7fa6ba30352013722791e86a407808fedccd4dc83b8bf\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d054d0e2ba84f036f8c458df94aae6dfc48146a84459760133a3164f80dab537\"" Sep 12 22:12:58.378143 containerd[1529]: time="2025-09-12T22:12:58.378120185Z" level=info msg="StartContainer for \"d054d0e2ba84f036f8c458df94aae6dfc48146a84459760133a3164f80dab537\"" Sep 12 22:12:58.379240 containerd[1529]: time="2025-09-12T22:12:58.378961236Z" level=info msg="connecting to shim d054d0e2ba84f036f8c458df94aae6dfc48146a84459760133a3164f80dab537" address="unix:///run/containerd/s/e6e37b0dc0b6d2209cf74037167749042d9ca803c65d41f44ec7648904da615b" protocol=ttrpc version=3 Sep 12 22:12:58.402381 systemd[1]: Started cri-containerd-d054d0e2ba84f036f8c458df94aae6dfc48146a84459760133a3164f80dab537.scope - libcontainer container d054d0e2ba84f036f8c458df94aae6dfc48146a84459760133a3164f80dab537. Sep 12 22:12:58.431984 containerd[1529]: time="2025-09-12T22:12:58.431945057Z" level=info msg="StartContainer for \"d054d0e2ba84f036f8c458df94aae6dfc48146a84459760133a3164f80dab537\" returns successfully" Sep 12 22:12:58.886374 containerd[1529]: time="2025-09-12T22:12:58.886322488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86846c449f-frxv5,Uid:730c4fd2-47b1-4635-a97a-a8a4c9177989,Namespace:calico-system,Attempt:0,}" Sep 12 22:12:58.886909 containerd[1529]: time="2025-09-12T22:12:58.886330088Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6845f6bc46-gx8vx,Uid:99a1e64b-8e24-4447-9a05-0d3e0d17c8ef,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:12:58.886909 containerd[1529]: time="2025-09-12T22:12:58.886350328Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6845f6bc46-r2w9z,Uid:7fda3ea1-d06d-446e-a30d-ee95f130f24f,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:12:59.009794 systemd-networkd[1433]: calid7dd63045d4: Link UP Sep 12 22:12:59.012590 
systemd-networkd[1433]: calid7dd63045d4: Gained carrier Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.913 [INFO][4471] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.940 [INFO][4471] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0 calico-apiserver-6845f6bc46- calico-apiserver 99a1e64b-8e24-4447-9a05-0d3e0d17c8ef 809 0 2025-09-12 22:12:32 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6845f6bc46 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6845f6bc46-gx8vx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid7dd63045d4 [] [] }} ContainerID="75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-gx8vx" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-" Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.940 [INFO][4471] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-gx8vx" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0" Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.969 [INFO][4517] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" HandleID="k8s-pod-network.75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" Workload="localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0" Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 
22:12:58.969 [INFO][4517] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" HandleID="k8s-pod-network.75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" Workload="localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c760), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6845f6bc46-gx8vx", "timestamp":"2025-09-12 22:12:58.969732894 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.969 [INFO][4517] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.969 [INFO][4517] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.969 [INFO][4517] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.981 [INFO][4517] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" host="localhost" Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.985 [INFO][4517] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.988 [INFO][4517] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.990 [INFO][4517] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.992 [INFO][4517] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.992 [INFO][4517] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" host="localhost" Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.993 [INFO][4517] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:58.999 [INFO][4517] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" host="localhost" Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:59.004 [INFO][4517] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" host="localhost" Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:59.004 [INFO][4517] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" host="localhost" Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:59.004 [INFO][4517] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:12:59.026702 containerd[1529]: 2025-09-12 22:12:59.004 [INFO][4517] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" HandleID="k8s-pod-network.75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" Workload="localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0" Sep 12 22:12:59.027253 containerd[1529]: 2025-09-12 22:12:59.008 [INFO][4471] cni-plugin/k8s.go 418: Populated endpoint ContainerID="75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-gx8vx" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0", GenerateName:"calico-apiserver-6845f6bc46-", Namespace:"calico-apiserver", SelfLink:"", UID:"99a1e64b-8e24-4447-9a05-0d3e0d17c8ef", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6845f6bc46", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6845f6bc46-gx8vx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid7dd63045d4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:12:59.027253 containerd[1529]: 2025-09-12 22:12:59.008 [INFO][4471] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-gx8vx" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0" Sep 12 22:12:59.027253 containerd[1529]: 2025-09-12 22:12:59.008 [INFO][4471] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid7dd63045d4 ContainerID="75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-gx8vx" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0" Sep 12 22:12:59.027253 containerd[1529]: 2025-09-12 22:12:59.014 [INFO][4471] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-gx8vx" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0" Sep 12 22:12:59.027253 containerd[1529]: 2025-09-12 22:12:59.015 [INFO][4471] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-gx8vx" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0", GenerateName:"calico-apiserver-6845f6bc46-", Namespace:"calico-apiserver", SelfLink:"", UID:"99a1e64b-8e24-4447-9a05-0d3e0d17c8ef", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6845f6bc46", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b", Pod:"calico-apiserver-6845f6bc46-gx8vx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid7dd63045d4", MAC:"b6:19:b9:bd:1a:83", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:12:59.027253 containerd[1529]: 2025-09-12 22:12:59.024 [INFO][4471] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-gx8vx" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--gx8vx-eth0" Sep 12 22:12:59.047368 containerd[1529]: time="2025-09-12T22:12:59.047300802Z" level=info msg="connecting to shim 75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b" address="unix:///run/containerd/s/fa4c5852bf46ce979b85cee44f9fcbe42a355eed848b8d330637f0bfa9a15bd7" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:12:59.068271 kubelet[2681]: I0912 22:12:59.068217 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-4wr7d" podStartSLOduration=37.068201327 podStartE2EDuration="37.068201327s" podCreationTimestamp="2025-09-12 22:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:12:59.068191926 +0000 UTC m=+43.270186112" watchObservedRunningTime="2025-09-12 22:12:59.068201327 +0000 UTC m=+43.270195473" Sep 12 22:12:59.071384 systemd[1]: Started cri-containerd-75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b.scope - libcontainer container 75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b. 
Sep 12 22:12:59.081837 kubelet[2681]: I0912 22:12:59.081780 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-mjzjr" podStartSLOduration=37.081738231 podStartE2EDuration="37.081738231s" podCreationTimestamp="2025-09-12 22:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:12:59.080414933 +0000 UTC m=+43.282409119" watchObservedRunningTime="2025-09-12 22:12:59.081738231 +0000 UTC m=+43.283732457" Sep 12 22:12:59.099539 systemd-resolved[1351]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:12:59.136670 systemd-networkd[1433]: cali3648b811cfa: Link UP Sep 12 22:12:59.138031 systemd-networkd[1433]: cali3648b811cfa: Gained carrier Sep 12 22:12:59.141699 containerd[1529]: time="2025-09-12T22:12:59.141594687Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6845f6bc46-gx8vx,Uid:99a1e64b-8e24-4447-9a05-0d3e0d17c8ef,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b\"" Sep 12 22:12:59.144064 containerd[1529]: time="2025-09-12T22:12:59.144029520Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:58.918 [INFO][4495] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:58.940 [INFO][4495] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0 calico-kube-controllers-86846c449f- calico-system 730c4fd2-47b1-4635-a97a-a8a4c9177989 803 0 2025-09-12 22:12:37 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:86846c449f 
projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-86846c449f-frxv5 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali3648b811cfa [] [] }} ContainerID="f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" Namespace="calico-system" Pod="calico-kube-controllers-86846c449f-frxv5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86846c449f--frxv5-" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:58.940 [INFO][4495] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" Namespace="calico-system" Pod="calico-kube-controllers-86846c449f-frxv5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:58.971 [INFO][4523] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" HandleID="k8s-pod-network.f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" Workload="localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:58.971 [INFO][4523] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" HandleID="k8s-pod-network.f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" Workload="localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000136360), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-86846c449f-frxv5", "timestamp":"2025-09-12 22:12:58.971226674 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:58.971 [INFO][4523] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.004 [INFO][4523] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.004 [INFO][4523] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.083 [INFO][4523] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" host="localhost" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.090 [INFO][4523] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.102 [INFO][4523] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.105 [INFO][4523] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.109 [INFO][4523] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.111 [INFO][4523] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" host="localhost" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.114 [INFO][4523] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b Sep 12 22:12:59.154906 
containerd[1529]: 2025-09-12 22:12:59.118 [INFO][4523] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" host="localhost" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.124 [INFO][4523] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" host="localhost" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.124 [INFO][4523] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" host="localhost" Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.124 [INFO][4523] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:12:59.154906 containerd[1529]: 2025-09-12 22:12:59.124 [INFO][4523] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" HandleID="k8s-pod-network.f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" Workload="localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0" Sep 12 22:12:59.155560 containerd[1529]: 2025-09-12 22:12:59.128 [INFO][4495] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" Namespace="calico-system" Pod="calico-kube-controllers-86846c449f-frxv5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0", GenerateName:"calico-kube-controllers-86846c449f-", Namespace:"calico-system", SelfLink:"", 
UID:"730c4fd2-47b1-4635-a97a-a8a4c9177989", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86846c449f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-86846c449f-frxv5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3648b811cfa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:12:59.155560 containerd[1529]: 2025-09-12 22:12:59.129 [INFO][4495] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" Namespace="calico-system" Pod="calico-kube-controllers-86846c449f-frxv5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0" Sep 12 22:12:59.155560 containerd[1529]: 2025-09-12 22:12:59.129 [INFO][4495] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3648b811cfa ContainerID="f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" Namespace="calico-system" Pod="calico-kube-controllers-86846c449f-frxv5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0" Sep 
12 22:12:59.155560 containerd[1529]: 2025-09-12 22:12:59.139 [INFO][4495] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" Namespace="calico-system" Pod="calico-kube-controllers-86846c449f-frxv5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0" Sep 12 22:12:59.155560 containerd[1529]: 2025-09-12 22:12:59.139 [INFO][4495] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" Namespace="calico-system" Pod="calico-kube-controllers-86846c449f-frxv5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0", GenerateName:"calico-kube-controllers-86846c449f-", Namespace:"calico-system", SelfLink:"", UID:"730c4fd2-47b1-4635-a97a-a8a4c9177989", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"86846c449f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b", Pod:"calico-kube-controllers-86846c449f-frxv5", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3648b811cfa", MAC:"56:e1:c0:5d:71:0d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:12:59.155560 containerd[1529]: 2025-09-12 22:12:59.152 [INFO][4495] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" Namespace="calico-system" Pod="calico-kube-controllers-86846c449f-frxv5" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--86846c449f--frxv5-eth0" Sep 12 22:12:59.176252 containerd[1529]: time="2025-09-12T22:12:59.174983102Z" level=info msg="connecting to shim f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b" address="unix:///run/containerd/s/a3ca60fc88f702c50e3a963ed44781316d37f3193420756dad9c6e9e9dd9bd44" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:12:59.197374 systemd[1]: Started cri-containerd-f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b.scope - libcontainer container f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b. 
Sep 12 22:12:59.212450 systemd-resolved[1351]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:12:59.218652 systemd-networkd[1433]: cali6ed6633d4c8: Link UP Sep 12 22:12:59.219964 systemd-networkd[1433]: cali6ed6633d4c8: Gained carrier Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:58.918 [INFO][4482] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:58.947 [INFO][4482] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0 calico-apiserver-6845f6bc46- calico-apiserver 7fda3ea1-d06d-446e-a30d-ee95f130f24f 814 0 2025-09-12 22:12:32 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6845f6bc46 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6845f6bc46-r2w9z eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6ed6633d4c8 [] [] }} ContainerID="e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-r2w9z" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-" Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:58.947 [INFO][4482] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-r2w9z" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0" Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:58.973 [INFO][4529] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" HandleID="k8s-pod-network.e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" Workload="localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0" Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:58.973 [INFO][4529] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" HandleID="k8s-pod-network.e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" Workload="localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000323490), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6845f6bc46-r2w9z", "timestamp":"2025-09-12 22:12:58.973623068 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:58.973 [INFO][4529] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.124 [INFO][4529] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.125 [INFO][4529] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.181 [INFO][4529] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" host="localhost" Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.191 [INFO][4529] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.197 [INFO][4529] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.200 [INFO][4529] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.202 [INFO][4529] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.202 [INFO][4529] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" host="localhost" Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.203 [INFO][4529] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899 Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.207 [INFO][4529] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" host="localhost" Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.213 [INFO][4529] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" host="localhost" Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.214 [INFO][4529] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" host="localhost" Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.214 [INFO][4529] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:12:59.233827 containerd[1529]: 2025-09-12 22:12:59.214 [INFO][4529] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" HandleID="k8s-pod-network.e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" Workload="localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0" Sep 12 22:12:59.234361 containerd[1529]: 2025-09-12 22:12:59.216 [INFO][4482] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-r2w9z" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0", GenerateName:"calico-apiserver-6845f6bc46-", Namespace:"calico-apiserver", SelfLink:"", UID:"7fda3ea1-d06d-446e-a30d-ee95f130f24f", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6845f6bc46", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6845f6bc46-r2w9z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6ed6633d4c8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:12:59.234361 containerd[1529]: 2025-09-12 22:12:59.217 [INFO][4482] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-r2w9z" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0" Sep 12 22:12:59.234361 containerd[1529]: 2025-09-12 22:12:59.217 [INFO][4482] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6ed6633d4c8 ContainerID="e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-r2w9z" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0" Sep 12 22:12:59.234361 containerd[1529]: 2025-09-12 22:12:59.218 [INFO][4482] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-r2w9z" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0" Sep 12 22:12:59.234361 containerd[1529]: 2025-09-12 22:12:59.219 [INFO][4482] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-r2w9z" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0", GenerateName:"calico-apiserver-6845f6bc46-", Namespace:"calico-apiserver", SelfLink:"", UID:"7fda3ea1-d06d-446e-a30d-ee95f130f24f", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6845f6bc46", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899", Pod:"calico-apiserver-6845f6bc46-r2w9z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6ed6633d4c8", MAC:"b2:9e:c0:77:c9:08", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:12:59.234361 containerd[1529]: 2025-09-12 22:12:59.231 [INFO][4482] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" Namespace="calico-apiserver" Pod="calico-apiserver-6845f6bc46-r2w9z" WorkloadEndpoint="localhost-k8s-calico--apiserver--6845f6bc46--r2w9z-eth0" Sep 12 22:12:59.248328 containerd[1529]: time="2025-09-12T22:12:59.248276142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-86846c449f-frxv5,Uid:730c4fd2-47b1-4635-a97a-a8a4c9177989,Namespace:calico-system,Attempt:0,} returns sandbox id \"f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b\"" Sep 12 22:12:59.252023 systemd-networkd[1433]: calic68885d62b2: Gained IPv6LL Sep 12 22:12:59.256323 containerd[1529]: time="2025-09-12T22:12:59.256231890Z" level=info msg="connecting to shim e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899" address="unix:///run/containerd/s/7d4bd9e22969ceaad0c7941f78453bcfa5c4617d743edcd2ddfc781b7f3c6816" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:12:59.281519 systemd[1]: Started cri-containerd-e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899.scope - libcontainer container e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899. 
Sep 12 22:12:59.295424 systemd-resolved[1351]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:12:59.313383 containerd[1529]: time="2025-09-12T22:12:59.313346189Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6845f6bc46-r2w9z,Uid:7fda3ea1-d06d-446e-a30d-ee95f130f24f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899\"" Sep 12 22:12:59.886019 containerd[1529]: time="2025-09-12T22:12:59.885965196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gpk44,Uid:f1eff131-ed0a-4062-a13b-42b0f931fef5,Namespace:calico-system,Attempt:0,}" Sep 12 22:12:59.955311 systemd-networkd[1433]: cali6ec42c56102: Gained IPv6LL Sep 12 22:13:00.008475 systemd-networkd[1433]: cali1143db1439e: Link UP Sep 12 22:13:00.009294 systemd-networkd[1433]: cali1143db1439e: Gained carrier Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.919 [INFO][4732] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.933 [INFO][4732] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--gpk44-eth0 csi-node-driver- calico-system f1eff131-ed0a-4062-a13b-42b0f931fef5 691 0 2025-09-12 22:12:37 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-gpk44 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali1143db1439e [] [] }} ContainerID="f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" Namespace="calico-system" Pod="csi-node-driver-gpk44" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--gpk44-" Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.933 [INFO][4732] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" Namespace="calico-system" Pod="csi-node-driver-gpk44" WorkloadEndpoint="localhost-k8s-csi--node--driver--gpk44-eth0" Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.957 [INFO][4749] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" HandleID="k8s-pod-network.f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" Workload="localhost-k8s-csi--node--driver--gpk44-eth0" Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.957 [INFO][4749] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" HandleID="k8s-pod-network.f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" Workload="localhost-k8s-csi--node--driver--gpk44-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000584490), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-gpk44", "timestamp":"2025-09-12 22:12:59.957579093 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.957 [INFO][4749] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.957 [INFO][4749] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.957 [INFO][4749] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.967 [INFO][4749] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" host="localhost" Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.975 [INFO][4749] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.980 [INFO][4749] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.982 [INFO][4749] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.985 [INFO][4749] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.985 [INFO][4749] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" host="localhost" Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.987 [INFO][4749] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:12:59.991 [INFO][4749] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" host="localhost" Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:13:00.002 [INFO][4749] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" host="localhost" Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:13:00.002 [INFO][4749] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" host="localhost" Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:13:00.003 [INFO][4749] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:13:00.028221 containerd[1529]: 2025-09-12 22:13:00.003 [INFO][4749] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" HandleID="k8s-pod-network.f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" Workload="localhost-k8s-csi--node--driver--gpk44-eth0" Sep 12 22:13:00.028958 containerd[1529]: 2025-09-12 22:13:00.005 [INFO][4732] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" Namespace="calico-system" Pod="csi-node-driver-gpk44" WorkloadEndpoint="localhost-k8s-csi--node--driver--gpk44-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--gpk44-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f1eff131-ed0a-4062-a13b-42b0f931fef5", ResourceVersion:"691", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-gpk44", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1143db1439e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:13:00.028958 containerd[1529]: 2025-09-12 22:13:00.005 [INFO][4732] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" Namespace="calico-system" Pod="csi-node-driver-gpk44" WorkloadEndpoint="localhost-k8s-csi--node--driver--gpk44-eth0" Sep 12 22:13:00.028958 containerd[1529]: 2025-09-12 22:13:00.005 [INFO][4732] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1143db1439e ContainerID="f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" Namespace="calico-system" Pod="csi-node-driver-gpk44" WorkloadEndpoint="localhost-k8s-csi--node--driver--gpk44-eth0" Sep 12 22:13:00.028958 containerd[1529]: 2025-09-12 22:13:00.010 [INFO][4732] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" Namespace="calico-system" Pod="csi-node-driver-gpk44" WorkloadEndpoint="localhost-k8s-csi--node--driver--gpk44-eth0" Sep 12 22:13:00.028958 containerd[1529]: 2025-09-12 22:13:00.011 [INFO][4732] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" 
Namespace="calico-system" Pod="csi-node-driver-gpk44" WorkloadEndpoint="localhost-k8s-csi--node--driver--gpk44-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--gpk44-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f1eff131-ed0a-4062-a13b-42b0f931fef5", ResourceVersion:"691", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a", Pod:"csi-node-driver-gpk44", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1143db1439e", MAC:"96:86:2f:73:b9:06", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:13:00.028958 containerd[1529]: 2025-09-12 22:13:00.024 [INFO][4732] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" Namespace="calico-system" Pod="csi-node-driver-gpk44" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--gpk44-eth0" Sep 12 22:13:00.063802 containerd[1529]: time="2025-09-12T22:13:00.063400475Z" level=info msg="connecting to shim f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a" address="unix:///run/containerd/s/7331fb108033ab6d82faa5cc632cd67912553d1637db45c7a266f29a34e6bde8" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:13:00.095400 systemd[1]: Started cri-containerd-f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a.scope - libcontainer container f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a. Sep 12 22:13:00.106364 systemd-resolved[1351]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:13:00.162480 containerd[1529]: time="2025-09-12T22:13:00.162387713Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gpk44,Uid:f1eff131-ed0a-4062-a13b-42b0f931fef5,Namespace:calico-system,Attempt:0,} returns sandbox id \"f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a\"" Sep 12 22:13:00.595403 systemd-networkd[1433]: calid7dd63045d4: Gained IPv6LL Sep 12 22:13:00.598317 containerd[1529]: time="2025-09-12T22:13:00.598279356Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:00.599053 containerd[1529]: time="2025-09-12T22:13:00.599011806Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 12 22:13:00.599680 containerd[1529]: time="2025-09-12T22:13:00.599653934Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:00.601785 containerd[1529]: time="2025-09-12T22:13:00.601740722Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:00.602405 containerd[1529]: time="2025-09-12T22:13:00.602368850Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 1.458114207s" Sep 12 22:13:00.602405 containerd[1529]: time="2025-09-12T22:13:00.602402251Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 22:13:00.603499 containerd[1529]: time="2025-09-12T22:13:00.603261862Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 22:13:00.606987 containerd[1529]: time="2025-09-12T22:13:00.606950671Z" level=info msg="CreateContainer within sandbox \"75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 22:13:00.619163 containerd[1529]: time="2025-09-12T22:13:00.618876310Z" level=info msg="Container d04b6430a34e5ef7ee7b3d80944a50f4e2e34d8240c26a98443799bd4bfe8bd1: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:13:00.630912 containerd[1529]: time="2025-09-12T22:13:00.630880390Z" level=info msg="CreateContainer within sandbox \"75c2c777d369e9a2568932a74eccb163a290be6cf6983909134b03a9e0ae2e1b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d04b6430a34e5ef7ee7b3d80944a50f4e2e34d8240c26a98443799bd4bfe8bd1\"" Sep 12 22:13:00.631560 containerd[1529]: time="2025-09-12T22:13:00.631507758Z" level=info msg="StartContainer for 
\"d04b6430a34e5ef7ee7b3d80944a50f4e2e34d8240c26a98443799bd4bfe8bd1\"" Sep 12 22:13:00.632693 containerd[1529]: time="2025-09-12T22:13:00.632666734Z" level=info msg="connecting to shim d04b6430a34e5ef7ee7b3d80944a50f4e2e34d8240c26a98443799bd4bfe8bd1" address="unix:///run/containerd/s/fa4c5852bf46ce979b85cee44f9fcbe42a355eed848b8d330637f0bfa9a15bd7" protocol=ttrpc version=3 Sep 12 22:13:00.663377 systemd[1]: Started cri-containerd-d04b6430a34e5ef7ee7b3d80944a50f4e2e34d8240c26a98443799bd4bfe8bd1.scope - libcontainer container d04b6430a34e5ef7ee7b3d80944a50f4e2e34d8240c26a98443799bd4bfe8bd1. Sep 12 22:13:00.727368 containerd[1529]: time="2025-09-12T22:13:00.727289233Z" level=info msg="StartContainer for \"d04b6430a34e5ef7ee7b3d80944a50f4e2e34d8240c26a98443799bd4bfe8bd1\" returns successfully" Sep 12 22:13:00.851400 systemd-networkd[1433]: cali6ed6633d4c8: Gained IPv6LL Sep 12 22:13:00.886108 containerd[1529]: time="2025-09-12T22:13:00.886042427Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-7n7zg,Uid:8b901d26-47da-4593-92fe-7d9e945b1e0d,Namespace:calico-system,Attempt:0,}" Sep 12 22:13:01.000114 systemd-networkd[1433]: califb38dd89aa2: Link UP Sep 12 22:13:01.000671 systemd-networkd[1433]: califb38dd89aa2: Gained carrier Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.911 [INFO][4879] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.929 [INFO][4879] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--7n7zg-eth0 goldmane-54d579b49d- calico-system 8b901d26-47da-4593-92fe-7d9e945b1e0d 811 0 2025-09-12 22:12:37 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost 
goldmane-54d579b49d-7n7zg eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] califb38dd89aa2 [] [] }} ContainerID="90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" Namespace="calico-system" Pod="goldmane-54d579b49d-7n7zg" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7n7zg-" Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.929 [INFO][4879] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" Namespace="calico-system" Pod="goldmane-54d579b49d-7n7zg" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7n7zg-eth0" Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.954 [INFO][4894] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" HandleID="k8s-pod-network.90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" Workload="localhost-k8s-goldmane--54d579b49d--7n7zg-eth0" Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.955 [INFO][4894] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" HandleID="k8s-pod-network.90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" Workload="localhost-k8s-goldmane--54d579b49d--7n7zg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001364d0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-7n7zg", "timestamp":"2025-09-12 22:13:00.954947104 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.955 [INFO][4894] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.955 [INFO][4894] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.955 [INFO][4894] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.967 [INFO][4894] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" host="localhost" Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.971 [INFO][4894] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.977 [INFO][4894] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.980 [INFO][4894] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.982 [INFO][4894] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.982 [INFO][4894] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" host="localhost" Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.983 [INFO][4894] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295 Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.988 [INFO][4894] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" host="localhost" Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.994 [INFO][4894] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" host="localhost" Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.994 [INFO][4894] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" host="localhost" Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.994 [INFO][4894] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:13:01.013933 containerd[1529]: 2025-09-12 22:13:00.994 [INFO][4894] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" HandleID="k8s-pod-network.90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" Workload="localhost-k8s-goldmane--54d579b49d--7n7zg-eth0" Sep 12 22:13:01.014484 containerd[1529]: 2025-09-12 22:13:00.998 [INFO][4879] cni-plugin/k8s.go 418: Populated endpoint ContainerID="90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" Namespace="calico-system" Pod="goldmane-54d579b49d-7n7zg" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7n7zg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--7n7zg-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"8b901d26-47da-4593-92fe-7d9e945b1e0d", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-7n7zg", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califb38dd89aa2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:13:01.014484 containerd[1529]: 2025-09-12 22:13:00.998 [INFO][4879] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" Namespace="calico-system" Pod="goldmane-54d579b49d-7n7zg" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7n7zg-eth0" Sep 12 22:13:01.014484 containerd[1529]: 2025-09-12 22:13:00.998 [INFO][4879] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califb38dd89aa2 ContainerID="90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" Namespace="calico-system" Pod="goldmane-54d579b49d-7n7zg" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7n7zg-eth0" Sep 12 22:13:01.014484 containerd[1529]: 2025-09-12 22:13:01.000 [INFO][4879] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" Namespace="calico-system" Pod="goldmane-54d579b49d-7n7zg" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7n7zg-eth0" Sep 12 22:13:01.014484 containerd[1529]: 2025-09-12 22:13:01.001 [INFO][4879] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" Namespace="calico-system" Pod="goldmane-54d579b49d-7n7zg" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7n7zg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--7n7zg-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"8b901d26-47da-4593-92fe-7d9e945b1e0d", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 12, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295", Pod:"goldmane-54d579b49d-7n7zg", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califb38dd89aa2", MAC:"52:95:9f:1a:1d:d8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:13:01.014484 containerd[1529]: 2025-09-12 22:13:01.010 [INFO][4879] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" Namespace="calico-system" Pod="goldmane-54d579b49d-7n7zg" 
WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--7n7zg-eth0" Sep 12 22:13:01.035691 containerd[1529]: time="2025-09-12T22:13:01.035632728Z" level=info msg="connecting to shim 90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295" address="unix:///run/containerd/s/7917d4283efa24ff57992c3893d7daaa6409ef76fd4a6476f846f591dfdf0e29" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:13:01.044355 systemd-networkd[1433]: cali3648b811cfa: Gained IPv6LL Sep 12 22:13:01.073347 systemd[1]: Started cri-containerd-90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295.scope - libcontainer container 90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295. Sep 12 22:13:01.088947 systemd-resolved[1351]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 22:13:01.103283 kubelet[2681]: I0912 22:13:01.102612 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6845f6bc46-gx8vx" podStartSLOduration=27.643122894 podStartE2EDuration="29.102594719s" podCreationTimestamp="2025-09-12 22:12:32 +0000 UTC" firstStartedPulling="2025-09-12 22:12:59.143639675 +0000 UTC m=+43.345633821" lastFinishedPulling="2025-09-12 22:13:00.60311146 +0000 UTC m=+44.805105646" observedRunningTime="2025-09-12 22:13:01.100659974 +0000 UTC m=+45.302654160" watchObservedRunningTime="2025-09-12 22:13:01.102594719 +0000 UTC m=+45.304588865" Sep 12 22:13:01.116887 containerd[1529]: time="2025-09-12T22:13:01.116851945Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-7n7zg,Uid:8b901d26-47da-4593-92fe-7d9e945b1e0d,Namespace:calico-system,Attempt:0,} returns sandbox id \"90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295\"" Sep 12 22:13:01.619417 systemd-networkd[1433]: cali1143db1439e: Gained IPv6LL Sep 12 22:13:01.651992 systemd[1]: Started sshd@8-10.0.0.68:22-10.0.0.1:50094.service - OpenSSH per-connection server daemon 
(10.0.0.1:50094). Sep 12 22:13:01.755859 sshd[4958]: Accepted publickey for core from 10.0.0.1 port 50094 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:13:01.758831 sshd-session[4958]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:13:01.770297 systemd-logind[1501]: New session 9 of user core. Sep 12 22:13:01.776065 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 12 22:13:02.045784 sshd[4985]: Connection closed by 10.0.0.1 port 50094 Sep 12 22:13:02.046102 sshd-session[4958]: pam_unix(sshd:session): session closed for user core Sep 12 22:13:02.050696 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 22:13:02.053580 systemd-logind[1501]: Session 9 logged out. Waiting for processes to exit. Sep 12 22:13:02.053769 systemd[1]: sshd@8-10.0.0.68:22-10.0.0.1:50094.service: Deactivated successfully. Sep 12 22:13:02.057809 systemd-logind[1501]: Removed session 9. Sep 12 22:13:02.085420 kubelet[2681]: I0912 22:13:02.085390 2681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:13:02.195369 systemd-networkd[1433]: califb38dd89aa2: Gained IPv6LL Sep 12 22:13:02.458315 containerd[1529]: time="2025-09-12T22:13:02.458031627Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:02.458853 containerd[1529]: time="2025-09-12T22:13:02.458804916Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 12 22:13:02.459856 containerd[1529]: time="2025-09-12T22:13:02.459824129Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:02.462032 containerd[1529]: time="2025-09-12T22:13:02.461988397Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:02.463058 containerd[1529]: time="2025-09-12T22:13:02.463016090Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 1.859723267s" Sep 12 22:13:02.463058 containerd[1529]: time="2025-09-12T22:13:02.463048450Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 12 22:13:02.464119 containerd[1529]: time="2025-09-12T22:13:02.464039543Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 22:13:02.478718 containerd[1529]: time="2025-09-12T22:13:02.478662849Z" level=info msg="CreateContainer within sandbox \"f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 22:13:02.487224 containerd[1529]: time="2025-09-12T22:13:02.486294026Z" level=info msg="Container aaa1c4a4997d6071940aceb5fa2b29333d0668188b980c67260ea66e711b36a9: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:13:02.493528 containerd[1529]: time="2025-09-12T22:13:02.493481118Z" level=info msg="CreateContainer within sandbox \"f3d6a3c0f370e98972adfeefcb51108bf7d362f3be98b2a8d5f1565a440fde1b\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"aaa1c4a4997d6071940aceb5fa2b29333d0668188b980c67260ea66e711b36a9\"" Sep 12 22:13:02.493995 containerd[1529]: time="2025-09-12T22:13:02.493968524Z" level=info 
msg="StartContainer for \"aaa1c4a4997d6071940aceb5fa2b29333d0668188b980c67260ea66e711b36a9\"" Sep 12 22:13:02.495082 containerd[1529]: time="2025-09-12T22:13:02.495055058Z" level=info msg="connecting to shim aaa1c4a4997d6071940aceb5fa2b29333d0668188b980c67260ea66e711b36a9" address="unix:///run/containerd/s/a3ca60fc88f702c50e3a963ed44781316d37f3193420756dad9c6e9e9dd9bd44" protocol=ttrpc version=3 Sep 12 22:13:02.520376 systemd[1]: Started cri-containerd-aaa1c4a4997d6071940aceb5fa2b29333d0668188b980c67260ea66e711b36a9.scope - libcontainer container aaa1c4a4997d6071940aceb5fa2b29333d0668188b980c67260ea66e711b36a9. Sep 12 22:13:02.559057 containerd[1529]: time="2025-09-12T22:13:02.559017832Z" level=info msg="StartContainer for \"aaa1c4a4997d6071940aceb5fa2b29333d0668188b980c67260ea66e711b36a9\" returns successfully" Sep 12 22:13:02.702711 containerd[1529]: time="2025-09-12T22:13:02.702668461Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:02.703412 containerd[1529]: time="2025-09-12T22:13:02.703375110Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 22:13:02.705593 containerd[1529]: time="2025-09-12T22:13:02.705554977Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 241.479434ms" Sep 12 22:13:02.705765 containerd[1529]: time="2025-09-12T22:13:02.705692499Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 22:13:02.706900 containerd[1529]: 
time="2025-09-12T22:13:02.706805313Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 22:13:02.712100 containerd[1529]: time="2025-09-12T22:13:02.712007019Z" level=info msg="CreateContainer within sandbox \"e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 22:13:02.718724 containerd[1529]: time="2025-09-12T22:13:02.718690185Z" level=info msg="Container 70c79366762df74f1ed24e506f63a82da699923c3610dbd9c819911eecdba5c2: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:13:02.726324 containerd[1529]: time="2025-09-12T22:13:02.726259481Z" level=info msg="CreateContainer within sandbox \"e1136421f7c662b025008b730836a60ee3045a6d08c0182c840004e390493899\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"70c79366762df74f1ed24e506f63a82da699923c3610dbd9c819911eecdba5c2\"" Sep 12 22:13:02.727063 containerd[1529]: time="2025-09-12T22:13:02.727027131Z" level=info msg="StartContainer for \"70c79366762df74f1ed24e506f63a82da699923c3610dbd9c819911eecdba5c2\"" Sep 12 22:13:02.729502 containerd[1529]: time="2025-09-12T22:13:02.729268559Z" level=info msg="connecting to shim 70c79366762df74f1ed24e506f63a82da699923c3610dbd9c819911eecdba5c2" address="unix:///run/containerd/s/7d4bd9e22969ceaad0c7941f78453bcfa5c4617d743edcd2ddfc781b7f3c6816" protocol=ttrpc version=3 Sep 12 22:13:02.752503 systemd[1]: Started cri-containerd-70c79366762df74f1ed24e506f63a82da699923c3610dbd9c819911eecdba5c2.scope - libcontainer container 70c79366762df74f1ed24e506f63a82da699923c3610dbd9c819911eecdba5c2. 
Sep 12 22:13:02.807993 containerd[1529]: time="2025-09-12T22:13:02.807884480Z" level=info msg="StartContainer for \"70c79366762df74f1ed24e506f63a82da699923c3610dbd9c819911eecdba5c2\" returns successfully" Sep 12 22:13:03.113704 kubelet[2681]: I0912 22:13:03.113590 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6845f6bc46-r2w9z" podStartSLOduration=27.721247232 podStartE2EDuration="31.113574101s" podCreationTimestamp="2025-09-12 22:12:32 +0000 UTC" firstStartedPulling="2025-09-12 22:12:59.314364803 +0000 UTC m=+43.516358949" lastFinishedPulling="2025-09-12 22:13:02.706691632 +0000 UTC m=+46.908685818" observedRunningTime="2025-09-12 22:13:03.11344766 +0000 UTC m=+47.315441846" watchObservedRunningTime="2025-09-12 22:13:03.113574101 +0000 UTC m=+47.315568327" Sep 12 22:13:03.149703 kubelet[2681]: I0912 22:13:03.149630 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-86846c449f-frxv5" podStartSLOduration=22.961079999 podStartE2EDuration="26.14960967s" podCreationTimestamp="2025-09-12 22:12:37 +0000 UTC" firstStartedPulling="2025-09-12 22:12:59.27531775 +0000 UTC m=+43.477311936" lastFinishedPulling="2025-09-12 22:13:02.463847421 +0000 UTC m=+46.665841607" observedRunningTime="2025-09-12 22:13:03.148874301 +0000 UTC m=+47.350868487" watchObservedRunningTime="2025-09-12 22:13:03.14960967 +0000 UTC m=+47.351603816" Sep 12 22:13:03.176793 containerd[1529]: time="2025-09-12T22:13:03.176754089Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aaa1c4a4997d6071940aceb5fa2b29333d0668188b980c67260ea66e711b36a9\" id:\"03224924b4a53e61e9ff7d9ea53e75f2537bcb3b8e23803a872616b9c5dad38b\" pid:5122 exited_at:{seconds:1757715183 nanos:175065388}" Sep 12 22:13:03.685142 containerd[1529]: time="2025-09-12T22:13:03.684918383Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:03.686029 containerd[1529]: time="2025-09-12T22:13:03.685867275Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 12 22:13:03.687009 containerd[1529]: time="2025-09-12T22:13:03.686873767Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:03.690026 containerd[1529]: time="2025-09-12T22:13:03.689999566Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:03.692165 containerd[1529]: time="2025-09-12T22:13:03.692127873Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 985.293359ms" Sep 12 22:13:03.692165 containerd[1529]: time="2025-09-12T22:13:03.692159513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 12 22:13:03.693615 containerd[1529]: time="2025-09-12T22:13:03.693415489Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 22:13:03.695714 containerd[1529]: time="2025-09-12T22:13:03.695685197Z" level=info msg="CreateContainer within sandbox \"f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 22:13:03.707504 containerd[1529]: time="2025-09-12T22:13:03.707463624Z" level=info msg="Container 
339c6ed5f4285aa61537cd8825a2ec4cf6f3fd58d14893a787f86e55f6d6e3b8: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:13:03.722077 containerd[1529]: time="2025-09-12T22:13:03.721961605Z" level=info msg="CreateContainer within sandbox \"f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"339c6ed5f4285aa61537cd8825a2ec4cf6f3fd58d14893a787f86e55f6d6e3b8\"" Sep 12 22:13:03.722834 containerd[1529]: time="2025-09-12T22:13:03.722807575Z" level=info msg="StartContainer for \"339c6ed5f4285aa61537cd8825a2ec4cf6f3fd58d14893a787f86e55f6d6e3b8\"" Sep 12 22:13:03.730131 containerd[1529]: time="2025-09-12T22:13:03.730019985Z" level=info msg="connecting to shim 339c6ed5f4285aa61537cd8825a2ec4cf6f3fd58d14893a787f86e55f6d6e3b8" address="unix:///run/containerd/s/7331fb108033ab6d82faa5cc632cd67912553d1637db45c7a266f29a34e6bde8" protocol=ttrpc version=3 Sep 12 22:13:03.759340 systemd[1]: Started cri-containerd-339c6ed5f4285aa61537cd8825a2ec4cf6f3fd58d14893a787f86e55f6d6e3b8.scope - libcontainer container 339c6ed5f4285aa61537cd8825a2ec4cf6f3fd58d14893a787f86e55f6d6e3b8. 
Sep 12 22:13:03.800934 containerd[1529]: time="2025-09-12T22:13:03.800896948Z" level=info msg="StartContainer for \"339c6ed5f4285aa61537cd8825a2ec4cf6f3fd58d14893a787f86e55f6d6e3b8\" returns successfully" Sep 12 22:13:04.620748 kubelet[2681]: I0912 22:13:04.620644 2681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:13:04.729369 containerd[1529]: time="2025-09-12T22:13:04.729026776Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0f13fc7a039eb4d1090a51b61645d499c412d9168afb746a7146ca8a9091f66e\" id:\"e79d7b949e2c8d841c5331269d1a2510e048502928c3d8dcf24899207df6d802\" pid:5207 exit_status:1 exited_at:{seconds:1757715184 nanos:728652572}" Sep 12 22:13:04.855508 containerd[1529]: time="2025-09-12T22:13:04.855457441Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0f13fc7a039eb4d1090a51b61645d499c412d9168afb746a7146ca8a9091f66e\" id:\"3e514868fa00c0d62ec451614af0f905cfa2819940205d3de65baa58ed0c320b\" pid:5233 exit_status:1 exited_at:{seconds:1757715184 nanos:853450136}" Sep 12 22:13:05.301484 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2810418706.mount: Deactivated successfully. 
Sep 12 22:13:05.748852 containerd[1529]: time="2025-09-12T22:13:05.748789020Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:05.749462 containerd[1529]: time="2025-09-12T22:13:05.749418148Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 12 22:13:05.750828 containerd[1529]: time="2025-09-12T22:13:05.750775924Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:05.753059 containerd[1529]: time="2025-09-12T22:13:05.753016511Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:05.753753 containerd[1529]: time="2025-09-12T22:13:05.753713279Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 2.06026899s" Sep 12 22:13:05.753790 containerd[1529]: time="2025-09-12T22:13:05.753750880Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 12 22:13:05.755026 containerd[1529]: time="2025-09-12T22:13:05.754932814Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 22:13:05.758487 containerd[1529]: time="2025-09-12T22:13:05.758440616Z" level=info msg="CreateContainer within sandbox \"90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 22:13:05.770563 containerd[1529]: time="2025-09-12T22:13:05.770447440Z" level=info msg="Container 19be49dc151ded07d252c9baa56d254572abbac9717df3ef362d13f4a5aeee43: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:13:05.786596 containerd[1529]: time="2025-09-12T22:13:05.786527112Z" level=info msg="CreateContainer within sandbox \"90399bb39cb062aa8bf6408157269e36d11541395a859db646b3867511f1f295\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"19be49dc151ded07d252c9baa56d254572abbac9717df3ef362d13f4a5aeee43\"" Sep 12 22:13:05.787580 containerd[1529]: time="2025-09-12T22:13:05.787523324Z" level=info msg="StartContainer for \"19be49dc151ded07d252c9baa56d254572abbac9717df3ef362d13f4a5aeee43\"" Sep 12 22:13:05.789042 containerd[1529]: time="2025-09-12T22:13:05.788989502Z" level=info msg="connecting to shim 19be49dc151ded07d252c9baa56d254572abbac9717df3ef362d13f4a5aeee43" address="unix:///run/containerd/s/7917d4283efa24ff57992c3893d7daaa6409ef76fd4a6476f846f591dfdf0e29" protocol=ttrpc version=3 Sep 12 22:13:05.822418 systemd[1]: Started cri-containerd-19be49dc151ded07d252c9baa56d254572abbac9717df3ef362d13f4a5aeee43.scope - libcontainer container 19be49dc151ded07d252c9baa56d254572abbac9717df3ef362d13f4a5aeee43.
Sep 12 22:13:05.868470 containerd[1529]: time="2025-09-12T22:13:05.868419334Z" level=info msg="StartContainer for \"19be49dc151ded07d252c9baa56d254572abbac9717df3ef362d13f4a5aeee43\" returns successfully" Sep 12 22:13:06.239406 containerd[1529]: time="2025-09-12T22:13:06.239308407Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19be49dc151ded07d252c9baa56d254572abbac9717df3ef362d13f4a5aeee43\" id:\"4674dd9b7b6867683bfecb69c22adb9be921b2ac320b83688ff5604f17deb8e8\" pid:5347 exit_status:1 exited_at:{seconds:1757715186 nanos:238752640}" Sep 12 22:13:06.824209 containerd[1529]: time="2025-09-12T22:13:06.823937245Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:06.825514 containerd[1529]: time="2025-09-12T22:13:06.825430783Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 12 22:13:06.826393 containerd[1529]: time="2025-09-12T22:13:06.826358954Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:06.829207 containerd[1529]: time="2025-09-12T22:13:06.829016305Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:13:06.830357 containerd[1529]: time="2025-09-12T22:13:06.830314080Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.075341946s" Sep 12 22:13:06.830357 containerd[1529]: time="2025-09-12T22:13:06.830362521Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 12 22:13:06.835794 containerd[1529]: time="2025-09-12T22:13:06.835740704Z" level=info msg="CreateContainer within sandbox \"f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 22:13:06.844422 containerd[1529]: time="2025-09-12T22:13:06.844367845Z" level=info msg="Container 6e489bf22fa282c8beaa467feab6835452bb05be487d100b485b79fd8c961c37: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:13:06.856929 containerd[1529]: time="2025-09-12T22:13:06.856857312Z" level=info msg="CreateContainer within sandbox \"f78c343a20110669453e129765b2128407506f25b81697ac1a82766a9c4c1c9a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"6e489bf22fa282c8beaa467feab6835452bb05be487d100b485b79fd8c961c37\"" Sep 12 22:13:06.857499 containerd[1529]: time="2025-09-12T22:13:06.857469920Z" level=info msg="StartContainer for \"6e489bf22fa282c8beaa467feab6835452bb05be487d100b485b79fd8c961c37\"" Sep 12 22:13:06.859808 containerd[1529]: time="2025-09-12T22:13:06.859772947Z" level=info msg="connecting to shim 6e489bf22fa282c8beaa467feab6835452bb05be487d100b485b79fd8c961c37" address="unix:///run/containerd/s/7331fb108033ab6d82faa5cc632cd67912553d1637db45c7a266f29a34e6bde8" protocol=ttrpc version=3 Sep 12 22:13:06.885428 systemd[1]: Started cri-containerd-6e489bf22fa282c8beaa467feab6835452bb05be487d100b485b79fd8c961c37.scope - libcontainer container 6e489bf22fa282c8beaa467feab6835452bb05be487d100b485b79fd8c961c37.
Sep 12 22:13:06.907255 kubelet[2681]: I0912 22:13:06.907217 2681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:13:06.953174 kubelet[2681]: I0912 22:13:06.953048 2681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-7n7zg" podStartSLOduration=25.31691588 podStartE2EDuration="29.952142353s" podCreationTimestamp="2025-09-12 22:12:37 +0000 UTC" firstStartedPulling="2025-09-12 22:13:01.119281056 +0000 UTC m=+45.321275202" lastFinishedPulling="2025-09-12 22:13:05.754507489 +0000 UTC m=+49.956501675" observedRunningTime="2025-09-12 22:13:06.143875324 +0000 UTC m=+50.345869510" watchObservedRunningTime="2025-09-12 22:13:06.952142353 +0000 UTC m=+51.154136539" Sep 12 22:13:06.958574 containerd[1529]: time="2025-09-12T22:13:06.958533909Z" level=info msg="StartContainer for \"6e489bf22fa282c8beaa467feab6835452bb05be487d100b485b79fd8c961c37\" returns successfully" Sep 12 22:13:07.058111 systemd[1]: Started sshd@9-10.0.0.68:22-10.0.0.1:50096.service - OpenSSH per-connection server daemon (10.0.0.1:50096). Sep 12 22:13:07.142737 sshd[5419]: Accepted publickey for core from 10.0.0.1 port 50096 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:13:07.146074 sshd-session[5419]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:13:07.159552 systemd-logind[1501]: New session 10 of user core. Sep 12 22:13:07.164420 systemd[1]: Started session-10.scope - Session 10 of User core. 
Sep 12 22:13:07.236824 containerd[1529]: time="2025-09-12T22:13:07.236768534Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19be49dc151ded07d252c9baa56d254572abbac9717df3ef362d13f4a5aeee43\" id:\"4a833abd67769f9ca5e8a59eeae304b3ffb9d3e46667c4f6eedad3ca03d2c46a\" pid:5434 exit_status:1 exited_at:{seconds:1757715187 nanos:236243448}" Sep 12 22:13:07.435250 sshd[5441]: Connection closed by 10.0.0.1 port 50096 Sep 12 22:13:07.436027 sshd-session[5419]: pam_unix(sshd:session): session closed for user core Sep 12 22:13:07.450530 systemd[1]: sshd@9-10.0.0.68:22-10.0.0.1:50096.service: Deactivated successfully. Sep 12 22:13:07.454753 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 22:13:07.457162 systemd-logind[1501]: Session 10 logged out. Waiting for processes to exit. Sep 12 22:13:07.460110 systemd[1]: Started sshd@10-10.0.0.68:22-10.0.0.1:50098.service - OpenSSH per-connection server daemon (10.0.0.1:50098). Sep 12 22:13:07.462562 systemd-logind[1501]: Removed session 10. Sep 12 22:13:07.519074 sshd[5498]: Accepted publickey for core from 10.0.0.1 port 50098 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:13:07.520795 sshd-session[5498]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:13:07.529475 systemd-logind[1501]: New session 11 of user core. Sep 12 22:13:07.538396 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 22:13:07.771941 sshd[5509]: Connection closed by 10.0.0.1 port 50098 Sep 12 22:13:07.773903 sshd-session[5498]: pam_unix(sshd:session): session closed for user core Sep 12 22:13:07.788215 systemd[1]: Started sshd@11-10.0.0.68:22-10.0.0.1:50106.service - OpenSSH per-connection server daemon (10.0.0.1:50106). Sep 12 22:13:07.789081 systemd[1]: sshd@10-10.0.0.68:22-10.0.0.1:50098.service: Deactivated successfully. Sep 12 22:13:07.794864 systemd[1]: session-11.scope: Deactivated successfully. 
Sep 12 22:13:07.798256 systemd-logind[1501]: Session 11 logged out. Waiting for processes to exit. Sep 12 22:13:07.803915 systemd-logind[1501]: Removed session 11. Sep 12 22:13:07.852090 sshd[5533]: Accepted publickey for core from 10.0.0.1 port 50106 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:13:07.853781 sshd-session[5533]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:13:07.862474 systemd-logind[1501]: New session 12 of user core. Sep 12 22:13:07.870462 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 22:13:07.936495 systemd-networkd[1433]: vxlan.calico: Link UP Sep 12 22:13:07.936501 systemd-networkd[1433]: vxlan.calico: Gained carrier Sep 12 22:13:07.993991 kubelet[2681]: I0912 22:13:07.993942 2681 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 22:13:08.002795 kubelet[2681]: I0912 22:13:08.002658 2681 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 22:13:08.100528 sshd[5553]: Connection closed by 10.0.0.1 port 50106 Sep 12 22:13:08.100819 sshd-session[5533]: pam_unix(sshd:session): session closed for user core Sep 12 22:13:08.106570 systemd[1]: sshd@11-10.0.0.68:22-10.0.0.1:50106.service: Deactivated successfully. Sep 12 22:13:08.110004 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 22:13:08.110925 systemd-logind[1501]: Session 12 logged out. Waiting for processes to exit. Sep 12 22:13:08.112494 systemd-logind[1501]: Removed session 12. Sep 12 22:13:09.555370 systemd-networkd[1433]: vxlan.calico: Gained IPv6LL Sep 12 22:13:13.116326 systemd[1]: Started sshd@12-10.0.0.68:22-10.0.0.1:58330.service - OpenSSH per-connection server daemon (10.0.0.1:58330). 
Sep 12 22:13:13.173441 sshd[5640]: Accepted publickey for core from 10.0.0.1 port 58330 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:13:13.174862 sshd-session[5640]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:13:13.179234 systemd-logind[1501]: New session 13 of user core. Sep 12 22:13:13.191337 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 22:13:13.322234 sshd[5643]: Connection closed by 10.0.0.1 port 58330 Sep 12 22:13:13.323019 sshd-session[5640]: pam_unix(sshd:session): session closed for user core Sep 12 22:13:13.335592 systemd[1]: sshd@12-10.0.0.68:22-10.0.0.1:58330.service: Deactivated successfully. Sep 12 22:13:13.338099 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 22:13:13.340083 systemd-logind[1501]: Session 13 logged out. Waiting for processes to exit. Sep 12 22:13:13.345293 systemd[1]: Started sshd@13-10.0.0.68:22-10.0.0.1:58344.service - OpenSSH per-connection server daemon (10.0.0.1:58344). Sep 12 22:13:13.347403 systemd-logind[1501]: Removed session 13. Sep 12 22:13:13.414704 sshd[5664]: Accepted publickey for core from 10.0.0.1 port 58344 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:13:13.416108 sshd-session[5664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:13:13.421765 systemd-logind[1501]: New session 14 of user core. Sep 12 22:13:13.444496 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 22:13:13.657020 sshd[5667]: Connection closed by 10.0.0.1 port 58344 Sep 12 22:13:13.657660 sshd-session[5664]: pam_unix(sshd:session): session closed for user core Sep 12 22:13:13.676144 systemd[1]: sshd@13-10.0.0.68:22-10.0.0.1:58344.service: Deactivated successfully. Sep 12 22:13:13.679053 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 22:13:13.679895 systemd-logind[1501]: Session 14 logged out. Waiting for processes to exit. 
Sep 12 22:13:13.681462 systemd-logind[1501]: Removed session 14. Sep 12 22:13:13.683160 systemd[1]: Started sshd@14-10.0.0.68:22-10.0.0.1:58356.service - OpenSSH per-connection server daemon (10.0.0.1:58356). Sep 12 22:13:13.745971 sshd[5679]: Accepted publickey for core from 10.0.0.1 port 58356 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:13:13.747483 sshd-session[5679]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:13:13.751608 systemd-logind[1501]: New session 15 of user core. Sep 12 22:13:13.765404 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 12 22:13:14.428061 sshd[5682]: Connection closed by 10.0.0.1 port 58356 Sep 12 22:13:14.430284 sshd-session[5679]: pam_unix(sshd:session): session closed for user core Sep 12 22:13:14.441586 systemd[1]: sshd@14-10.0.0.68:22-10.0.0.1:58356.service: Deactivated successfully. Sep 12 22:13:14.444231 systemd[1]: session-15.scope: Deactivated successfully. Sep 12 22:13:14.446784 systemd-logind[1501]: Session 15 logged out. Waiting for processes to exit. Sep 12 22:13:14.451895 systemd[1]: Started sshd@15-10.0.0.68:22-10.0.0.1:58368.service - OpenSSH per-connection server daemon (10.0.0.1:58368). Sep 12 22:13:14.455471 systemd-logind[1501]: Removed session 15. Sep 12 22:13:14.524017 sshd[5701]: Accepted publickey for core from 10.0.0.1 port 58368 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:13:14.525744 sshd-session[5701]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:13:14.529811 systemd-logind[1501]: New session 16 of user core. Sep 12 22:13:14.537368 systemd[1]: Started session-16.scope - Session 16 of User core. 
Sep 12 22:13:14.850715 sshd[5704]: Connection closed by 10.0.0.1 port 58368 Sep 12 22:13:14.851379 sshd-session[5701]: pam_unix(sshd:session): session closed for user core Sep 12 22:13:14.865916 systemd[1]: sshd@15-10.0.0.68:22-10.0.0.1:58368.service: Deactivated successfully. Sep 12 22:13:14.868793 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 22:13:14.870900 systemd-logind[1501]: Session 16 logged out. Waiting for processes to exit. Sep 12 22:13:14.874909 systemd[1]: Started sshd@16-10.0.0.68:22-10.0.0.1:58370.service - OpenSSH per-connection server daemon (10.0.0.1:58370). Sep 12 22:13:14.876255 systemd-logind[1501]: Removed session 16. Sep 12 22:13:14.937586 sshd[5715]: Accepted publickey for core from 10.0.0.1 port 58370 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:13:14.939272 sshd-session[5715]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:13:14.943754 systemd-logind[1501]: New session 17 of user core. Sep 12 22:13:14.951399 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 12 22:13:15.086706 sshd[5718]: Connection closed by 10.0.0.1 port 58370 Sep 12 22:13:15.087277 sshd-session[5715]: pam_unix(sshd:session): session closed for user core Sep 12 22:13:15.091349 systemd[1]: sshd@16-10.0.0.68:22-10.0.0.1:58370.service: Deactivated successfully. Sep 12 22:13:15.094125 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 22:13:15.095306 systemd-logind[1501]: Session 17 logged out. Waiting for processes to exit. Sep 12 22:13:15.096810 systemd-logind[1501]: Removed session 17. Sep 12 22:13:20.105551 systemd[1]: Started sshd@17-10.0.0.68:22-10.0.0.1:49218.service - OpenSSH per-connection server daemon (10.0.0.1:49218). 
Sep 12 22:13:20.154316 sshd[5742]: Accepted publickey for core from 10.0.0.1 port 49218 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:13:20.156649 sshd-session[5742]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:13:20.168261 systemd-logind[1501]: New session 18 of user core. Sep 12 22:13:20.180415 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 12 22:13:20.337000 sshd[5745]: Connection closed by 10.0.0.1 port 49218 Sep 12 22:13:20.336828 sshd-session[5742]: pam_unix(sshd:session): session closed for user core Sep 12 22:13:20.341506 systemd[1]: sshd@17-10.0.0.68:22-10.0.0.1:49218.service: Deactivated successfully. Sep 12 22:13:20.344107 systemd[1]: session-18.scope: Deactivated successfully. Sep 12 22:13:20.345127 systemd-logind[1501]: Session 18 logged out. Waiting for processes to exit. Sep 12 22:13:20.346868 systemd-logind[1501]: Removed session 18. Sep 12 22:13:25.349454 systemd[1]: Started sshd@18-10.0.0.68:22-10.0.0.1:49234.service - OpenSSH per-connection server daemon (10.0.0.1:49234). Sep 12 22:13:25.409933 sshd[5762]: Accepted publickey for core from 10.0.0.1 port 49234 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:13:25.412066 sshd-session[5762]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:13:25.418254 systemd-logind[1501]: New session 19 of user core. Sep 12 22:13:25.432359 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 22:13:25.552837 sshd[5765]: Connection closed by 10.0.0.1 port 49234 Sep 12 22:13:25.553415 sshd-session[5762]: pam_unix(sshd:session): session closed for user core Sep 12 22:13:25.557420 systemd-logind[1501]: Session 19 logged out. Waiting for processes to exit. Sep 12 22:13:25.557711 systemd[1]: sshd@18-10.0.0.68:22-10.0.0.1:49234.service: Deactivated successfully. Sep 12 22:13:25.559758 systemd[1]: session-19.scope: Deactivated successfully. 
Sep 12 22:13:25.562507 systemd-logind[1501]: Removed session 19. Sep 12 22:13:30.583666 systemd[1]: Started sshd@19-10.0.0.68:22-10.0.0.1:43780.service - OpenSSH per-connection server daemon (10.0.0.1:43780). Sep 12 22:13:30.658346 sshd[5789]: Accepted publickey for core from 10.0.0.1 port 43780 ssh2: RSA SHA256:Yqy+ciIRp9tS6RxmRMX9+tv4H/mrc+u7L29C7Pz/5UI Sep 12 22:13:30.660616 sshd-session[5789]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:13:30.664456 systemd-logind[1501]: New session 20 of user core. Sep 12 22:13:30.671337 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 12 22:13:30.841340 sshd[5792]: Connection closed by 10.0.0.1 port 43780 Sep 12 22:13:30.841913 sshd-session[5789]: pam_unix(sshd:session): session closed for user core Sep 12 22:13:30.845262 systemd[1]: sshd@19-10.0.0.68:22-10.0.0.1:43780.service: Deactivated successfully. Sep 12 22:13:30.848253 systemd[1]: session-20.scope: Deactivated successfully. Sep 12 22:13:30.849110 systemd-logind[1501]: Session 20 logged out. Waiting for processes to exit. Sep 12 22:13:30.850282 systemd-logind[1501]: Removed session 20.