Sep 9 23:47:18.766359 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 9 23:47:18.766383 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 9 22:10:22 -00 2025
Sep 9 23:47:18.766393 kernel: KASLR enabled
Sep 9 23:47:18.766399 kernel: efi: EFI v2.7 by EDK II
Sep 9 23:47:18.766405 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb228018 ACPI 2.0=0xdb9b8018 RNG=0xdb9b8a18 MEMRESERVE=0xdb221f18
Sep 9 23:47:18.766410 kernel: random: crng init done
Sep 9 23:47:18.766417 kernel: Kernel is locked down from EFI Secure Boot; see man kernel_lockdown.7
Sep 9 23:47:18.766423 kernel: secureboot: Secure boot enabled
Sep 9 23:47:18.766430 kernel: ACPI: Early table checksum verification disabled
Sep 9 23:47:18.766437 kernel: ACPI: RSDP 0x00000000DB9B8018 000024 (v02 BOCHS )
Sep 9 23:47:18.766443 kernel: ACPI: XSDT 0x00000000DB9B8F18 000064 (v01 BOCHS BXPC 00000001 01000013)
Sep 9 23:47:18.766448 kernel: ACPI: FACP 0x00000000DB9B8B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:47:18.766454 kernel: ACPI: DSDT 0x00000000DB904018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:47:18.766460 kernel: ACPI: APIC 0x00000000DB9B8C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:47:18.766467 kernel: ACPI: PPTT 0x00000000DB9B8098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:47:18.766475 kernel: ACPI: GTDT 0x00000000DB9B8818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:47:18.766481 kernel: ACPI: MCFG 0x00000000DB9B8A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:47:18.766487 kernel: ACPI: SPCR 0x00000000DB9B8918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:47:18.766493 kernel: ACPI: DBG2 0x00000000DB9B8998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:47:18.766499 kernel: ACPI: IORT 0x00000000DB9B8198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:47:18.766505 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600
Sep 9 23:47:18.766511 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 9 23:47:18.766517 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 23:47:18.766523 kernel: NODE_DATA(0) allocated [mem 0xdc737a00-0xdc73efff]
Sep 9 23:47:18.766529 kernel: Zone ranges:
Sep 9 23:47:18.766536 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 23:47:18.766542 kernel: DMA32 empty
Sep 9 23:47:18.766556 kernel: Normal empty
Sep 9 23:47:18.766570 kernel: Device empty
Sep 9 23:47:18.766576 kernel: Movable zone start for each node
Sep 9 23:47:18.766582 kernel: Early memory node ranges
Sep 9 23:47:18.766591 kernel: node 0: [mem 0x0000000040000000-0x00000000dbb4ffff]
Sep 9 23:47:18.766598 kernel: node 0: [mem 0x00000000dbb50000-0x00000000dbe7ffff]
Sep 9 23:47:18.766603 kernel: node 0: [mem 0x00000000dbe80000-0x00000000dbe9ffff]
Sep 9 23:47:18.766609 kernel: node 0: [mem 0x00000000dbea0000-0x00000000dbedffff]
Sep 9 23:47:18.766615 kernel: node 0: [mem 0x00000000dbee0000-0x00000000dbf1ffff]
Sep 9 23:47:18.766621 kernel: node 0: [mem 0x00000000dbf20000-0x00000000dbf6ffff]
Sep 9 23:47:18.766630 kernel: node 0: [mem 0x00000000dbf70000-0x00000000dcbfffff]
Sep 9 23:47:18.766635 kernel: node 0: [mem 0x00000000dcc00000-0x00000000dcfdffff]
Sep 9 23:47:18.766642 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff]
Sep 9 23:47:18.766650 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 23:47:18.766656 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges
Sep 9 23:47:18.766663 kernel: cma: Reserved 16 MiB at 0x00000000d7a00000 on node -1
Sep 9 23:47:18.766669 kernel: psci: probing for conduit method from ACPI.
Sep 9 23:47:18.766677 kernel: psci: PSCIv1.1 detected in firmware.
Sep 9 23:47:18.766683 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 9 23:47:18.766689 kernel: psci: Trusted OS migration not required
Sep 9 23:47:18.766696 kernel: psci: SMC Calling Convention v1.1
Sep 9 23:47:18.766702 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 9 23:47:18.766709 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 9 23:47:18.766715 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 9 23:47:18.766722 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Sep 9 23:47:18.766728 kernel: Detected PIPT I-cache on CPU0
Sep 9 23:47:18.766735 kernel: CPU features: detected: GIC system register CPU interface
Sep 9 23:47:18.766742 kernel: CPU features: detected: Spectre-v4
Sep 9 23:47:18.766748 kernel: CPU features: detected: Spectre-BHB
Sep 9 23:47:18.766755 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 9 23:47:18.766761 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 9 23:47:18.766767 kernel: CPU features: detected: ARM erratum 1418040
Sep 9 23:47:18.766774 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 9 23:47:18.766780 kernel: alternatives: applying boot alternatives
Sep 9 23:47:18.766787 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=fc7b279c2d918629032c01551b74c66c198cf923a976f9b3bc0d959e7c2302db
Sep 9 23:47:18.766794 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 23:47:18.766800 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 9 23:47:18.766808 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 23:47:18.766814 kernel: Fallback order for Node 0: 0
Sep 9 23:47:18.766821 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072
Sep 9 23:47:18.766827 kernel: Policy zone: DMA
Sep 9 23:47:18.766955 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 23:47:18.766964 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB
Sep 9 23:47:18.766970 kernel: software IO TLB: area num 4.
Sep 9 23:47:18.766977 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB
Sep 9 23:47:18.766984 kernel: software IO TLB: mapped [mem 0x00000000db504000-0x00000000db904000] (4MB)
Sep 9 23:47:18.766990 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 9 23:47:18.766996 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 23:47:18.767003 kernel: rcu: RCU event tracing is enabled.
Sep 9 23:47:18.767013 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 9 23:47:18.767020 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 23:47:18.767027 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 23:47:18.767033 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 23:47:18.767040 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 9 23:47:18.767046 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 23:47:18.767053 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 23:47:18.767060 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 9 23:47:18.767066 kernel: GICv3: 256 SPIs implemented
Sep 9 23:47:18.767072 kernel: GICv3: 0 Extended SPIs implemented
Sep 9 23:47:18.767079 kernel: Root IRQ handler: gic_handle_irq
Sep 9 23:47:18.767087 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 9 23:47:18.767093 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Sep 9 23:47:18.767099 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 9 23:47:18.767106 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 9 23:47:18.767113 kernel: ITS@0x0000000008080000: allocated 8192 Devices @40110000 (indirect, esz 8, psz 64K, shr 1)
Sep 9 23:47:18.767119 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @40120000 (flat, esz 8, psz 64K, shr 1)
Sep 9 23:47:18.767126 kernel: GICv3: using LPI property table @0x0000000040130000
Sep 9 23:47:18.767133 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040140000
Sep 9 23:47:18.767139 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 23:47:18.767146 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 23:47:18.767152 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 9 23:47:18.767159 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 9 23:47:18.767174 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 9 23:47:18.767188 kernel: arm-pv: using stolen time PV
Sep 9 23:47:18.767196 kernel: Console: colour dummy device 80x25
Sep 9 23:47:18.767202 kernel: ACPI: Core revision 20240827
Sep 9 23:47:18.767209 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 9 23:47:18.767216 kernel: pid_max: default: 32768 minimum: 301
Sep 9 23:47:18.767222 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 23:47:18.767229 kernel: landlock: Up and running.
Sep 9 23:47:18.767236 kernel: SELinux: Initializing.
Sep 9 23:47:18.767243 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 23:47:18.767251 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 23:47:18.767258 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 23:47:18.767265 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 23:47:18.767272 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 23:47:18.767279 kernel: Remapping and enabling EFI services.
Sep 9 23:47:18.767285 kernel: smp: Bringing up secondary CPUs ...
Sep 9 23:47:18.767292 kernel: Detected PIPT I-cache on CPU1
Sep 9 23:47:18.767299 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 9 23:47:18.767307 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040150000
Sep 9 23:47:18.767318 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 23:47:18.767325 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 9 23:47:18.767333 kernel: Detected PIPT I-cache on CPU2
Sep 9 23:47:18.767340 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Sep 9 23:47:18.767348 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040160000
Sep 9 23:47:18.767355 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 23:47:18.767361 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Sep 9 23:47:18.767369 kernel: Detected PIPT I-cache on CPU3
Sep 9 23:47:18.767377 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Sep 9 23:47:18.767384 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040170000
Sep 9 23:47:18.767391 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 23:47:18.767398 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Sep 9 23:47:18.767405 kernel: smp: Brought up 1 node, 4 CPUs
Sep 9 23:47:18.767411 kernel: SMP: Total of 4 processors activated.
Sep 9 23:47:18.767418 kernel: CPU: All CPU(s) started at EL1
Sep 9 23:47:18.767425 kernel: CPU features: detected: 32-bit EL0 Support
Sep 9 23:47:18.767432 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 9 23:47:18.767441 kernel: CPU features: detected: Common not Private translations
Sep 9 23:47:18.767448 kernel: CPU features: detected: CRC32 instructions
Sep 9 23:47:18.767455 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 9 23:47:18.767462 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 9 23:47:18.767468 kernel: CPU features: detected: LSE atomic instructions
Sep 9 23:47:18.767475 kernel: CPU features: detected: Privileged Access Never
Sep 9 23:47:18.767482 kernel: CPU features: detected: RAS Extension Support
Sep 9 23:47:18.767489 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 9 23:47:18.767496 kernel: alternatives: applying system-wide alternatives
Sep 9 23:47:18.767504 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Sep 9 23:47:18.767512 kernel: Memory: 2422436K/2572288K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38912K init, 1038K bss, 127516K reserved, 16384K cma-reserved)
Sep 9 23:47:18.767519 kernel: devtmpfs: initialized
Sep 9 23:47:18.767527 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 23:47:18.767534 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 9 23:47:18.767542 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 9 23:47:18.767557 kernel: 0 pages in range for non-PLT usage
Sep 9 23:47:18.767564 kernel: 508576 pages in range for PLT usage
Sep 9 23:47:18.767572 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 23:47:18.767582 kernel: SMBIOS 3.0.0 present.
Sep 9 23:47:18.767590 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022
Sep 9 23:47:18.767597 kernel: DMI: Memory slots populated: 1/1
Sep 9 23:47:18.767605 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 23:47:18.767613 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 9 23:47:18.767620 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 9 23:47:18.767627 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 9 23:47:18.767634 kernel: audit: initializing netlink subsys (disabled)
Sep 9 23:47:18.767642 kernel: audit: type=2000 audit(0.023:1): state=initialized audit_enabled=0 res=1
Sep 9 23:47:18.767651 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 23:47:18.767658 kernel: cpuidle: using governor menu
Sep 9 23:47:18.767665 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 9 23:47:18.767672 kernel: ASID allocator initialised with 32768 entries
Sep 9 23:47:18.767679 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 23:47:18.767686 kernel: Serial: AMBA PL011 UART driver
Sep 9 23:47:18.767694 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 23:47:18.767701 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 23:47:18.767709 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 9 23:47:18.767718 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 9 23:47:18.767725 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 23:47:18.767732 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 23:47:18.767739 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 9 23:47:18.767746 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 9 23:47:18.767754 kernel: ACPI: Added _OSI(Module Device)
Sep 9 23:47:18.767761 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 23:47:18.767767 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 23:47:18.767775 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 23:47:18.767783 kernel: ACPI: Interpreter enabled
Sep 9 23:47:18.767790 kernel: ACPI: Using GIC for interrupt routing
Sep 9 23:47:18.767797 kernel: ACPI: MCFG table detected, 1 entries
Sep 9 23:47:18.767804 kernel: ACPI: CPU0 has been hot-added
Sep 9 23:47:18.767811 kernel: ACPI: CPU1 has been hot-added
Sep 9 23:47:18.767818 kernel: ACPI: CPU2 has been hot-added
Sep 9 23:47:18.767824 kernel: ACPI: CPU3 has been hot-added
Sep 9 23:47:18.767841 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 9 23:47:18.767848 kernel: printk: legacy console [ttyAMA0] enabled
Sep 9 23:47:18.767858 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 9 23:47:18.768002 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 23:47:18.768072 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 9 23:47:18.768135 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 9 23:47:18.768197 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 9 23:47:18.768258 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 9 23:47:18.768268 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 9 23:47:18.768277 kernel: PCI host bridge to bus 0000:00
Sep 9 23:47:18.768353 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 9 23:47:18.768410 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 9 23:47:18.768466 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 9 23:47:18.768521 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 9 23:47:18.768621 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Sep 9 23:47:18.768697 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 9 23:47:18.768764 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f]
Sep 9 23:47:18.768829 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]
Sep 9 23:47:18.768926 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 9 23:47:18.768995 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Sep 9 23:47:18.769059 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned
Sep 9 23:47:18.769123 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned
Sep 9 23:47:18.769183 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 9 23:47:18.769241 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 9 23:47:18.769298 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 9 23:47:18.769307 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 9 23:47:18.769315 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 9 23:47:18.769322 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 9 23:47:18.769330 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 9 23:47:18.769337 kernel: iommu: Default domain type: Translated
Sep 9 23:47:18.769346 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 9 23:47:18.769354 kernel: efivars: Registered efivars operations
Sep 9 23:47:18.769361 kernel: vgaarb: loaded
Sep 9 23:47:18.769369 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 9 23:47:18.769376 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 23:47:18.769383 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 23:47:18.769390 kernel: pnp: PnP ACPI init
Sep 9 23:47:18.769462 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Sep 9 23:47:18.769472 kernel: pnp: PnP ACPI: found 1 devices
Sep 9 23:47:18.769481 kernel: NET: Registered PF_INET protocol family
Sep 9 23:47:18.769488 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 9 23:47:18.769496 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 9 23:47:18.769503 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 23:47:18.769510 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 9 23:47:18.769517 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 9 23:47:18.769525 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 9 23:47:18.769532 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 23:47:18.769539 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 23:47:18.769556 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 23:47:18.769563 kernel: PCI: CLS 0 bytes, default 64
Sep 9 23:47:18.769570 kernel: kvm [1]: HYP mode not available
Sep 9 23:47:18.769578 kernel: Initialise system trusted keyrings
Sep 9 23:47:18.769585 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 9 23:47:18.769592 kernel: Key type asymmetric registered
Sep 9 23:47:18.769600 kernel: Asymmetric key parser 'x509' registered
Sep 9 23:47:18.769607 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 9 23:47:18.769614 kernel: io scheduler mq-deadline registered
Sep 9 23:47:18.769623 kernel: io scheduler kyber registered
Sep 9 23:47:18.769630 kernel: io scheduler bfq registered
Sep 9 23:47:18.769637 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Sep 9 23:47:18.769644 kernel: ACPI: button: Power Button [PWRB]
Sep 9 23:47:18.769651 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Sep 9 23:47:18.769714 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007)
Sep 9 23:47:18.769724 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 9 23:47:18.769731 kernel: thunder_xcv, ver 1.0
Sep 9 23:47:18.769738 kernel: thunder_bgx, ver 1.0
Sep 9 23:47:18.769747 kernel: nicpf, ver 1.0
Sep 9 23:47:18.769753 kernel: nicvf, ver 1.0
Sep 9 23:47:18.769821 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 9 23:47:18.769907 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-09T23:47:18 UTC (1757461638)
Sep 9 23:47:18.769918 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 9 23:47:18.769925 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 9 23:47:18.769932 kernel: watchdog: NMI not fully supported
Sep 9 23:47:18.769939 kernel: watchdog: Hard watchdog permanently disabled
Sep 9 23:47:18.769949 kernel: NET: Registered PF_INET6 protocol family
Sep 9 23:47:18.769956 kernel: Segment Routing with IPv6
Sep 9 23:47:18.769963 kernel: In-situ OAM (IOAM) with IPv6
Sep 9 23:47:18.769970 kernel: NET: Registered PF_PACKET protocol family
Sep 9 23:47:18.769977 kernel: Key type dns_resolver registered
Sep 9 23:47:18.769984 kernel: registered taskstats version 1
Sep 9 23:47:18.769990 kernel: Loading compiled-in X.509 certificates
Sep 9 23:47:18.769998 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 61217a1897415238555e2058a4e44c51622b0f87'
Sep 9 23:47:18.770005 kernel: Demotion targets for Node 0: null
Sep 9 23:47:18.770013 kernel: Key type .fscrypt registered
Sep 9 23:47:18.770020 kernel: Key type fscrypt-provisioning registered
Sep 9 23:47:18.770027 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 9 23:47:18.770034 kernel: ima: Allocated hash algorithm: sha1
Sep 9 23:47:18.770041 kernel: ima: No architecture policies found
Sep 9 23:47:18.770048 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 9 23:47:18.770054 kernel: clk: Disabling unused clocks
Sep 9 23:47:18.770061 kernel: PM: genpd: Disabling unused power domains
Sep 9 23:47:18.770068 kernel: Warning: unable to open an initial console.
Sep 9 23:47:18.770078 kernel: Freeing unused kernel memory: 38912K
Sep 9 23:47:18.770085 kernel: Run /init as init process
Sep 9 23:47:18.770092 kernel: with arguments:
Sep 9 23:47:18.770099 kernel: /init
Sep 9 23:47:18.770105 kernel: with environment:
Sep 9 23:47:18.770112 kernel: HOME=/
Sep 9 23:47:18.770119 kernel: TERM=linux
Sep 9 23:47:18.770126 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 9 23:47:18.770134 systemd[1]: Successfully made /usr/ read-only.
Sep 9 23:47:18.770145 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 23:47:18.770153 systemd[1]: Detected virtualization kvm.
Sep 9 23:47:18.770161 systemd[1]: Detected architecture arm64.
Sep 9 23:47:18.770168 systemd[1]: Running in initrd.
Sep 9 23:47:18.770175 systemd[1]: No hostname configured, using default hostname.
Sep 9 23:47:18.770183 systemd[1]: Hostname set to .
Sep 9 23:47:18.770190 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 23:47:18.770199 systemd[1]: Queued start job for default target initrd.target.
Sep 9 23:47:18.770207 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 23:47:18.770214 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 23:47:18.770222 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 23:47:18.770230 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 23:47:18.770238 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 23:47:18.770246 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 23:47:18.770256 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 23:47:18.770264 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 23:47:18.770272 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 23:47:18.770279 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 23:47:18.770287 systemd[1]: Reached target paths.target - Path Units.
Sep 9 23:47:18.770294 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 23:47:18.770302 systemd[1]: Reached target swap.target - Swaps.
Sep 9 23:47:18.770310 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 23:47:18.770318 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 23:47:18.770326 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 23:47:18.770334 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 23:47:18.770341 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 23:47:18.770349 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 23:47:18.770357 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 23:47:18.770364 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 23:47:18.770372 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 23:47:18.770379 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 23:47:18.770389 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 23:47:18.770396 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 9 23:47:18.770404 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 9 23:47:18.770412 systemd[1]: Starting systemd-fsck-usr.service...
Sep 9 23:47:18.770419 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 23:47:18.770427 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 23:47:18.770435 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:47:18.770442 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 9 23:47:18.770452 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 23:47:18.770460 systemd[1]: Finished systemd-fsck-usr.service.
Sep 9 23:47:18.770467 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 23:47:18.770490 systemd-journald[244]: Collecting audit messages is disabled.
Sep 9 23:47:18.770511 systemd-journald[244]: Journal started
Sep 9 23:47:18.770529 systemd-journald[244]: Runtime Journal (/run/log/journal/272aeff007f1484e953ebe3007b76861) is 6M, max 48.5M, 42.4M free.
Sep 9 23:47:18.780238 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 9 23:47:18.780265 kernel: Bridge firewalling registered
Sep 9 23:47:18.762486 systemd-modules-load[245]: Inserted module 'overlay'
Sep 9 23:47:18.778216 systemd-modules-load[245]: Inserted module 'br_netfilter'
Sep 9 23:47:18.782523 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:47:18.785153 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 23:47:18.785505 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 23:47:18.786559 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 23:47:18.790596 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 9 23:47:18.792222 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 23:47:18.793604 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 23:47:18.795252 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 23:47:18.807433 systemd-tmpfiles[268]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 9 23:47:18.808721 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 23:47:18.811111 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 23:47:18.814044 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 23:47:18.816924 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 23:47:18.818985 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 23:47:18.821967 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 9 23:47:18.842581 dracut-cmdline[289]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=fc7b279c2d918629032c01551b74c66c198cf923a976f9b3bc0d959e7c2302db
Sep 9 23:47:18.858048 systemd-resolved[286]: Positive Trust Anchors:
Sep 9 23:47:18.858067 systemd-resolved[286]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 23:47:18.858097 systemd-resolved[286]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 23:47:18.862796 systemd-resolved[286]: Defaulting to hostname 'linux'.
Sep 9 23:47:18.863744 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 23:47:18.866978 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 23:47:18.920854 kernel: SCSI subsystem initialized
Sep 9 23:47:18.924847 kernel: Loading iSCSI transport class v2.0-870.
Sep 9 23:47:18.941861 kernel: iscsi: registered transport (tcp)
Sep 9 23:47:18.954862 kernel: iscsi: registered transport (qla4xxx)
Sep 9 23:47:18.954915 kernel: QLogic iSCSI HBA Driver
Sep 9 23:47:18.971877 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 23:47:18.986277 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 23:47:18.988210 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 23:47:19.034878 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 9 23:47:19.037044 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 9 23:47:19.096859 kernel: raid6: neonx8 gen() 15698 MB/s
Sep 9 23:47:19.113843 kernel: raid6: neonx4 gen() 15802 MB/s
Sep 9 23:47:19.130844 kernel: raid6: neonx2 gen() 13217 MB/s
Sep 9 23:47:19.147843 kernel: raid6: neonx1 gen() 10466 MB/s
Sep 9 23:47:19.164844 kernel: raid6: int64x8 gen() 6881 MB/s
Sep 9 23:47:19.181843 kernel: raid6: int64x4 gen() 7350 MB/s
Sep 9 23:47:19.198846 kernel: raid6: int64x2 gen() 6089 MB/s
Sep 9 23:47:19.215845 kernel: raid6: int64x1 gen() 5044 MB/s
Sep 9 23:47:19.215865 kernel: raid6: using algorithm neonx4 gen() 15802 MB/s
Sep 9 23:47:19.232856 kernel: raid6: .... xor() 12303 MB/s, rmw enabled
Sep 9 23:47:19.232882 kernel: raid6: using neon recovery algorithm
Sep 9 23:47:19.239098 kernel: xor: measuring software checksum speed
Sep 9 23:47:19.239124 kernel: 8regs : 21653 MB/sec
Sep 9 23:47:19.239848 kernel: 32regs : 21618 MB/sec
Sep 9 23:47:19.239860 kernel: arm64_neon : 25183 MB/sec
Sep 9 23:47:19.240992 kernel: xor: using function: arm64_neon (25183 MB/sec)
Sep 9 23:47:19.298861 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 9 23:47:19.305253 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 23:47:19.307653 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 23:47:19.347798 systemd-udevd[497]: Using default interface naming scheme 'v255'.
Sep 9 23:47:19.351987 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 23:47:19.354352 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 9 23:47:19.379856 dracut-pre-trigger[505]: rd.md=0: removing MD RAID activation
Sep 9 23:47:19.405116 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 23:47:19.407413 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 23:47:19.463752 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 23:47:19.466602 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 9 23:47:19.510855 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues
Sep 9 23:47:19.511058 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Sep 9 23:47:19.514000 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 9 23:47:19.514049 kernel: GPT:9289727 != 19775487
Sep 9 23:47:19.514059 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 9 23:47:19.514848 kernel: GPT:9289727 != 19775487
Sep 9 23:47:19.514873 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 9 23:47:19.516404 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 23:47:19.527690 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 23:47:19.527810 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:47:19.542786 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:47:19.545149 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:47:19.553970 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 9 23:47:19.566937 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Sep 9 23:47:19.568371 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Sep 9 23:47:19.571265 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 9 23:47:19.579916 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:47:19.587732 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Sep 9 23:47:19.595439 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Sep 9 23:47:19.601154 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 23:47:19.602314 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 23:47:19.604040 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 23:47:19.606625 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 9 23:47:19.608491 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 9 23:47:19.624625 disk-uuid[590]: Primary Header is updated.
Sep 9 23:47:19.624625 disk-uuid[590]: Secondary Entries is updated.
Sep 9 23:47:19.624625 disk-uuid[590]: Secondary Header is updated.
Sep 9 23:47:19.628635 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 23:47:19.631558 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 23:47:20.635871 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 23:47:20.636263 disk-uuid[593]: The operation has completed successfully.
Sep 9 23:47:20.669393 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 9 23:47:20.669520 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 9 23:47:20.689438 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 9 23:47:20.707120 sh[610]: Success
Sep 9 23:47:20.719856 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 9 23:47:20.719900 kernel: device-mapper: uevent: version 1.0.3
Sep 9 23:47:20.721867 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 9 23:47:20.728872 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Sep 9 23:47:20.754557 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 9 23:47:20.759157 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 9 23:47:20.783194 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 9 23:47:20.789861 kernel: BTRFS: device fsid 2bc16190-0dd5-44d6-b331-3d703f5a1d1f devid 1 transid 40 /dev/mapper/usr (253:0) scanned by mount (623)
Sep 9 23:47:20.791846 kernel: BTRFS info (device dm-0): first mount of filesystem 2bc16190-0dd5-44d6-b331-3d703f5a1d1f
Sep 9 23:47:20.791874 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 9 23:47:20.797851 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 9 23:47:20.797891 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 9 23:47:20.798989 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 9 23:47:20.800218 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 23:47:20.801411 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 9 23:47:20.802221 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 9 23:47:20.805276 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 9 23:47:20.836849 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (653)
Sep 9 23:47:20.838853 kernel: BTRFS info (device vda6): first mount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:47:20.838895 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 23:47:20.841035 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 23:47:20.841086 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 23:47:20.845853 kernel: BTRFS info (device vda6): last unmount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:47:20.847143 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 9 23:47:20.851041 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 9 23:47:20.921403 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 23:47:20.925423 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 23:47:20.965985 ignition[703]: Ignition 2.21.0
Sep 9 23:47:20.966224 systemd-networkd[803]: lo: Link UP
Sep 9 23:47:20.965996 ignition[703]: Stage: fetch-offline
Sep 9 23:47:20.966228 systemd-networkd[803]: lo: Gained carrier
Sep 9 23:47:20.966032 ignition[703]: no configs at "/usr/lib/ignition/base.d"
Sep 9 23:47:20.966983 systemd-networkd[803]: Enumeration completed
Sep 9 23:47:20.966040 ignition[703]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 23:47:20.967094 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 23:47:20.966210 ignition[703]: parsed url from cmdline: ""
Sep 9 23:47:20.967399 systemd-networkd[803]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:47:20.966214 ignition[703]: no config URL provided
Sep 9 23:47:20.967402 systemd-networkd[803]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 23:47:20.966218 ignition[703]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 23:47:20.967820 systemd-networkd[803]: eth0: Link UP
Sep 9 23:47:20.966225 ignition[703]: no config at "/usr/lib/ignition/user.ign"
Sep 9 23:47:20.968100 systemd[1]: Reached target network.target - Network.
Sep 9 23:47:20.966246 ignition[703]: op(1): [started] loading QEMU firmware config module
Sep 9 23:47:20.969110 systemd-networkd[803]: eth0: Gained carrier
Sep 9 23:47:20.966251 ignition[703]: op(1): executing: "modprobe" "qemu_fw_cfg"
Sep 9 23:47:20.969121 systemd-networkd[803]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:47:20.974901 ignition[703]: op(1): [finished] loading QEMU firmware config module
Sep 9 23:47:20.999895 systemd-networkd[803]: eth0: DHCPv4 address 10.0.0.67/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 9 23:47:21.027095 ignition[703]: parsing config with SHA512: dcc07e0f5c002648a149fabc37c87d621c26ba21687122d70d010906b9a1665c82ebc74c8e91146533415cba42452bb7e4349f373fa1d5f41ce8147d8a70c99e
Sep 9 23:47:21.031896 unknown[703]: fetched base config from "system"
Sep 9 23:47:21.031908 unknown[703]: fetched user config from "qemu"
Sep 9 23:47:21.032380 ignition[703]: fetch-offline: fetch-offline passed
Sep 9 23:47:21.032439 ignition[703]: Ignition finished successfully
Sep 9 23:47:21.034109 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 23:47:21.035797 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Sep 9 23:47:21.036601 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 9 23:47:21.071334 ignition[811]: Ignition 2.21.0
Sep 9 23:47:21.071349 ignition[811]: Stage: kargs
Sep 9 23:47:21.072679 ignition[811]: no configs at "/usr/lib/ignition/base.d"
Sep 9 23:47:21.072698 ignition[811]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 23:47:21.074192 ignition[811]: kargs: kargs passed
Sep 9 23:47:21.074249 ignition[811]: Ignition finished successfully
Sep 9 23:47:21.076379 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 9 23:47:21.078297 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 9 23:47:21.105462 ignition[819]: Ignition 2.21.0
Sep 9 23:47:21.105473 ignition[819]: Stage: disks
Sep 9 23:47:21.105644 ignition[819]: no configs at "/usr/lib/ignition/base.d"
Sep 9 23:47:21.105654 ignition[819]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 23:47:21.110377 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 9 23:47:21.106443 ignition[819]: disks: disks passed
Sep 9 23:47:21.111361 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 9 23:47:21.106494 ignition[819]: Ignition finished successfully
Sep 9 23:47:21.113940 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 9 23:47:21.115196 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 23:47:21.116673 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 23:47:21.118002 systemd[1]: Reached target basic.target - Basic System.
Sep 9 23:47:21.120622 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 9 23:47:21.157261 systemd-fsck[829]: ROOT: clean, 15/553520 files, 52789/553472 blocks
Sep 9 23:47:21.161483 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 9 23:47:21.163514 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 9 23:47:21.231870 kernel: EXT4-fs (vda9): mounted filesystem 7cc0d7f3-e4a1-4dc4-8b58-ceece0d874c1 r/w with ordered data mode. Quota mode: none.
Sep 9 23:47:21.232309 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 9 23:47:21.233573 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 9 23:47:21.236315 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 23:47:21.238411 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 9 23:47:21.239269 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 9 23:47:21.239310 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 9 23:47:21.239334 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 23:47:21.257695 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 9 23:47:21.259912 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 9 23:47:21.265680 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (837)
Sep 9 23:47:21.265715 kernel: BTRFS info (device vda6): first mount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:47:21.265726 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 23:47:21.269024 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 23:47:21.269049 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 23:47:21.271602 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 23:47:21.299076 initrd-setup-root[861]: cut: /sysroot/etc/passwd: No such file or directory
Sep 9 23:47:21.303906 initrd-setup-root[868]: cut: /sysroot/etc/group: No such file or directory
Sep 9 23:47:21.308113 initrd-setup-root[875]: cut: /sysroot/etc/shadow: No such file or directory
Sep 9 23:47:21.311865 initrd-setup-root[882]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 9 23:47:21.388266 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 9 23:47:21.390201 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 9 23:47:21.391594 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 9 23:47:21.409854 kernel: BTRFS info (device vda6): last unmount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:47:21.419574 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 9 23:47:21.429348 ignition[952]: INFO : Ignition 2.21.0
Sep 9 23:47:21.429348 ignition[952]: INFO : Stage: mount
Sep 9 23:47:21.431250 ignition[952]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 23:47:21.431250 ignition[952]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 23:47:21.433753 ignition[952]: INFO : mount: mount passed
Sep 9 23:47:21.433753 ignition[952]: INFO : Ignition finished successfully
Sep 9 23:47:21.435438 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 9 23:47:21.438239 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 9 23:47:21.789300 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 9 23:47:21.790812 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 23:47:21.823730 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (964)
Sep 9 23:47:21.823782 kernel: BTRFS info (device vda6): first mount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:47:21.823794 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 23:47:21.830606 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 23:47:21.830646 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 23:47:21.832304 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 23:47:21.863780 ignition[982]: INFO : Ignition 2.21.0
Sep 9 23:47:21.863780 ignition[982]: INFO : Stage: files
Sep 9 23:47:21.865277 ignition[982]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 23:47:21.865277 ignition[982]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 23:47:21.866875 ignition[982]: DEBUG : files: compiled without relabeling support, skipping
Sep 9 23:47:21.868078 ignition[982]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 9 23:47:21.868078 ignition[982]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 9 23:47:21.870905 ignition[982]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 9 23:47:21.872007 ignition[982]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 9 23:47:21.872007 ignition[982]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 9 23:47:21.871570 unknown[982]: wrote ssh authorized keys file for user: core
Sep 9 23:47:21.875091 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 9 23:47:21.875091 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1
Sep 9 23:47:21.925881 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 9 23:47:22.257482 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 9 23:47:22.257482 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 9 23:47:22.260800 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 9 23:47:22.260800 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 23:47:22.260800 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 23:47:22.260800 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 23:47:22.260800 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 23:47:22.260800 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 23:47:22.260800 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 23:47:22.271259 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 23:47:22.271259 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 23:47:22.271259 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 23:47:22.271259 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 23:47:22.271259 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 23:47:22.271259 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1
Sep 9 23:47:22.329005 systemd-networkd[803]: eth0: Gained IPv6LL
Sep 9 23:47:22.713027 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 9 23:47:23.409965 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 23:47:23.409965 ignition[982]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 9 23:47:23.413128 ignition[982]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 23:47:23.414893 ignition[982]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 23:47:23.414893 ignition[982]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 9 23:47:23.414893 ignition[982]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 9 23:47:23.414893 ignition[982]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 9 23:47:23.420433 ignition[982]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 9 23:47:23.420433 ignition[982]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 9 23:47:23.420433 ignition[982]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Sep 9 23:47:23.429914 ignition[982]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 9 23:47:23.433544 ignition[982]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 9 23:47:23.435859 ignition[982]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 9 23:47:23.435859 ignition[982]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Sep 9 23:47:23.435859 ignition[982]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Sep 9 23:47:23.435859 ignition[982]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 23:47:23.435859 ignition[982]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 23:47:23.435859 ignition[982]: INFO : files: files passed
Sep 9 23:47:23.435859 ignition[982]: INFO : Ignition finished successfully
Sep 9 23:47:23.436580 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 9 23:47:23.438707 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 9 23:47:23.440459 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 23:47:23.451893 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 9 23:47:23.451983 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 9 23:47:23.454505 initrd-setup-root-after-ignition[1009]: grep: /sysroot/oem/oem-release: No such file or directory
Sep 9 23:47:23.455653 initrd-setup-root-after-ignition[1012]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 23:47:23.455653 initrd-setup-root-after-ignition[1012]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 23:47:23.458093 initrd-setup-root-after-ignition[1016]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 23:47:23.458596 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 23:47:23.461138 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 9 23:47:23.463367 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 9 23:47:23.501779 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 9 23:47:23.501925 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 9 23:47:23.503641 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 9 23:47:23.505083 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 9 23:47:23.506508 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 9 23:47:23.507306 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 9 23:47:23.529269 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 23:47:23.531448 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 9 23:47:23.551123 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 9 23:47:23.552088 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 23:47:23.553817 systemd[1]: Stopped target timers.target - Timer Units.
Sep 9 23:47:23.555337 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 9 23:47:23.555454 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 23:47:23.557544 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 9 23:47:23.559314 systemd[1]: Stopped target basic.target - Basic System.
Sep 9 23:47:23.560622 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 9 23:47:23.562184 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 23:47:23.563851 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 9 23:47:23.565679 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 23:47:23.567288 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 9 23:47:23.568767 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 23:47:23.570435 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 9 23:47:23.572050 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 9 23:47:23.573515 systemd[1]: Stopped target swap.target - Swaps.
Sep 9 23:47:23.574768 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 9 23:47:23.574909 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 23:47:23.576901 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 23:47:23.578643 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 23:47:23.580334 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 23:47:23.583913 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 23:47:23.584961 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 23:47:23.585072 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 23:47:23.587484 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 23:47:23.587605 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 23:47:23.589334 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 23:47:23.590627 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 23:47:23.593901 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 23:47:23.594947 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 23:47:23.596766 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 23:47:23.598157 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 23:47:23.598245 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 23:47:23.599713 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 23:47:23.599787 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 23:47:23.601202 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 23:47:23.601311 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 23:47:23.602951 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 23:47:23.603046 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 23:47:23.605345 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 23:47:23.606882 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 23:47:23.606993 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 23:47:23.609601 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 23:47:23.611103 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 23:47:23.611215 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 23:47:23.613000 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 23:47:23.613084 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 23:47:23.620310 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 23:47:23.620388 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 23:47:23.629097 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 23:47:23.635403 ignition[1036]: INFO : Ignition 2.21.0
Sep 9 23:47:23.635403 ignition[1036]: INFO : Stage: umount
Sep 9 23:47:23.636735 ignition[1036]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 23:47:23.636735 ignition[1036]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 23:47:23.638382 ignition[1036]: INFO : umount: umount passed
Sep 9 23:47:23.638382 ignition[1036]: INFO : Ignition finished successfully
Sep 9 23:47:23.638782 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 23:47:23.638893 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 23:47:23.640031 systemd[1]: Stopped target network.target - Network.
Sep 9 23:47:23.641166 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 23:47:23.641215 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 23:47:23.642586 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 23:47:23.642622 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 23:47:23.644075 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 23:47:23.644118 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 23:47:23.645607 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 23:47:23.645646 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 23:47:23.647335 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 23:47:23.648779 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 23:47:23.660769 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 23:47:23.660911 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 23:47:23.664975 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 23:47:23.665231 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 23:47:23.665322 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 23:47:23.668222 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 23:47:23.668746 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 23:47:23.669981 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 23:47:23.670022 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 23:47:23.672581 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 23:47:23.673430 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 23:47:23.673498 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 23:47:23.675670 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 23:47:23.675719 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 23:47:23.678555 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 23:47:23.678604 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 23:47:23.680771 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 23:47:23.680813 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 23:47:23.685516 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 23:47:23.689914 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 23:47:23.689979 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 23:47:23.692904 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 23:47:23.693002 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 23:47:23.694991 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 23:47:23.695095 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 23:47:23.708499 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 23:47:23.708679 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 23:47:23.710882 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 23:47:23.710996 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 23:47:23.713000 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 23:47:23.713082 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 23:47:23.714087 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 23:47:23.714119 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 23:47:23.715576 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 23:47:23.715627 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 23:47:23.717796 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 23:47:23.717856 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 23:47:23.720310 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 23:47:23.720361 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 23:47:23.723685 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 23:47:23.725399 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 23:47:23.725450 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 23:47:23.728245 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 23:47:23.728285 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 23:47:23.731018 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 23:47:23.731056 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:47:23.735233 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 9 23:47:23.735284 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 9 23:47:23.735317 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 23:47:23.745385 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 23:47:23.745508 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 23:47:23.747645 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 23:47:23.749968 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 23:47:23.778874 systemd[1]: Switching root.
Sep 9 23:47:23.816186 systemd-journald[244]: Journal stopped
Sep 9 23:47:24.641670 systemd-journald[244]: Received SIGTERM from PID 1 (systemd).
Sep 9 23:47:24.641735 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 23:47:24.641751 kernel: SELinux: policy capability open_perms=1
Sep 9 23:47:24.641761 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 23:47:24.641771 kernel: SELinux: policy capability always_check_network=0
Sep 9 23:47:24.641782 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 23:47:24.641798 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 23:47:24.641807 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 23:47:24.641817 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 23:47:24.641826 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 23:47:24.641860 kernel: audit: type=1403 audit(1757461643.986:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 23:47:24.641877 systemd[1]: Successfully loaded SELinux policy in 56.635ms.
Sep 9 23:47:24.641894 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.828ms.
Sep 9 23:47:24.641908 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 23:47:24.641923 systemd[1]: Detected virtualization kvm.
Sep 9 23:47:24.641934 systemd[1]: Detected architecture arm64.
Sep 9 23:47:24.641943 systemd[1]: Detected first boot.
Sep 9 23:47:24.641954 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 23:47:24.641964 zram_generator::config[1082]: No configuration found.
Sep 9 23:47:24.641978 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 23:47:24.641988 systemd[1]: Populated /etc with preset unit settings.
Sep 9 23:47:24.642001 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 23:47:24.642011 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 23:47:24.642020 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 23:47:24.642047 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 23:47:24.642057 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 23:47:24.642068 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 23:47:24.642079 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 23:47:24.642089 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 23:47:24.642100 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 23:47:24.642111 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 23:47:24.642121 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 23:47:24.642131 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 23:47:24.642142 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 23:47:24.642153 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 23:47:24.642164 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 23:47:24.642175 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 23:47:24.642187 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 23:47:24.642199 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 23:47:24.642209 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 9 23:47:24.642220 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 23:47:24.642230 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 23:47:24.642241 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 23:47:24.642252 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 23:47:24.642262 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 23:47:24.642274 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 23:47:24.642286 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 23:47:24.642297 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 23:47:24.642307 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 23:47:24.642318 systemd[1]: Reached target swap.target - Swaps.
Sep 9 23:47:24.642328 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 23:47:24.642338 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 23:47:24.642348 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 23:47:24.642358 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 23:47:24.642368 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 23:47:24.642380 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 23:47:24.642391 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 23:47:24.642402 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 23:47:24.642412 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 23:47:24.642422 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 23:47:24.642431 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 23:47:24.642441 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 23:47:24.642451 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 23:47:24.642465 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 23:47:24.642476 systemd[1]: Reached target machines.target - Containers.
Sep 9 23:47:24.642487 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 23:47:24.642496 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 23:47:24.642506 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 23:47:24.642517 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 23:47:24.642526 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 23:47:24.642545 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 23:47:24.642557 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 23:47:24.642570 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 23:47:24.642580 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 23:47:24.642590 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 23:47:24.642600 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 23:47:24.642610 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 23:47:24.642621 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 23:47:24.642632 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 23:47:24.642642 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 23:47:24.642654 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 23:47:24.642665 kernel: fuse: init (API version 7.41)
Sep 9 23:47:24.642678 kernel: loop: module loaded
Sep 9 23:47:24.642688 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 23:47:24.642699 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 23:47:24.642710 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 23:47:24.642721 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 23:47:24.642731 kernel: ACPI: bus type drm_connector registered
Sep 9 23:47:24.642742 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 23:47:24.642754 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 23:47:24.642764 systemd[1]: Stopped verity-setup.service.
Sep 9 23:47:24.642807 systemd-journald[1155]: Collecting audit messages is disabled.
Sep 9 23:47:24.642964 systemd-journald[1155]: Journal started
Sep 9 23:47:24.642993 systemd-journald[1155]: Runtime Journal (/run/log/journal/272aeff007f1484e953ebe3007b76861) is 6M, max 48.5M, 42.4M free.
Sep 9 23:47:24.392720 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 23:47:24.411592 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 9 23:47:24.412121 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 23:47:24.646053 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 23:47:24.646564 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 23:47:24.647687 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 23:47:24.648846 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 23:47:24.649726 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 23:47:24.650916 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 23:47:24.651915 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 23:47:24.652977 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 23:47:24.654181 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 23:47:24.655417 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 23:47:24.655609 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 23:47:24.656985 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 23:47:24.657157 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 23:47:24.658378 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 23:47:24.658566 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 23:47:24.659827 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 23:47:24.660033 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 23:47:24.661355 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 23:47:24.661541 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 23:47:24.662711 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 23:47:24.662902 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 23:47:24.664088 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 23:47:24.665424 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 23:47:24.667883 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 23:47:24.669175 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 23:47:24.681226 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 23:47:24.683475 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 23:47:24.685491 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 23:47:24.686449 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 23:47:24.686486 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 23:47:24.688315 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 23:47:24.695740 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 23:47:24.696846 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 23:47:24.698293 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 23:47:24.700136 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 23:47:24.701340 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 23:47:24.703972 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 23:47:24.705025 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 23:47:24.705967 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 23:47:24.708289 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 23:47:24.708752 systemd-journald[1155]: Time spent on flushing to /var/log/journal/272aeff007f1484e953ebe3007b76861 is 17.821ms for 884 entries.
Sep 9 23:47:24.708752 systemd-journald[1155]: System Journal (/var/log/journal/272aeff007f1484e953ebe3007b76861) is 8M, max 195.6M, 187.6M free.
Sep 9 23:47:24.734186 systemd-journald[1155]: Received client request to flush runtime journal.
Sep 9 23:47:24.734249 kernel: loop0: detected capacity change from 0 to 100608
Sep 9 23:47:24.713973 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 23:47:24.717926 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 23:47:24.722060 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 23:47:24.723093 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 23:47:24.733495 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 23:47:24.734790 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 23:47:24.738115 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 23:47:24.739722 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 23:47:24.742509 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 23:47:24.753935 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 23:47:24.772908 kernel: loop1: detected capacity change from 0 to 119320
Sep 9 23:47:24.774876 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 23:47:24.777968 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 23:47:24.783513 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 23:47:24.799882 kernel: loop2: detected capacity change from 0 to 207008
Sep 9 23:47:24.810163 systemd-tmpfiles[1215]: ACLs are not supported, ignoring.
Sep 9 23:47:24.810182 systemd-tmpfiles[1215]: ACLs are not supported, ignoring.
Sep 9 23:47:24.813781 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 23:47:24.831868 kernel: loop3: detected capacity change from 0 to 100608
Sep 9 23:47:24.837869 kernel: loop4: detected capacity change from 0 to 119320
Sep 9 23:47:24.842932 kernel: loop5: detected capacity change from 0 to 207008
Sep 9 23:47:24.847442 (sd-merge)[1221]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Sep 9 23:47:24.847888 (sd-merge)[1221]: Merged extensions into '/usr'.
Sep 9 23:47:24.851306 systemd[1]: Reload requested from client PID 1199 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 23:47:24.851322 systemd[1]: Reloading...
Sep 9 23:47:24.915924 zram_generator::config[1248]: No configuration found.
Sep 9 23:47:25.003059 ldconfig[1194]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 9 23:47:25.081374 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 23:47:25.081869 systemd[1]: Reloading finished in 230 ms.
Sep 9 23:47:25.121887 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 9 23:47:25.124881 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 23:47:25.151175 systemd[1]: Starting ensure-sysext.service...
Sep 9 23:47:25.156701 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 23:47:25.172925 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 23:47:25.175025 systemd-tmpfiles[1283]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 23:47:25.175371 systemd-tmpfiles[1283]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 23:47:25.175668 systemd-tmpfiles[1283]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 23:47:25.175906 systemd-tmpfiles[1283]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 23:47:25.176557 systemd-tmpfiles[1283]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 23:47:25.176761 systemd-tmpfiles[1283]: ACLs are not supported, ignoring.
Sep 9 23:47:25.176804 systemd-tmpfiles[1283]: ACLs are not supported, ignoring.
Sep 9 23:47:25.178228 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 23:47:25.179794 systemd[1]: Reload requested from client PID 1282 ('systemctl') (unit ensure-sysext.service)...
Sep 9 23:47:25.179810 systemd[1]: Reloading...
Sep 9 23:47:25.180685 systemd-tmpfiles[1283]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 23:47:25.181889 systemd-tmpfiles[1283]: Skipping /boot
Sep 9 23:47:25.188402 systemd-tmpfiles[1283]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 23:47:25.188422 systemd-tmpfiles[1283]: Skipping /boot
Sep 9 23:47:25.219109 systemd-udevd[1286]: Using default interface naming scheme 'v255'.
Sep 9 23:47:25.229877 zram_generator::config[1311]: No configuration found.
Sep 9 23:47:25.437020 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 9 23:47:25.438211 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 9 23:47:25.438289 systemd[1]: Reloading finished in 257 ms.
Sep 9 23:47:25.450874 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 23:47:25.457863 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 23:47:25.481513 systemd[1]: Finished ensure-sysext.service.
Sep 9 23:47:25.494962 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 23:47:25.497333 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 23:47:25.498593 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 23:47:25.499499 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 23:47:25.506706 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 23:47:25.510994 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 23:47:25.513441 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 23:47:25.514573 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 23:47:25.516999 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 9 23:47:25.518181 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 23:47:25.522066 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 23:47:25.532664 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 23:47:25.535975 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 23:47:25.538663 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 9 23:47:25.543035 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 23:47:25.547064 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:47:25.549885 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 9 23:47:25.553213 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 23:47:25.553931 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 23:47:25.555337 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 23:47:25.555548 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 23:47:25.558450 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 23:47:25.558712 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 23:47:25.561361 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 23:47:25.563780 augenrules[1428]: No rules
Sep 9 23:47:25.570082 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 23:47:25.571389 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 23:47:25.571640 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 23:47:25.572993 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 9 23:47:25.575807 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 9 23:47:25.584878 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 23:47:25.585019 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 23:47:25.586327 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 9 23:47:25.588979 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 23:47:25.590104 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 9 23:47:25.590527 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 9 23:47:25.600862 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 9 23:47:25.607916 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:47:25.624085 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 9 23:47:25.677018 systemd-networkd[1418]: lo: Link UP
Sep 9 23:47:25.677027 systemd-networkd[1418]: lo: Gained carrier
Sep 9 23:47:25.677821 systemd-networkd[1418]: Enumeration completed
Sep 9 23:47:25.678274 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 23:47:25.678810 systemd-networkd[1418]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:47:25.678917 systemd-networkd[1418]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 23:47:25.679252 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 9 23:47:25.679601 systemd-networkd[1418]: eth0: Link UP
Sep 9 23:47:25.679805 systemd-networkd[1418]: eth0: Gained carrier
Sep 9 23:47:25.679884 systemd-networkd[1418]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:47:25.680262 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 23:47:25.680902 systemd-resolved[1420]: Positive Trust Anchors:
Sep 9 23:47:25.680919 systemd-resolved[1420]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 23:47:25.680950 systemd-resolved[1420]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 23:47:25.682269 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 9 23:47:25.684132 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 9 23:47:25.687416 systemd-resolved[1420]: Defaulting to hostname 'linux'.
Sep 9 23:47:25.688924 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 23:47:25.689910 systemd[1]: Reached target network.target - Network.
Sep 9 23:47:25.690601 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 23:47:25.691629 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 23:47:25.692572 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 9 23:47:25.693599 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 9 23:47:25.694877 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 9 23:47:25.695887 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 9 23:47:25.696984 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 9 23:47:25.698097 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 9 23:47:25.698133 systemd[1]: Reached target paths.target - Path Units.
Sep 9 23:47:25.698957 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 23:47:25.698959 systemd-networkd[1418]: eth0: DHCPv4 address 10.0.0.67/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 9 23:47:25.700121 systemd-timesyncd[1421]: Network configuration changed, trying to establish connection.
Sep 9 23:47:25.700754 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 9 23:47:25.702967 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 9 23:47:25.705387 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 9 23:47:25.706589 systemd-timesyncd[1421]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Sep 9 23:47:25.706639 systemd-timesyncd[1421]: Initial clock synchronization to Tue 2025-09-09 23:47:26.025389 UTC.
Sep 9 23:47:25.706715 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 9 23:47:25.707727 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 9 23:47:25.710466 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 9 23:47:25.711604 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 9 23:47:25.713515 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 9 23:47:25.714750 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 9 23:47:25.717420 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 23:47:25.718287 systemd[1]: Reached target basic.target - Basic System.
Sep 9 23:47:25.719133 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 9 23:47:25.719167 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 9 23:47:25.720239 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 9 23:47:25.722008 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 9 23:47:25.723653 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 9 23:47:25.725485 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 9 23:47:25.727180 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 9 23:47:25.727922 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 9 23:47:25.728806 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 9 23:47:25.731685 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 9 23:47:25.732980 jq[1468]: false
Sep 9 23:47:25.733371 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 9 23:47:25.735252 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 9 23:47:25.739019 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 9 23:47:25.740607 extend-filesystems[1469]: Found /dev/vda6
Sep 9 23:47:25.741769 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 9 23:47:25.742214 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 9 23:47:25.743014 systemd[1]: Starting update-engine.service - Update Engine...
Sep 9 23:47:25.744549 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 9 23:47:25.748309 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 9 23:47:25.749550 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 9 23:47:25.750479 extend-filesystems[1469]: Found /dev/vda9
Sep 9 23:47:25.749723 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 9 23:47:25.755605 extend-filesystems[1469]: Checking size of /dev/vda9
Sep 9 23:47:25.759875 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 9 23:47:25.763076 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 9 23:47:25.763278 (ntainerd)[1495]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 9 23:47:25.765626 systemd[1]: motdgen.service: Deactivated successfully.
Sep 9 23:47:25.765963 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 9 23:47:25.772644 jq[1482]: true
Sep 9 23:47:25.777467 tar[1490]: linux-arm64/LICENSE
Sep 9 23:47:25.777467 tar[1490]: linux-arm64/helm
Sep 9 23:47:25.791842 jq[1504]: true
Sep 9 23:47:25.792102 extend-filesystems[1469]: Resized partition /dev/vda9
Sep 9 23:47:25.795846 extend-filesystems[1507]: resize2fs 1.47.2 (1-Jan-2025)
Sep 9 23:47:25.806941 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Sep 9 23:47:25.807000 update_engine[1480]: I20250909 23:47:25.805480 1480 main.cc:92] Flatcar Update Engine starting
Sep 9 23:47:25.808489 dbus-daemon[1466]: [system] SELinux support is enabled
Sep 9 23:47:25.808988 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 9 23:47:25.813889 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 9 23:47:25.813923 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 9 23:47:25.815408 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 9 23:47:25.815431 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 9 23:47:25.818044 update_engine[1480]: I20250909 23:47:25.816743 1480 update_check_scheduler.cc:74] Next update check in 4m36s
Sep 9 23:47:25.817169 systemd[1]: Started update-engine.service - Update Engine.
Sep 9 23:47:25.828040 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 9 23:47:25.837055 systemd-logind[1477]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 9 23:47:25.838980 systemd-logind[1477]: New seat seat0.
Sep 9 23:47:25.842264 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 9 23:47:25.870903 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Sep 9 23:47:25.885808 extend-filesystems[1507]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Sep 9 23:47:25.885808 extend-filesystems[1507]: old_desc_blocks = 1, new_desc_blocks = 1
Sep 9 23:47:25.885808 extend-filesystems[1507]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Sep 9 23:47:25.892549 extend-filesystems[1469]: Resized filesystem in /dev/vda9
Sep 9 23:47:25.887179 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 9 23:47:25.888150 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 9 23:47:25.905489 bash[1526]: Updated "/home/core/.ssh/authorized_keys"
Sep 9 23:47:25.907804 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 9 23:47:25.912738 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 9 23:47:25.919946 locksmithd[1511]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 9 23:47:25.970422 containerd[1495]: time="2025-09-09T23:47:25Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 9 23:47:25.971379 containerd[1495]: time="2025-09-09T23:47:25.971340080Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 9 23:47:25.981550 containerd[1495]: time="2025-09-09T23:47:25.981487800Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.52µs"
Sep 9 23:47:25.981550 containerd[1495]: time="2025-09-09T23:47:25.981538840Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 9 23:47:25.981550 containerd[1495]: time="2025-09-09T23:47:25.981560240Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 9 23:47:25.981756 containerd[1495]: time="2025-09-09T23:47:25.981734400Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 9 23:47:25.981783 containerd[1495]: time="2025-09-09T23:47:25.981761160Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 9 23:47:25.981801 containerd[1495]: time="2025-09-09T23:47:25.981789200Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 23:47:25.981874 containerd[1495]: time="2025-09-09T23:47:25.981856320Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 23:47:25.981874 containerd[1495]: time="2025-09-09T23:47:25.981872320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 23:47:25.982150 containerd[1495]: time="2025-09-09T23:47:25.982122920Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 23:47:25.982150 containerd[1495]: time="2025-09-09T23:47:25.982145720Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 23:47:25.982210 containerd[1495]: time="2025-09-09T23:47:25.982157200Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 23:47:25.982210 containerd[1495]: time="2025-09-09T23:47:25.982165480Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 9 23:47:25.982251 containerd[1495]: time="2025-09-09T23:47:25.982236120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 9 23:47:25.982436 containerd[1495]: time="2025-09-09T23:47:25.982413520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 23:47:25.982467 containerd[1495]: time="2025-09-09T23:47:25.982448800Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 23:47:25.982467 containerd[1495]: time="2025-09-09T23:47:25.982460160Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 9 23:47:25.982507 containerd[1495]: time="2025-09-09T23:47:25.982494880Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 9 23:47:25.982740 containerd[1495]: time="2025-09-09T23:47:25.982720800Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 9 23:47:25.982811 containerd[1495]: time="2025-09-09T23:47:25.982794480Z" level=info msg="metadata content store policy set" policy=shared
Sep 9 23:47:25.990502 containerd[1495]: time="2025-09-09T23:47:25.990458360Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 9 23:47:25.990602 containerd[1495]: time="2025-09-09T23:47:25.990526120Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 9 23:47:25.990602 containerd[1495]: time="2025-09-09T23:47:25.990554920Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 9 23:47:25.990602 containerd[1495]: time="2025-09-09T23:47:25.990567840Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 9 23:47:25.990602 containerd[1495]: time="2025-09-09T23:47:25.990580240Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 9 23:47:25.990602 containerd[1495]: time="2025-09-09T23:47:25.990592320Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 9 23:47:25.990706 containerd[1495]: time="2025-09-09T23:47:25.990610680Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 9 23:47:25.990706 containerd[1495]: time="2025-09-09T23:47:25.990623600Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 9 23:47:25.990706 containerd[1495]: time="2025-09-09T23:47:25.990636720Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 9 23:47:25.990706 containerd[1495]: time="2025-09-09T23:47:25.990647800Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 9 23:47:25.990706 containerd[1495]: time="2025-09-09T23:47:25.990658520Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 9 23:47:25.990706 containerd[1495]: time="2025-09-09T23:47:25.990671520Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 9 23:47:25.990847 containerd[1495]: time="2025-09-09T23:47:25.990819440Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 9 23:47:25.990873 containerd[1495]: time="2025-09-09T23:47:25.990860640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 9 23:47:25.990891 containerd[1495]: time="2025-09-09T23:47:25.990878600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 9 23:47:25.990891 containerd[1495]: time="2025-09-09T23:47:25.990890160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 9 23:47:25.990891 containerd[1495]: time="2025-09-09T23:47:25.990904760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 9 23:47:25.990891 containerd[1495]: time="2025-09-09T23:47:25.990917800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 9 23:47:25.990891 containerd[1495]: time="2025-09-09T23:47:25.990931400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 9 23:47:25.991075 containerd[1495]: time="2025-09-09T23:47:25.990952360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 9 23:47:25.991075 containerd[1495]: time="2025-09-09T23:47:25.990964280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 9 23:47:25.991075 containerd[1495]: time="2025-09-09T23:47:25.990975160Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 9 23:47:25.991075 containerd[1495]: time="2025-09-09T23:47:25.990999760Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 9 23:47:25.991218 containerd[1495]: time="2025-09-09T23:47:25.991190880Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 9 23:47:25.991247 containerd[1495]: time="2025-09-09T23:47:25.991219480Z" level=info msg="Start snapshots syncer"
Sep 9 23:47:25.991266 containerd[1495]: time="2025-09-09T23:47:25.991248400Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 9 23:47:25.991514 containerd[1495]: time="2025-09-09T23:47:25.991474880Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 9 23:47:25.991823 containerd[1495]: time="2025-09-09T23:47:25.991563480Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 9 23:47:25.991823 containerd[1495]: time="2025-09-09T23:47:25.991642360Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 9 23:47:25.991823 containerd[1495]: time="2025-09-09T23:47:25.991756840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 9 23:47:25.991823 containerd[1495]: time="2025-09-09T23:47:25.991785760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 9 23:47:25.991823 containerd[1495]: time="2025-09-09T23:47:25.991797520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 9 23:47:25.991823 containerd[1495]: time="2025-09-09T23:47:25.991808480Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 9 23:47:25.991823 containerd[1495]: time="2025-09-09T23:47:25.991820520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 9 23:47:25.991823 containerd[1495]: time="2025-09-09T23:47:25.991852640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 9 23:47:25.991823 containerd[1495]: time="2025-09-09T23:47:25.991865680Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 9 23:47:25.991823 containerd[1495]: time="2025-09-09T23:47:25.991895160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 9 23:47:25.991823 containerd[1495]: time="2025-09-09T23:47:25.991906640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 9 23:47:25.991823 containerd[1495]: time="2025-09-09T23:47:25.991917840Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 9 23:47:25.992988 containerd[1495]: time="2025-09-09T23:47:25.991969240Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 23:47:25.992988 containerd[1495]: time="2025-09-09T23:47:25.991987200Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 23:47:25.992988 containerd[1495]: time="2025-09-09T23:47:25.991996080Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 23:47:25.992988 containerd[1495]: time="2025-09-09T23:47:25.992004720Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 23:47:25.992988 containerd[1495]: time="2025-09-09T23:47:25.992012080Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 9 23:47:25.992988 containerd[1495]: time="2025-09-09T23:47:25.992021160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 9 23:47:25.992988 containerd[1495]: time="2025-09-09T23:47:25.992031800Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 9 23:47:25.992988 containerd[1495]: time="2025-09-09T23:47:25.992456520Z" level=info msg="runtime interface created"
Sep 9 23:47:25.992988 containerd[1495]: time="2025-09-09T23:47:25.992489560Z" level=info msg="created NRI interface"
Sep 9 23:47:25.992988 containerd[1495]: time="2025-09-09T23:47:25.992512320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 9 23:47:25.992988 containerd[1495]: time="2025-09-09T23:47:25.992550480Z" level=info msg="Connect containerd service"
Sep 9 23:47:25.992988 containerd[1495]: time="2025-09-09T23:47:25.992606280Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 9 23:47:25.994099 containerd[1495]: time="2025-09-09T23:47:25.994059120Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 9 23:47:26.069727 containerd[1495]: time="2025-09-09T23:47:26.069648104Z" level=info msg="Start subscribing containerd event"
Sep 9 23:47:26.069926 containerd[1495]: time="2025-09-09T23:47:26.069827126Z" level=info msg="Start recovering state"
Sep 9 23:47:26.070079 containerd[1495]: time="2025-09-09T23:47:26.070012221Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 9 23:47:26.070079 containerd[1495]: time="2025-09-09T23:47:26.070067583Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 9 23:47:26.070218 containerd[1495]: time="2025-09-09T23:47:26.070157843Z" level=info msg="Start event monitor"
Sep 9 23:47:26.070218 containerd[1495]: time="2025-09-09T23:47:26.070182134Z" level=info msg="Start cni network conf syncer for default"
Sep 9 23:47:26.070218 containerd[1495]: time="2025-09-09T23:47:26.070200519Z" level=info msg="Start streaming server"
Sep 9 23:47:26.070308 containerd[1495]: time="2025-09-09T23:47:26.070297101Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 9 23:47:26.070371 containerd[1495]: time="2025-09-09T23:47:26.070342439Z" level=info msg="runtime interface starting up..."
Sep 9 23:47:26.070371 containerd[1495]: time="2025-09-09T23:47:26.070353254Z" level=info msg="starting plugins..."
Sep 9 23:47:26.070457 containerd[1495]: time="2025-09-09T23:47:26.070444761Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 9 23:47:26.070722 containerd[1495]: time="2025-09-09T23:47:26.070692498Z" level=info msg="containerd successfully booted in 0.100807s"
Sep 9 23:47:26.070806 systemd[1]: Started containerd.service - containerd container runtime.
Sep 9 23:47:26.141954 tar[1490]: linux-arm64/README.md
Sep 9 23:47:26.158401 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 9 23:47:26.745016 systemd-networkd[1418]: eth0: Gained IPv6LL
Sep 9 23:47:26.749922 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 9 23:47:26.751465 systemd[1]: Reached target network-online.target - Network is Online.
Sep 9 23:47:26.754157 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
Sep 9 23:47:26.756759 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:47:26.766715 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 9 23:47:26.784382 systemd[1]: coreos-metadata.service: Deactivated successfully.
Sep 9 23:47:26.786237 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
Sep 9 23:47:26.788503 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 9 23:47:26.795523 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 9 23:47:27.013503 sshd_keygen[1491]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 9 23:47:27.034380 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 9 23:47:27.037835 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 9 23:47:27.060327 systemd[1]: issuegen.service: Deactivated successfully.
Sep 9 23:47:27.061913 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 9 23:47:27.064641 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 9 23:47:27.092913 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 9 23:47:27.095659 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 9 23:47:27.097939 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Sep 9 23:47:27.099131 systemd[1]: Reached target getty.target - Login Prompts.
Sep 9 23:47:27.354820 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:47:27.356330 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 9 23:47:27.358784 (kubelet)[1596]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:47:27.360960 systemd[1]: Startup finished in 2.023s (kernel) + 5.368s (initrd) + 3.431s (userspace) = 10.823s.
Sep 9 23:47:27.766883 kubelet[1596]: E0909 23:47:27.766757 1596 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:47:27.769443 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:47:27.769583 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:47:27.769986 systemd[1]: kubelet.service: Consumed 756ms CPU time, 257.2M memory peak.
Sep 9 23:47:31.872485 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 9 23:47:31.873687 systemd[1]: Started sshd@0-10.0.0.67:22-10.0.0.1:55892.service - OpenSSH per-connection server daemon (10.0.0.1:55892).
Sep 9 23:47:31.962105 sshd[1610]: Accepted publickey for core from 10.0.0.1 port 55892 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:47:31.963967 sshd-session[1610]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:47:31.982153 systemd-logind[1477]: New session 1 of user core.
Sep 9 23:47:31.983974 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 9 23:47:31.986412 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 9 23:47:32.018912 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 9 23:47:32.022891 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 9 23:47:32.047205 (systemd)[1615]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 9 23:47:32.049741 systemd-logind[1477]: New session c1 of user core.
Sep 9 23:47:32.156933 systemd[1615]: Queued start job for default target default.target.
Sep 9 23:47:32.177017 systemd[1615]: Created slice app.slice - User Application Slice.
Sep 9 23:47:32.177050 systemd[1615]: Reached target paths.target - Paths.
Sep 9 23:47:32.177088 systemd[1615]: Reached target timers.target - Timers.
Sep 9 23:47:32.178331 systemd[1615]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 9 23:47:32.187986 systemd[1615]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 9 23:47:32.188049 systemd[1615]: Reached target sockets.target - Sockets.
Sep 9 23:47:32.188085 systemd[1615]: Reached target basic.target - Basic System.
Sep 9 23:47:32.188113 systemd[1615]: Reached target default.target - Main User Target.
Sep 9 23:47:32.188137 systemd[1615]: Startup finished in 132ms.
Sep 9 23:47:32.188281 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 9 23:47:32.190040 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 9 23:47:32.254147 systemd[1]: Started sshd@1-10.0.0.67:22-10.0.0.1:55898.service - OpenSSH per-connection server daemon (10.0.0.1:55898).
Sep 9 23:47:32.322681 sshd[1627]: Accepted publickey for core from 10.0.0.1 port 55898 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:47:32.323198 sshd-session[1627]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:47:32.327156 systemd-logind[1477]: New session 2 of user core.
Sep 9 23:47:32.337123 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 9 23:47:32.395630 sshd[1630]: Connection closed by 10.0.0.1 port 55898
Sep 9 23:47:32.396124 sshd-session[1627]: pam_unix(sshd:session): session closed for user core
Sep 9 23:47:32.413258 systemd[1]: sshd@1-10.0.0.67:22-10.0.0.1:55898.service: Deactivated successfully.
Sep 9 23:47:32.416180 systemd[1]: session-2.scope: Deactivated successfully.
Sep 9 23:47:32.416823 systemd-logind[1477]: Session 2 logged out. Waiting for processes to exit.
Sep 9 23:47:32.418661 systemd[1]: Started sshd@2-10.0.0.67:22-10.0.0.1:55910.service - OpenSSH per-connection server daemon (10.0.0.1:55910).
Sep 9 23:47:32.419935 systemd-logind[1477]: Removed session 2.
Sep 9 23:47:32.500323 sshd[1636]: Accepted publickey for core from 10.0.0.1 port 55910 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:47:32.501807 sshd-session[1636]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:47:32.507194 systemd-logind[1477]: New session 3 of user core.
Sep 9 23:47:32.520112 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 9 23:47:32.568653 sshd[1639]: Connection closed by 10.0.0.1 port 55910
Sep 9 23:47:32.568943 sshd-session[1636]: pam_unix(sshd:session): session closed for user core
Sep 9 23:47:32.585595 systemd[1]: sshd@2-10.0.0.67:22-10.0.0.1:55910.service: Deactivated successfully.
Sep 9 23:47:32.588178 systemd[1]: session-3.scope: Deactivated successfully.
Sep 9 23:47:32.589009 systemd-logind[1477]: Session 3 logged out. Waiting for processes to exit.
Sep 9 23:47:32.591058 systemd[1]: Started sshd@3-10.0.0.67:22-10.0.0.1:55922.service - OpenSSH per-connection server daemon (10.0.0.1:55922).
Sep 9 23:47:32.594652 systemd-logind[1477]: Removed session 3.
Sep 9 23:47:32.657495 sshd[1645]: Accepted publickey for core from 10.0.0.1 port 55922 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:47:32.659659 sshd-session[1645]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:47:32.664434 systemd-logind[1477]: New session 4 of user core.
Sep 9 23:47:32.673032 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 9 23:47:32.726819 sshd[1648]: Connection closed by 10.0.0.1 port 55922
Sep 9 23:47:32.727339 sshd-session[1645]: pam_unix(sshd:session): session closed for user core
Sep 9 23:47:32.741415 systemd[1]: sshd@3-10.0.0.67:22-10.0.0.1:55922.service: Deactivated successfully.
Sep 9 23:47:32.743108 systemd[1]: session-4.scope: Deactivated successfully.
Sep 9 23:47:32.746058 systemd-logind[1477]: Session 4 logged out. Waiting for processes to exit.
Sep 9 23:47:32.748050 systemd[1]: Started sshd@4-10.0.0.67:22-10.0.0.1:55932.service - OpenSSH per-connection server daemon (10.0.0.1:55932).
Sep 9 23:47:32.749172 systemd-logind[1477]: Removed session 4.
Sep 9 23:47:32.834113 sshd[1654]: Accepted publickey for core from 10.0.0.1 port 55932 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:47:32.836357 sshd-session[1654]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:47:32.840320 systemd-logind[1477]: New session 5 of user core.
Sep 9 23:47:32.849043 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 9 23:47:32.913678 sudo[1658]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 9 23:47:32.913961 sudo[1658]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 23:47:32.926914 sudo[1658]: pam_unix(sudo:session): session closed for user root
Sep 9 23:47:32.929212 sshd[1657]: Connection closed by 10.0.0.1 port 55932
Sep 9 23:47:32.929744 sshd-session[1654]: pam_unix(sshd:session): session closed for user core
Sep 9 23:47:32.942679 systemd[1]: sshd@4-10.0.0.67:22-10.0.0.1:55932.service: Deactivated successfully.
Sep 9 23:47:32.945496 systemd[1]: session-5.scope: Deactivated successfully.
Sep 9 23:47:32.949699 systemd-logind[1477]: Session 5 logged out. Waiting for processes to exit.
Sep 9 23:47:32.956795 systemd[1]: Started sshd@5-10.0.0.67:22-10.0.0.1:55946.service - OpenSSH per-connection server daemon (10.0.0.1:55946).
Sep 9 23:47:32.957908 systemd-logind[1477]: Removed session 5.
Sep 9 23:47:33.019859 sshd[1664]: Accepted publickey for core from 10.0.0.1 port 55946 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:47:33.023670 sshd-session[1664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:47:33.027753 systemd-logind[1477]: New session 6 of user core.
Sep 9 23:47:33.043081 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 9 23:47:33.099058 sudo[1669]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 9 23:47:33.099348 sudo[1669]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 23:47:33.176466 sudo[1669]: pam_unix(sudo:session): session closed for user root
Sep 9 23:47:33.182484 sudo[1668]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 9 23:47:33.183254 sudo[1668]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 23:47:33.193298 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 23:47:33.245936 augenrules[1691]: No rules
Sep 9 23:47:33.247365 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 23:47:33.247639 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 23:47:33.250596 sudo[1668]: pam_unix(sudo:session): session closed for user root
Sep 9 23:47:33.253902 sshd[1667]: Connection closed by 10.0.0.1 port 55946
Sep 9 23:47:33.254916 sshd-session[1664]: pam_unix(sshd:session): session closed for user core
Sep 9 23:47:33.266760 systemd[1]: sshd@5-10.0.0.67:22-10.0.0.1:55946.service: Deactivated successfully.
Sep 9 23:47:33.269379 systemd[1]: session-6.scope: Deactivated successfully.
Sep 9 23:47:33.271001 systemd-logind[1477]: Session 6 logged out. Waiting for processes to exit.
Sep 9 23:47:33.274116 systemd[1]: Started sshd@6-10.0.0.67:22-10.0.0.1:55962.service - OpenSSH per-connection server daemon (10.0.0.1:55962).
Sep 9 23:47:33.276587 systemd-logind[1477]: Removed session 6.
Sep 9 23:47:33.344270 sshd[1700]: Accepted publickey for core from 10.0.0.1 port 55962 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:47:33.346202 sshd-session[1700]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:47:33.353065 systemd-logind[1477]: New session 7 of user core.
Sep 9 23:47:33.366054 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 9 23:47:33.422781 sudo[1704]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 9 23:47:33.423073 sudo[1704]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 23:47:33.746615 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 9 23:47:33.760261 (dockerd)[1725]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 9 23:47:33.975855 dockerd[1725]: time="2025-09-09T23:47:33.975078799Z" level=info msg="Starting up"
Sep 9 23:47:33.976570 dockerd[1725]: time="2025-09-09T23:47:33.976528745Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 9 23:47:33.989405 dockerd[1725]: time="2025-09-09T23:47:33.989362820Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 9 23:47:34.023396 dockerd[1725]: time="2025-09-09T23:47:34.023260847Z" level=info msg="Loading containers: start."
Sep 9 23:47:34.031882 kernel: Initializing XFRM netlink socket
Sep 9 23:47:34.237065 systemd-networkd[1418]: docker0: Link UP
Sep 9 23:47:34.345252 dockerd[1725]: time="2025-09-09T23:47:34.345089718Z" level=info msg="Loading containers: done."
Sep 9 23:47:34.361798 dockerd[1725]: time="2025-09-09T23:47:34.361731368Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 9 23:47:34.362048 dockerd[1725]: time="2025-09-09T23:47:34.361831035Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 9 23:47:34.362048 dockerd[1725]: time="2025-09-09T23:47:34.361938405Z" level=info msg="Initializing buildkit"
Sep 9 23:47:34.387713 dockerd[1725]: time="2025-09-09T23:47:34.387659205Z" level=info msg="Completed buildkit initialization"
Sep 9 23:47:34.395874 dockerd[1725]: time="2025-09-09T23:47:34.395587973Z" level=info msg="Daemon has completed initialization"
Sep 9 23:47:34.395874 dockerd[1725]: time="2025-09-09T23:47:34.395655242Z" level=info msg="API listen on /run/docker.sock"
Sep 9 23:47:34.395821 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 9 23:47:34.944209 containerd[1495]: time="2025-09-09T23:47:34.944158122Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\""
Sep 9 23:47:35.600535 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2093270807.mount: Deactivated successfully.
Sep 9 23:47:36.613117 containerd[1495]: time="2025-09-09T23:47:36.613065206Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:36.614100 containerd[1495]: time="2025-09-09T23:47:36.613779660Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=26328359"
Sep 9 23:47:36.615608 containerd[1495]: time="2025-09-09T23:47:36.615564340Z" level=info msg="ImageCreate event name:\"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:36.618550 containerd[1495]: time="2025-09-09T23:47:36.618496691Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:36.619586 containerd[1495]: time="2025-09-09T23:47:36.619531267Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"26325157\" in 1.675315642s"
Sep 9 23:47:36.619654 containerd[1495]: time="2025-09-09T23:47:36.619589956Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\""
Sep 9 23:47:36.620315 containerd[1495]: time="2025-09-09T23:47:36.620290425Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\""
Sep 9 23:47:37.705280 containerd[1495]: time="2025-09-09T23:47:37.705185492Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:37.707937 containerd[1495]: time="2025-09-09T23:47:37.707888387Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=22528554"
Sep 9 23:47:37.709544 containerd[1495]: time="2025-09-09T23:47:37.709169353Z" level=info msg="ImageCreate event name:\"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:37.712448 containerd[1495]: time="2025-09-09T23:47:37.712409250Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:37.713299 containerd[1495]: time="2025-09-09T23:47:37.713264022Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"24065666\" in 1.092940349s"
Sep 9 23:47:37.713365 containerd[1495]: time="2025-09-09T23:47:37.713302653Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\""
Sep 9 23:47:37.713964 containerd[1495]: time="2025-09-09T23:47:37.713698290Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\""
Sep 9 23:47:38.019873 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 9 23:47:38.023827 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:47:38.183566 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:47:38.201264 (kubelet)[2010]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:47:38.252507 kubelet[2010]: E0909 23:47:38.252442 2010 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:47:38.255463 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:47:38.255602 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:47:38.257931 systemd[1]: kubelet.service: Consumed 150ms CPU time, 107.2M memory peak.
Sep 9 23:47:39.267397 containerd[1495]: time="2025-09-09T23:47:39.266371287Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:39.267936 containerd[1495]: time="2025-09-09T23:47:39.267704064Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=17483529"
Sep 9 23:47:39.269217 containerd[1495]: time="2025-09-09T23:47:39.269151965Z" level=info msg="ImageCreate event name:\"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:39.272448 containerd[1495]: time="2025-09-09T23:47:39.272396881Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:39.274033 containerd[1495]: time="2025-09-09T23:47:39.273994385Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"19020659\" in 1.560020933s"
Sep 9 23:47:39.274033 containerd[1495]: time="2025-09-09T23:47:39.274033700Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\""
Sep 9 23:47:39.274747 containerd[1495]: time="2025-09-09T23:47:39.274550867Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\""
Sep 9 23:47:40.313174 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2984935456.mount: Deactivated successfully.
Sep 9 23:47:40.531394 containerd[1495]: time="2025-09-09T23:47:40.531337692Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:40.532035 containerd[1495]: time="2025-09-09T23:47:40.531997322Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=27376726"
Sep 9 23:47:40.533015 containerd[1495]: time="2025-09-09T23:47:40.532989342Z" level=info msg="ImageCreate event name:\"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:40.535706 containerd[1495]: time="2025-09-09T23:47:40.535531634Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:40.536381 containerd[1495]: time="2025-09-09T23:47:40.536338563Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"27375743\" in 1.261751651s"
Sep 9 23:47:40.536381 containerd[1495]: time="2025-09-09T23:47:40.536375710Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\""
Sep 9 23:47:40.536979 containerd[1495]: time="2025-09-09T23:47:40.536957022Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 9 23:47:41.055570 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1290718336.mount: Deactivated successfully.
Sep 9 23:47:42.054113 containerd[1495]: time="2025-09-09T23:47:42.054061628Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:42.055117 containerd[1495]: time="2025-09-09T23:47:42.054919446Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951624"
Sep 9 23:47:42.056413 containerd[1495]: time="2025-09-09T23:47:42.056370965Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:42.058995 containerd[1495]: time="2025-09-09T23:47:42.058958325Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:42.061867 containerd[1495]: time="2025-09-09T23:47:42.060335978Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.523257479s"
Sep 9 23:47:42.061867 containerd[1495]: time="2025-09-09T23:47:42.060384003Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 9 23:47:42.062828 containerd[1495]: time="2025-09-09T23:47:42.062795338Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 9 23:47:42.510624 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount503575304.mount: Deactivated successfully.
Sep 9 23:47:42.517026 containerd[1495]: time="2025-09-09T23:47:42.516955804Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 23:47:42.517747 containerd[1495]: time="2025-09-09T23:47:42.517614047Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705"
Sep 9 23:47:42.518737 containerd[1495]: time="2025-09-09T23:47:42.518708212Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 23:47:42.520946 containerd[1495]: time="2025-09-09T23:47:42.520894855Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 23:47:42.521620 containerd[1495]: time="2025-09-09T23:47:42.521437436Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 458.486008ms"
Sep 9 23:47:42.521620 containerd[1495]: time="2025-09-09T23:47:42.521468823Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 9 23:47:42.521971 containerd[1495]: time="2025-09-09T23:47:42.521943286Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 9 23:47:43.019188 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2240832804.mount: Deactivated successfully.
Sep 9 23:47:44.722608 containerd[1495]: time="2025-09-09T23:47:44.722560994Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:44.723556 containerd[1495]: time="2025-09-09T23:47:44.723506990Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67943167"
Sep 9 23:47:44.724948 containerd[1495]: time="2025-09-09T23:47:44.724896378Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:44.727412 containerd[1495]: time="2025-09-09T23:47:44.727373672Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:47:44.729896 containerd[1495]: time="2025-09-09T23:47:44.729494647Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.207519463s"
Sep 9 23:47:44.729896 containerd[1495]: time="2025-09-09T23:47:44.729538043Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\""
Sep 9 23:47:48.506057 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 9 23:47:48.507958 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:47:48.724570 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:47:48.748211 (kubelet)[2172]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:47:48.786345 kubelet[2172]: E0909 23:47:48.786201 2172 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:47:48.789048 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:47:48.789314 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:47:48.789775 systemd[1]: kubelet.service: Consumed 143ms CPU time, 107.1M memory peak.
Sep 9 23:47:49.119822 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:47:49.120401 systemd[1]: kubelet.service: Consumed 143ms CPU time, 107.1M memory peak.
Sep 9 23:47:49.122591 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:47:49.146237 systemd[1]: Reload requested from client PID 2187 ('systemctl') (unit session-7.scope)...
Sep 9 23:47:49.146256 systemd[1]: Reloading...
Sep 9 23:47:49.217931 zram_generator::config[2230]: No configuration found.
Sep 9 23:47:49.444591 systemd[1]: Reloading finished in 297 ms.
Sep 9 23:47:49.511925 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:47:49.513975 systemd[1]: kubelet.service: Deactivated successfully.
Sep 9 23:47:49.514195 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:47:49.514246 systemd[1]: kubelet.service: Consumed 99ms CPU time, 95.1M memory peak.
Sep 9 23:47:49.515568 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:47:49.631029 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:47:49.635146 (kubelet)[2276]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 23:47:49.668850 kubelet[2276]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 23:47:49.668850 kubelet[2276]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 9 23:47:49.668850 kubelet[2276]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 23:47:49.669190 kubelet[2276]: I0909 23:47:49.668911 2276 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 23:47:50.212204 kubelet[2276]: I0909 23:47:50.212155 2276 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 9 23:47:50.212204 kubelet[2276]: I0909 23:47:50.212193 2276 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 23:47:50.213139 kubelet[2276]: I0909 23:47:50.212823 2276 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 9 23:47:50.236631 kubelet[2276]: E0909 23:47:50.236562 2276 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.67:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.67:6443: connect: connection refused" logger="UnhandledError"
Sep 9 23:47:50.236762 kubelet[2276]: I0909 23:47:50.236638 2276 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 23:47:50.242546 kubelet[2276]: I0909 23:47:50.242522 2276 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 23:47:50.245596 kubelet[2276]: I0909 23:47:50.245564 2276 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 23:47:50.247784 kubelet[2276]: I0909 23:47:50.247727 2276 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 23:47:50.247981 kubelet[2276]: I0909 23:47:50.247779 2276 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 23:47:50.248090 kubelet[2276]: I0909 23:47:50.248047 2276 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 23:47:50.248090 kubelet[2276]: I0909 23:47:50.248058 2276 container_manager_linux.go:304] "Creating device plugin manager"
Sep 9 23:47:50.248265 kubelet[2276]: I0909 23:47:50.248250 2276 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 23:47:50.251496 kubelet[2276]: I0909 23:47:50.251297 2276 kubelet.go:446] "Attempting to sync node with API server"
Sep 9 23:47:50.251496 kubelet[2276]: I0909 23:47:50.251326 2276 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 23:47:50.251496 kubelet[2276]: I0909 23:47:50.251363 2276 kubelet.go:352] "Adding apiserver pod source"
Sep 9 23:47:50.251496 kubelet[2276]: I0909 23:47:50.251378 2276 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 23:47:50.253961 kubelet[2276]: W0909 23:47:50.253911 2276 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.67:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.67:6443: connect: connection refused
Sep 9 23:47:50.254034 kubelet[2276]: E0909 23:47:50.253969 2276 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.67:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.67:6443: connect: connection refused" logger="UnhandledError"
Sep 9 23:47:50.254490 kubelet[2276]: I0909 23:47:50.254465 2276 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 23:47:50.255273 kubelet[2276]: W0909 23:47:50.255229 2276 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.67:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.67:6443: connect: connection refused
Sep 9 23:47:50.255417 kubelet[2276]: E0909 23:47:50.255379 2276 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.67:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.67:6443: connect: connection refused" logger="UnhandledError"
Sep 9 23:47:50.255479 kubelet[2276]: I0909 23:47:50.255340 2276 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 23:47:50.255636 kubelet[2276]: W0909 23:47:50.255625 2276 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 23:47:50.256557 kubelet[2276]: I0909 23:47:50.256534 2276 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 9 23:47:50.256663 kubelet[2276]: I0909 23:47:50.256653 2276 server.go:1287] "Started kubelet"
Sep 9 23:47:50.258963 kubelet[2276]: I0909 23:47:50.258896 2276 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 23:47:50.259777 kubelet[2276]: I0909 23:47:50.259707 2276 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 23:47:50.260000 kubelet[2276]: I0909 23:47:50.259971 2276 server.go:479] "Adding debug handlers to kubelet server"
Sep 9 23:47:50.260252 kubelet[2276]: I0909 23:47:50.260233 2276 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 23:47:50.261854 kubelet[2276]: E0909 23:47:50.261497 2276 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.67:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.67:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1863c213e202d60b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-09 23:47:50.256629259 +0000 UTC m=+0.618533393,LastTimestamp:2025-09-09 23:47:50.256629259 +0000 UTC m=+0.618533393,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 9 23:47:50.263611 kubelet[2276]: I0909 23:47:50.263584 2276 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 23:47:50.263945 kubelet[2276]: I0909 23:47:50.263921 2276 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 23:47:50.264887 kubelet[2276]: E0909 23:47:50.264858 2276 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 23:47:50.264952 kubelet[2276]: I0909 23:47:50.264898 2276 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 9 23:47:50.265079 kubelet[2276]: I0909 23:47:50.265056 2276 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 9 23:47:50.265148 kubelet[2276]: I0909 23:47:50.265132 2276 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 23:47:50.265464 kubelet[2276]: E0909 23:47:50.265405 2276 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.67:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.67:6443: connect: connection refused" interval="200ms"
Sep 9 23:47:50.265521 kubelet[2276]: W0909 23:47:50.265452 2276 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.67:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.67:6443: connect: connection refused
Sep 9 23:47:50.265521 kubelet[2276]: E0909 23:47:50.265500 2276 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.67:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.67:6443: connect: connection refused" logger="UnhandledError"
Sep 9 23:47:50.265609 kubelet[2276]: I0909 23:47:50.265587 2276 factory.go:221] Registration of the systemd container factory successfully
Sep 9 23:47:50.265609 kubelet[2276]: E0909 23:47:50.265599 2276 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 23:47:50.265681 kubelet[2276]: I0909 23:47:50.265661 2276 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 23:47:50.266695 kubelet[2276]: I0909 23:47:50.266664 2276 factory.go:221] Registration of the containerd container factory successfully
Sep 9 23:47:50.277977 kubelet[2276]: I0909 23:47:50.277935 2276 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 9 23:47:50.277977 kubelet[2276]: I0909 23:47:50.277977 2276 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 9 23:47:50.278121 kubelet[2276]: I0909 23:47:50.277997 2276 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 23:47:50.279915 kubelet[2276]: I0909 23:47:50.279870 2276 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 23:47:50.281187 kubelet[2276]: I0909 23:47:50.281155 2276 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 23:47:50.281187 kubelet[2276]: I0909 23:47:50.281183 2276 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 9 23:47:50.281281 kubelet[2276]: I0909 23:47:50.281203 2276 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 9 23:47:50.281281 kubelet[2276]: I0909 23:47:50.281210 2276 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 9 23:47:50.281281 kubelet[2276]: E0909 23:47:50.281251 2276 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 23:47:50.281820 kubelet[2276]: W0909 23:47:50.281788 2276 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.67:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.67:6443: connect: connection refused
Sep 9 23:47:50.282446 kubelet[2276]: E0909 23:47:50.282411 2276 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.67:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.67:6443: connect: connection refused" logger="UnhandledError"
Sep 9 23:47:50.366028 kubelet[2276]: E0909 23:47:50.365972 2276 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 23:47:50.381392 kubelet[2276]: E0909 23:47:50.381342 2276 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 9 23:47:50.385066 kubelet[2276]: I0909 23:47:50.385028 2276 policy_none.go:49] "None policy: Start"
Sep 9 23:47:50.385066 kubelet[2276]: I0909 23:47:50.385066 2276 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 9 23:47:50.385167 kubelet[2276]: I0909 23:47:50.385080 2276 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 23:47:50.392520 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 9 23:47:50.415418 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 9 23:47:50.418420 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 9 23:47:50.439050 kubelet[2276]: I0909 23:47:50.439001 2276 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 23:47:50.439278 kubelet[2276]: I0909 23:47:50.439259 2276 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 23:47:50.439318 kubelet[2276]: I0909 23:47:50.439278 2276 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 23:47:50.439639 kubelet[2276]: I0909 23:47:50.439610 2276 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 23:47:50.440803 kubelet[2276]: E0909 23:47:50.440755 2276 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 9 23:47:50.440881 kubelet[2276]: E0909 23:47:50.440821 2276 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Sep 9 23:47:50.466334 kubelet[2276]: E0909 23:47:50.466222 2276 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.67:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.67:6443: connect: connection refused" interval="400ms"
Sep 9 23:47:50.541575 kubelet[2276]: I0909 23:47:50.541536 2276 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 9 23:47:50.542040 kubelet[2276]: E0909 23:47:50.542002 2276 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.67:6443/api/v1/nodes\": dial tcp 10.0.0.67:6443: connect: connection refused" node="localhost"
Sep 9 23:47:50.590703 systemd[1]: Created slice kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice - libcontainer container kubepods-burstable-poda9176403b596d0b29ae8ad12d635226d.slice.
Sep 9 23:47:50.612750 kubelet[2276]: E0909 23:47:50.612521 2276 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 23:47:50.616449 systemd[1]: Created slice kubepods-burstable-pod48bbfc5ff756c8373a1e2ceb757171bd.slice - libcontainer container kubepods-burstable-pod48bbfc5ff756c8373a1e2ceb757171bd.slice.
Sep 9 23:47:50.626987 kubelet[2276]: E0909 23:47:50.626949 2276 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 23:47:50.630124 systemd[1]: Created slice kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice - libcontainer container kubepods-burstable-poda88c9297c136b0f15880bf567e89a977.slice.
Sep 9 23:47:50.631876 kubelet[2276]: E0909 23:47:50.631829 2276 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 23:47:50.667220 kubelet[2276]: I0909 23:47:50.667180 2276 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 23:47:50.667467 kubelet[2276]: I0909 23:47:50.667418 2276 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 23:47:50.667591 kubelet[2276]: I0909 23:47:50.667531 2276 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 23:47:50.667591 kubelet[2276]: I0909 23:47:50.667557 2276 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/48bbfc5ff756c8373a1e2ceb757171bd-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"48bbfc5ff756c8373a1e2ceb757171bd\") " pod="kube-system/kube-apiserver-localhost" Sep 9 23:47:50.667721 kubelet[2276]: I0909 23:47:50.667707 2276 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/48bbfc5ff756c8373a1e2ceb757171bd-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"48bbfc5ff756c8373a1e2ceb757171bd\") " pod="kube-system/kube-apiserver-localhost" Sep 9 23:47:50.667884 kubelet[2276]: I0909 23:47:50.667780 2276 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/48bbfc5ff756c8373a1e2ceb757171bd-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"48bbfc5ff756c8373a1e2ceb757171bd\") " pod="kube-system/kube-apiserver-localhost" Sep 9 23:47:50.667884 kubelet[2276]: I0909 23:47:50.667797 2276 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 23:47:50.667884 kubelet[2276]: I0909 23:47:50.667815 2276 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 23:47:50.668013 kubelet[2276]: I0909 23:47:50.667981 2276 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost" Sep 9 23:47:50.743661 kubelet[2276]: I0909 23:47:50.743570 2276 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 23:47:50.744417 kubelet[2276]: E0909 
23:47:50.744378 2276 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.67:6443/api/v1/nodes\": dial tcp 10.0.0.67:6443: connect: connection refused" node="localhost" Sep 9 23:47:50.867717 kubelet[2276]: E0909 23:47:50.867677 2276 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.67:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.67:6443: connect: connection refused" interval="800ms" Sep 9 23:47:50.915643 containerd[1495]: time="2025-09-09T23:47:50.915595783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,}" Sep 9 23:47:50.927783 containerd[1495]: time="2025-09-09T23:47:50.927741471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:48bbfc5ff756c8373a1e2ceb757171bd,Namespace:kube-system,Attempt:0,}" Sep 9 23:47:50.934124 containerd[1495]: time="2025-09-09T23:47:50.934063478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,}" Sep 9 23:47:50.939476 containerd[1495]: time="2025-09-09T23:47:50.939434269Z" level=info msg="connecting to shim 29e0f7e0594cdfe2cb611ec0387b38ea43603ce665854759d3df18cc0cc5742d" address="unix:///run/containerd/s/fd0251de8296cafdad2330f3917e80b7582aa11e3cb18b8b681ed7f5462a0658" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:47:50.964112 containerd[1495]: time="2025-09-09T23:47:50.963657697Z" level=info msg="connecting to shim 040ffe1c1eeba46abbd9d982c619faf4d49976decd1edac86fa5b697870e97b0" address="unix:///run/containerd/s/b4deb83180dae65f4e2bbb93bf7e9fb07b445a29a6156b90b4c55697ec93caa8" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:47:50.964615 containerd[1495]: time="2025-09-09T23:47:50.964582751Z" level=info msg="connecting to shim 
0429fcc3c2a9c5cbdfc8e5b6f0136e05c493e0815795324d37eedfd08cafca42" address="unix:///run/containerd/s/e19814c091e54c5f03c799d63bb13f438af8d41454a475253359917769970cf7" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:47:50.968206 systemd[1]: Started cri-containerd-29e0f7e0594cdfe2cb611ec0387b38ea43603ce665854759d3df18cc0cc5742d.scope - libcontainer container 29e0f7e0594cdfe2cb611ec0387b38ea43603ce665854759d3df18cc0cc5742d. Sep 9 23:47:50.994049 systemd[1]: Started cri-containerd-040ffe1c1eeba46abbd9d982c619faf4d49976decd1edac86fa5b697870e97b0.scope - libcontainer container 040ffe1c1eeba46abbd9d982c619faf4d49976decd1edac86fa5b697870e97b0. Sep 9 23:47:50.999102 systemd[1]: Started cri-containerd-0429fcc3c2a9c5cbdfc8e5b6f0136e05c493e0815795324d37eedfd08cafca42.scope - libcontainer container 0429fcc3c2a9c5cbdfc8e5b6f0136e05c493e0815795324d37eedfd08cafca42. Sep 9 23:47:51.014665 containerd[1495]: time="2025-09-09T23:47:51.014609942Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:a9176403b596d0b29ae8ad12d635226d,Namespace:kube-system,Attempt:0,} returns sandbox id \"29e0f7e0594cdfe2cb611ec0387b38ea43603ce665854759d3df18cc0cc5742d\"" Sep 9 23:47:51.019560 containerd[1495]: time="2025-09-09T23:47:51.019520800Z" level=info msg="CreateContainer within sandbox \"29e0f7e0594cdfe2cb611ec0387b38ea43603ce665854759d3df18cc0cc5742d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 23:47:51.034696 containerd[1495]: time="2025-09-09T23:47:51.034627786Z" level=info msg="Container d72c76ef450a5384cb473e297224d3f07699febc1ece1a50b797864907ec194e: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:47:51.041684 containerd[1495]: time="2025-09-09T23:47:51.041568152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:48bbfc5ff756c8373a1e2ceb757171bd,Namespace:kube-system,Attempt:0,} returns sandbox id \"0429fcc3c2a9c5cbdfc8e5b6f0136e05c493e0815795324d37eedfd08cafca42\"" Sep 9 
23:47:51.043726 containerd[1495]: time="2025-09-09T23:47:51.043664274Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a88c9297c136b0f15880bf567e89a977,Namespace:kube-system,Attempt:0,} returns sandbox id \"040ffe1c1eeba46abbd9d982c619faf4d49976decd1edac86fa5b697870e97b0\"" Sep 9 23:47:51.044925 containerd[1495]: time="2025-09-09T23:47:51.044891328Z" level=info msg="CreateContainer within sandbox \"0429fcc3c2a9c5cbdfc8e5b6f0136e05c493e0815795324d37eedfd08cafca42\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 23:47:51.045641 containerd[1495]: time="2025-09-09T23:47:51.045602532Z" level=info msg="CreateContainer within sandbox \"29e0f7e0594cdfe2cb611ec0387b38ea43603ce665854759d3df18cc0cc5742d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"d72c76ef450a5384cb473e297224d3f07699febc1ece1a50b797864907ec194e\"" Sep 9 23:47:51.046333 containerd[1495]: time="2025-09-09T23:47:51.046307969Z" level=info msg="StartContainer for \"d72c76ef450a5384cb473e297224d3f07699febc1ece1a50b797864907ec194e\"" Sep 9 23:47:51.047521 containerd[1495]: time="2025-09-09T23:47:51.047486595Z" level=info msg="connecting to shim d72c76ef450a5384cb473e297224d3f07699febc1ece1a50b797864907ec194e" address="unix:///run/containerd/s/fd0251de8296cafdad2330f3917e80b7582aa11e3cb18b8b681ed7f5462a0658" protocol=ttrpc version=3 Sep 9 23:47:51.048422 containerd[1495]: time="2025-09-09T23:47:51.048390311Z" level=info msg="CreateContainer within sandbox \"040ffe1c1eeba46abbd9d982c619faf4d49976decd1edac86fa5b697870e97b0\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 23:47:51.051481 containerd[1495]: time="2025-09-09T23:47:51.051433011Z" level=info msg="Container 14c4d77607c24675c7e2953778867ab17d5791159d1f5096e5b63f36c2e3b73a: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:47:51.059605 containerd[1495]: time="2025-09-09T23:47:51.059550160Z" level=info msg="Container 
85760cf5ad9a3e81138c2d23a649a0c4c07fd22dd5513d9947133c3f11ed3fff: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:47:51.063069 systemd[1]: Started cri-containerd-d72c76ef450a5384cb473e297224d3f07699febc1ece1a50b797864907ec194e.scope - libcontainer container d72c76ef450a5384cb473e297224d3f07699febc1ece1a50b797864907ec194e. Sep 9 23:47:51.068666 containerd[1495]: time="2025-09-09T23:47:51.068610802Z" level=info msg="CreateContainer within sandbox \"040ffe1c1eeba46abbd9d982c619faf4d49976decd1edac86fa5b697870e97b0\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"85760cf5ad9a3e81138c2d23a649a0c4c07fd22dd5513d9947133c3f11ed3fff\"" Sep 9 23:47:51.069388 containerd[1495]: time="2025-09-09T23:47:51.069365348Z" level=info msg="StartContainer for \"85760cf5ad9a3e81138c2d23a649a0c4c07fd22dd5513d9947133c3f11ed3fff\"" Sep 9 23:47:51.071841 containerd[1495]: time="2025-09-09T23:47:51.071800228Z" level=info msg="connecting to shim 85760cf5ad9a3e81138c2d23a649a0c4c07fd22dd5513d9947133c3f11ed3fff" address="unix:///run/containerd/s/b4deb83180dae65f4e2bbb93bf7e9fb07b445a29a6156b90b4c55697ec93caa8" protocol=ttrpc version=3 Sep 9 23:47:51.072184 containerd[1495]: time="2025-09-09T23:47:51.072158495Z" level=info msg="CreateContainer within sandbox \"0429fcc3c2a9c5cbdfc8e5b6f0136e05c493e0815795324d37eedfd08cafca42\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"14c4d77607c24675c7e2953778867ab17d5791159d1f5096e5b63f36c2e3b73a\"" Sep 9 23:47:51.072647 containerd[1495]: time="2025-09-09T23:47:51.072615581Z" level=info msg="StartContainer for \"14c4d77607c24675c7e2953778867ab17d5791159d1f5096e5b63f36c2e3b73a\"" Sep 9 23:47:51.074066 containerd[1495]: time="2025-09-09T23:47:51.074019524Z" level=info msg="connecting to shim 14c4d77607c24675c7e2953778867ab17d5791159d1f5096e5b63f36c2e3b73a" address="unix:///run/containerd/s/e19814c091e54c5f03c799d63bb13f438af8d41454a475253359917769970cf7" protocol=ttrpc version=3 Sep 9 
23:47:51.094016 systemd[1]: Started cri-containerd-14c4d77607c24675c7e2953778867ab17d5791159d1f5096e5b63f36c2e3b73a.scope - libcontainer container 14c4d77607c24675c7e2953778867ab17d5791159d1f5096e5b63f36c2e3b73a. Sep 9 23:47:51.103068 systemd[1]: Started cri-containerd-85760cf5ad9a3e81138c2d23a649a0c4c07fd22dd5513d9947133c3f11ed3fff.scope - libcontainer container 85760cf5ad9a3e81138c2d23a649a0c4c07fd22dd5513d9947133c3f11ed3fff. Sep 9 23:47:51.116643 containerd[1495]: time="2025-09-09T23:47:51.116582023Z" level=info msg="StartContainer for \"d72c76ef450a5384cb473e297224d3f07699febc1ece1a50b797864907ec194e\" returns successfully" Sep 9 23:47:51.145661 containerd[1495]: time="2025-09-09T23:47:51.145600705Z" level=info msg="StartContainer for \"14c4d77607c24675c7e2953778867ab17d5791159d1f5096e5b63f36c2e3b73a\" returns successfully" Sep 9 23:47:51.146767 kubelet[2276]: I0909 23:47:51.146744 2276 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 23:47:51.147337 kubelet[2276]: E0909 23:47:51.147303 2276 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.67:6443/api/v1/nodes\": dial tcp 10.0.0.67:6443: connect: connection refused" node="localhost" Sep 9 23:47:51.155092 containerd[1495]: time="2025-09-09T23:47:51.154977233Z" level=info msg="StartContainer for \"85760cf5ad9a3e81138c2d23a649a0c4c07fd22dd5513d9947133c3f11ed3fff\" returns successfully" Sep 9 23:47:51.289463 kubelet[2276]: E0909 23:47:51.289364 2276 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 23:47:51.291036 kubelet[2276]: E0909 23:47:51.290939 2276 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 23:47:51.292781 kubelet[2276]: E0909 23:47:51.292757 2276 kubelet.go:3190] "No need to create a mirror pod, since failed to get 
node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 23:47:51.948655 kubelet[2276]: I0909 23:47:51.948571 2276 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 9 23:47:52.294971 kubelet[2276]: E0909 23:47:52.294710 2276 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 23:47:52.295786 kubelet[2276]: E0909 23:47:52.295670 2276 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 9 23:47:52.775999 kubelet[2276]: E0909 23:47:52.775772 2276 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 9 23:47:52.867690 kubelet[2276]: I0909 23:47:52.867544 2276 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 9 23:47:52.867690 kubelet[2276]: E0909 23:47:52.867586 2276 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 9 23:47:52.965964 kubelet[2276]: I0909 23:47:52.965910 2276 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 9 23:47:52.972817 kubelet[2276]: E0909 23:47:52.972715 2276 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 9 23:47:52.972817 kubelet[2276]: I0909 23:47:52.972752 2276 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 9 23:47:52.974720 kubelet[2276]: E0909 23:47:52.974683 2276 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was 
found" pod="kube-system/kube-apiserver-localhost" Sep 9 23:47:52.974956 kubelet[2276]: I0909 23:47:52.974821 2276 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 9 23:47:52.977076 kubelet[2276]: E0909 23:47:52.977049 2276 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 9 23:47:53.255788 kubelet[2276]: I0909 23:47:53.255663 2276 apiserver.go:52] "Watching apiserver" Sep 9 23:47:53.265396 kubelet[2276]: I0909 23:47:53.265350 2276 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 23:47:53.295392 kubelet[2276]: I0909 23:47:53.295362 2276 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 9 23:47:53.297497 kubelet[2276]: E0909 23:47:53.297445 2276 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 9 23:47:55.079003 systemd[1]: Reload requested from client PID 2551 ('systemctl') (unit session-7.scope)... Sep 9 23:47:55.079019 systemd[1]: Reloading... Sep 9 23:47:55.142377 zram_generator::config[2590]: No configuration found. Sep 9 23:47:55.360144 systemd[1]: Reloading finished in 280 ms. Sep 9 23:47:55.385932 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:47:55.398698 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 23:47:55.398991 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:47:55.399053 systemd[1]: kubelet.service: Consumed 989ms CPU time, 128.1M memory peak. Sep 9 23:47:55.401037 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Sep 9 23:47:55.582154 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:47:55.607239 (kubelet)[2636]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 23:47:55.654986 kubelet[2636]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 23:47:55.654986 kubelet[2636]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 23:47:55.654986 kubelet[2636]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 23:47:55.655425 kubelet[2636]: I0909 23:47:55.655020 2636 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 23:47:55.661857 kubelet[2636]: I0909 23:47:55.661099 2636 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 9 23:47:55.661857 kubelet[2636]: I0909 23:47:55.661127 2636 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 23:47:55.661857 kubelet[2636]: I0909 23:47:55.661365 2636 server.go:954] "Client rotation is on, will bootstrap in background" Sep 9 23:47:55.662821 kubelet[2636]: I0909 23:47:55.662785 2636 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 9 23:47:55.665558 kubelet[2636]: I0909 23:47:55.665516 2636 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 23:47:55.670455 kubelet[2636]: I0909 23:47:55.670431 2636 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 23:47:55.673592 kubelet[2636]: I0909 23:47:55.673566 2636 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 9 23:47:55.673787 kubelet[2636]: I0909 23:47:55.673764 2636 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 23:47:55.673984 kubelet[2636]: I0909 23:47:55.673788 2636 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPol
icyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 23:47:55.674089 kubelet[2636]: I0909 23:47:55.673987 2636 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 23:47:55.674089 kubelet[2636]: I0909 23:47:55.673997 2636 container_manager_linux.go:304] "Creating device plugin manager" Sep 9 23:47:55.674089 kubelet[2636]: I0909 23:47:55.674045 2636 state_mem.go:36] "Initialized new in-memory state store" Sep 9 23:47:55.674196 kubelet[2636]: I0909 23:47:55.674173 2636 kubelet.go:446] "Attempting to sync node with API server" Sep 9 23:47:55.674196 kubelet[2636]: I0909 23:47:55.674191 2636 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 23:47:55.674243 kubelet[2636]: I0909 23:47:55.674211 2636 kubelet.go:352] "Adding apiserver pod source" Sep 9 23:47:55.674243 kubelet[2636]: I0909 23:47:55.674222 2636 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 23:47:55.675931 kubelet[2636]: I0909 23:47:55.675564 2636 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 23:47:55.676224 kubelet[2636]: I0909 23:47:55.676205 2636 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 23:47:55.676635 kubelet[2636]: I0909 23:47:55.676606 2636 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 23:47:55.676635 kubelet[2636]: I0909 23:47:55.676641 2636 server.go:1287] "Started kubelet" Sep 9 23:47:55.677342 kubelet[2636]: I0909 23:47:55.677277 2636 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 23:47:55.677620 kubelet[2636]: 
I0909 23:47:55.677601 2636 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 23:47:55.677748 kubelet[2636]: I0909 23:47:55.677729 2636 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 23:47:55.683175 kubelet[2636]: I0909 23:47:55.681589 2636 server.go:479] "Adding debug handlers to kubelet server" Sep 9 23:47:55.684648 kubelet[2636]: E0909 23:47:55.684436 2636 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 23:47:55.685557 kubelet[2636]: I0909 23:47:55.685518 2636 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 23:47:55.689339 kubelet[2636]: I0909 23:47:55.689293 2636 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 23:47:55.690339 kubelet[2636]: I0909 23:47:55.690321 2636 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 23:47:55.690541 kubelet[2636]: E0909 23:47:55.690521 2636 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 9 23:47:55.693362 kubelet[2636]: I0909 23:47:55.693326 2636 factory.go:221] Registration of the systemd container factory successfully Sep 9 23:47:55.693456 kubelet[2636]: I0909 23:47:55.693432 2636 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 23:47:55.693504 kubelet[2636]: I0909 23:47:55.693485 2636 reconciler.go:26] "Reconciler: start to sync state" Sep 9 23:47:55.697756 kubelet[2636]: I0909 23:47:55.693343 2636 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 23:47:55.699330 kubelet[2636]: I0909 23:47:55.699309 2636 factory.go:221] Registration of the 
containerd container factory successfully
Sep 9 23:47:55.705163 kubelet[2636]: I0909 23:47:55.705130 2636 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 23:47:55.707271 kubelet[2636]: I0909 23:47:55.707251 2636 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 23:47:55.707363 kubelet[2636]: I0909 23:47:55.707354 2636 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 9 23:47:55.707423 kubelet[2636]: I0909 23:47:55.707413 2636 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 9 23:47:55.707476 kubelet[2636]: I0909 23:47:55.707468 2636 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 9 23:47:55.707568 kubelet[2636]: E0909 23:47:55.707551 2636 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 23:47:55.731471 kubelet[2636]: I0909 23:47:55.731437 2636 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 9 23:47:55.731471 kubelet[2636]: I0909 23:47:55.731457 2636 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 9 23:47:55.731471 kubelet[2636]: I0909 23:47:55.731477 2636 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 23:47:55.731635 kubelet[2636]: I0909 23:47:55.731624 2636 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 9 23:47:55.731660 kubelet[2636]: I0909 23:47:55.731634 2636 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 9 23:47:55.731660 kubelet[2636]: I0909 23:47:55.731651 2636 policy_none.go:49] "None policy: Start"
Sep 9 23:47:55.731660 kubelet[2636]: I0909 23:47:55.731660 2636 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 9 23:47:55.731715 kubelet[2636]: I0909 23:47:55.731669 2636 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 23:47:55.731775 kubelet[2636]: I0909 23:47:55.731755 2636 state_mem.go:75] "Updated machine memory state"
Sep 9 23:47:55.736167 kubelet[2636]: I0909 23:47:55.735983 2636 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 23:47:55.736527 kubelet[2636]: I0909 23:47:55.736381 2636 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 23:47:55.736527 kubelet[2636]: I0909 23:47:55.736397 2636 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 23:47:55.736760 kubelet[2636]: I0909 23:47:55.736730 2636 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 23:47:55.738934 kubelet[2636]: E0909 23:47:55.738896 2636 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 9 23:47:55.808676 kubelet[2636]: I0909 23:47:55.808604 2636 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Sep 9 23:47:55.808790 kubelet[2636]: I0909 23:47:55.808693 2636 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
Sep 9 23:47:55.808790 kubelet[2636]: I0909 23:47:55.808604 2636 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Sep 9 23:47:55.838681 kubelet[2636]: I0909 23:47:55.838655 2636 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 9 23:47:55.846845 kubelet[2636]: I0909 23:47:55.846078 2636 kubelet_node_status.go:124] "Node was previously registered" node="localhost"
Sep 9 23:47:55.846845 kubelet[2636]: I0909 23:47:55.846164 2636 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
Sep 9 23:47:55.994662 kubelet[2636]: I0909 23:47:55.994544 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/48bbfc5ff756c8373a1e2ceb757171bd-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"48bbfc5ff756c8373a1e2ceb757171bd\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 23:47:55.994662 kubelet[2636]: I0909 23:47:55.994587 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 23:47:55.994662 kubelet[2636]: I0909 23:47:55.994608 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 23:47:55.994662 kubelet[2636]: I0909 23:47:55.994640 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a9176403b596d0b29ae8ad12d635226d-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"a9176403b596d0b29ae8ad12d635226d\") " pod="kube-system/kube-scheduler-localhost"
Sep 9 23:47:55.995546 kubelet[2636]: I0909 23:47:55.995313 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/48bbfc5ff756c8373a1e2ceb757171bd-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"48bbfc5ff756c8373a1e2ceb757171bd\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 23:47:55.995546 kubelet[2636]: I0909 23:47:55.995349 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/48bbfc5ff756c8373a1e2ceb757171bd-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"48bbfc5ff756c8373a1e2ceb757171bd\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 23:47:55.995546 kubelet[2636]: I0909 23:47:55.995366 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 23:47:55.995546 kubelet[2636]: I0909 23:47:55.995382 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 23:47:55.995546 kubelet[2636]: I0909 23:47:55.995400 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a88c9297c136b0f15880bf567e89a977-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a88c9297c136b0f15880bf567e89a977\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 23:47:56.675461 kubelet[2636]: I0909 23:47:56.675429 2636 apiserver.go:52] "Watching apiserver"
Sep 9 23:47:56.698187 kubelet[2636]: I0909 23:47:56.698144 2636 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 9 23:47:56.719968 kubelet[2636]: I0909 23:47:56.719820 2636 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Sep 9 23:47:56.719968 kubelet[2636]: I0909 23:47:56.719935 2636 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Sep 9 23:47:56.724743 kubelet[2636]: E0909 23:47:56.724710 2636 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Sep 9 23:47:56.725068 kubelet[2636]: E0909 23:47:56.725044 2636 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
Sep 9 23:47:56.745610 kubelet[2636]: I0909 23:47:56.745545 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.745527562 podStartE2EDuration="1.745527562s" podCreationTimestamp="2025-09-09 23:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:47:56.738002544 +0000 UTC m=+1.127306244" watchObservedRunningTime="2025-09-09 23:47:56.745527562 +0000 UTC m=+1.134831302"
Sep 9 23:47:56.752602 kubelet[2636]: I0909 23:47:56.752514 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.752496616 podStartE2EDuration="1.752496616s" podCreationTimestamp="2025-09-09 23:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:47:56.745978689 +0000 UTC m=+1.135282469" watchObservedRunningTime="2025-09-09 23:47:56.752496616 +0000 UTC m=+1.141800356"
Sep 9 23:47:56.760948 kubelet[2636]: I0909 23:47:56.760679 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.760661737 podStartE2EDuration="1.760661737s" podCreationTimestamp="2025-09-09 23:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:47:56.752719337 +0000 UTC m=+1.142023077" watchObservedRunningTime="2025-09-09 23:47:56.760661737 +0000 UTC m=+1.149965477"
Sep 9 23:48:02.518604 systemd[1]: Created slice kubepods-besteffort-podddae4b92_c099_4322_9baa_82769109705e.slice - libcontainer container kubepods-besteffort-podddae4b92_c099_4322_9baa_82769109705e.slice.
Sep 9 23:48:02.528313 kubelet[2636]: I0909 23:48:02.528272 2636 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 9 23:48:02.529736 containerd[1495]: time="2025-09-09T23:48:02.529689938Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 9 23:48:02.530587 kubelet[2636]: I0909 23:48:02.530474 2636 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 9 23:48:02.539140 kubelet[2636]: I0909 23:48:02.539091 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ddae4b92-c099-4322-9baa-82769109705e-xtables-lock\") pod \"kube-proxy-28fsf\" (UID: \"ddae4b92-c099-4322-9baa-82769109705e\") " pod="kube-system/kube-proxy-28fsf"
Sep 9 23:48:02.539299 kubelet[2636]: I0909 23:48:02.539136 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ddae4b92-c099-4322-9baa-82769109705e-lib-modules\") pod \"kube-proxy-28fsf\" (UID: \"ddae4b92-c099-4322-9baa-82769109705e\") " pod="kube-system/kube-proxy-28fsf"
Sep 9 23:48:02.539365 kubelet[2636]: I0909 23:48:02.539331 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hkf6\" (UniqueName: \"kubernetes.io/projected/ddae4b92-c099-4322-9baa-82769109705e-kube-api-access-9hkf6\") pod \"kube-proxy-28fsf\" (UID: \"ddae4b92-c099-4322-9baa-82769109705e\") " pod="kube-system/kube-proxy-28fsf"
Sep 9 23:48:02.539442 kubelet[2636]: I0909 23:48:02.539366 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ddae4b92-c099-4322-9baa-82769109705e-kube-proxy\") pod \"kube-proxy-28fsf\" (UID: \"ddae4b92-c099-4322-9baa-82769109705e\") " pod="kube-system/kube-proxy-28fsf"
Sep 9 23:48:02.651855 kubelet[2636]: E0909 23:48:02.651787 2636 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
Sep 9 23:48:02.651855 kubelet[2636]: E0909 23:48:02.651845 2636 projected.go:194] Error preparing data for projected volume kube-api-access-9hkf6 for pod kube-system/kube-proxy-28fsf: configmap "kube-root-ca.crt" not found
Sep 9 23:48:02.652040 kubelet[2636]: E0909 23:48:02.651911 2636 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ddae4b92-c099-4322-9baa-82769109705e-kube-api-access-9hkf6 podName:ddae4b92-c099-4322-9baa-82769109705e nodeName:}" failed. No retries permitted until 2025-09-09 23:48:03.151888976 +0000 UTC m=+7.541192716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9hkf6" (UniqueName: "kubernetes.io/projected/ddae4b92-c099-4322-9baa-82769109705e-kube-api-access-9hkf6") pod "kube-proxy-28fsf" (UID: "ddae4b92-c099-4322-9baa-82769109705e") : configmap "kube-root-ca.crt" not found
Sep 9 23:48:03.439412 containerd[1495]: time="2025-09-09T23:48:03.439341981Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-28fsf,Uid:ddae4b92-c099-4322-9baa-82769109705e,Namespace:kube-system,Attempt:0,}"
Sep 9 23:48:03.457069 containerd[1495]: time="2025-09-09T23:48:03.457013393Z" level=info msg="connecting to shim 54e119b559774c00477d624c60c5caeae62f8d2edfce0f0372df6b083ae3d461" address="unix:///run/containerd/s/85f4a2a925c2e6f1deae78a536684557c022fe6f361ca6103929b721f55c5bc0" namespace=k8s.io protocol=ttrpc version=3
Sep 9 23:48:03.482028 systemd[1]: Started cri-containerd-54e119b559774c00477d624c60c5caeae62f8d2edfce0f0372df6b083ae3d461.scope - libcontainer container 54e119b559774c00477d624c60c5caeae62f8d2edfce0f0372df6b083ae3d461.
Sep 9 23:48:03.506816 containerd[1495]: time="2025-09-09T23:48:03.506777301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-28fsf,Uid:ddae4b92-c099-4322-9baa-82769109705e,Namespace:kube-system,Attempt:0,} returns sandbox id \"54e119b559774c00477d624c60c5caeae62f8d2edfce0f0372df6b083ae3d461\""
Sep 9 23:48:03.510905 containerd[1495]: time="2025-09-09T23:48:03.510856388Z" level=info msg="CreateContainer within sandbox \"54e119b559774c00477d624c60c5caeae62f8d2edfce0f0372df6b083ae3d461\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 9 23:48:03.528067 containerd[1495]: time="2025-09-09T23:48:03.527887028Z" level=info msg="Container 9308e04a57932310329449792ae25b7360686894498cffe102baec657f9ffe02: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:48:03.541727 containerd[1495]: time="2025-09-09T23:48:03.541661939Z" level=info msg="CreateContainer within sandbox \"54e119b559774c00477d624c60c5caeae62f8d2edfce0f0372df6b083ae3d461\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9308e04a57932310329449792ae25b7360686894498cffe102baec657f9ffe02\""
Sep 9 23:48:03.543589 containerd[1495]: time="2025-09-09T23:48:03.542920389Z" level=info msg="StartContainer for \"9308e04a57932310329449792ae25b7360686894498cffe102baec657f9ffe02\""
Sep 9 23:48:03.546884 containerd[1495]: time="2025-09-09T23:48:03.546820811Z" level=info msg="connecting to shim 9308e04a57932310329449792ae25b7360686894498cffe102baec657f9ffe02" address="unix:///run/containerd/s/85f4a2a925c2e6f1deae78a536684557c022fe6f361ca6103929b721f55c5bc0" protocol=ttrpc version=3
Sep 9 23:48:03.572696 systemd[1]: Started cri-containerd-9308e04a57932310329449792ae25b7360686894498cffe102baec657f9ffe02.scope - libcontainer container 9308e04a57932310329449792ae25b7360686894498cffe102baec657f9ffe02.
Sep 9 23:48:03.595086 systemd[1]: Created slice kubepods-besteffort-pod2157fd1c_e5d5_43eb_982a_cdd60f287a7c.slice - libcontainer container kubepods-besteffort-pod2157fd1c_e5d5_43eb_982a_cdd60f287a7c.slice.
Sep 9 23:48:03.622307 containerd[1495]: time="2025-09-09T23:48:03.622267939Z" level=info msg="StartContainer for \"9308e04a57932310329449792ae25b7360686894498cffe102baec657f9ffe02\" returns successfully"
Sep 9 23:48:03.647180 kubelet[2636]: I0909 23:48:03.647085 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9nzb\" (UniqueName: \"kubernetes.io/projected/2157fd1c-e5d5-43eb-982a-cdd60f287a7c-kube-api-access-v9nzb\") pod \"tigera-operator-755d956888-k4w62\" (UID: \"2157fd1c-e5d5-43eb-982a-cdd60f287a7c\") " pod="tigera-operator/tigera-operator-755d956888-k4w62"
Sep 9 23:48:03.647180 kubelet[2636]: I0909 23:48:03.647133 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2157fd1c-e5d5-43eb-982a-cdd60f287a7c-var-lib-calico\") pod \"tigera-operator-755d956888-k4w62\" (UID: \"2157fd1c-e5d5-43eb-982a-cdd60f287a7c\") " pod="tigera-operator/tigera-operator-755d956888-k4w62"
Sep 9 23:48:03.752702 kubelet[2636]: I0909 23:48:03.752330 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-28fsf" podStartSLOduration=1.752300653 podStartE2EDuration="1.752300653s" podCreationTimestamp="2025-09-09 23:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:48:03.751722758 +0000 UTC m=+8.141026458" watchObservedRunningTime="2025-09-09 23:48:03.752300653 +0000 UTC m=+8.141604393"
Sep 9 23:48:03.899718 containerd[1495]: time="2025-09-09T23:48:03.899663861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-k4w62,Uid:2157fd1c-e5d5-43eb-982a-cdd60f287a7c,Namespace:tigera-operator,Attempt:0,}"
Sep 9 23:48:03.920583 containerd[1495]: time="2025-09-09T23:48:03.920534208Z" level=info msg="connecting to shim 08a4cd9494c4eb4fc4b04a3cbed8a56671e60a1534749fd38216bc8c1be1f0a0" address="unix:///run/containerd/s/19ee1b5ff578af579d996a9602d5f17014ffce9b1c2f6c57573f526668ed0f9f" namespace=k8s.io protocol=ttrpc version=3
Sep 9 23:48:03.947292 systemd[1]: Started cri-containerd-08a4cd9494c4eb4fc4b04a3cbed8a56671e60a1534749fd38216bc8c1be1f0a0.scope - libcontainer container 08a4cd9494c4eb4fc4b04a3cbed8a56671e60a1534749fd38216bc8c1be1f0a0.
Sep 9 23:48:03.985304 containerd[1495]: time="2025-09-09T23:48:03.985266641Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-k4w62,Uid:2157fd1c-e5d5-43eb-982a-cdd60f287a7c,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"08a4cd9494c4eb4fc4b04a3cbed8a56671e60a1534749fd38216bc8c1be1f0a0\""
Sep 9 23:48:03.988040 containerd[1495]: time="2025-09-09T23:48:03.987956521Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 9 23:48:05.120209 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1085852204.mount: Deactivated successfully.
Sep 9 23:48:05.669689 containerd[1495]: time="2025-09-09T23:48:05.669628457Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365"
Sep 9 23:48:05.672998 containerd[1495]: time="2025-09-09T23:48:05.672953987Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 1.684912177s"
Sep 9 23:48:05.672998 containerd[1495]: time="2025-09-09T23:48:05.672995769Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\""
Sep 9 23:48:05.675743 containerd[1495]: time="2025-09-09T23:48:05.675565666Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:05.676599 containerd[1495]: time="2025-09-09T23:48:05.676406583Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:05.677753 containerd[1495]: time="2025-09-09T23:48:05.677724149Z" level=info msg="CreateContainer within sandbox \"08a4cd9494c4eb4fc4b04a3cbed8a56671e60a1534749fd38216bc8c1be1f0a0\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 9 23:48:05.678044 containerd[1495]: time="2025-09-09T23:48:05.678021623Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:05.687550 containerd[1495]: time="2025-09-09T23:48:05.687411508Z" level=info msg="Container eaab9f111773d6d81a22320aaa200e2a698ca8f1198070429b7aa021da6934de: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:48:05.692313 containerd[1495]: time="2025-09-09T23:48:05.692259430Z" level=info msg="CreateContainer within sandbox \"08a4cd9494c4eb4fc4b04a3cbed8a56671e60a1534749fd38216bc8c1be1f0a0\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"eaab9f111773d6d81a22320aaa200e2a698ca8f1198070429b7aa021da6934de\""
Sep 9 23:48:05.692769 containerd[1495]: time="2025-09-09T23:48:05.692730315Z" level=info msg="StartContainer for \"eaab9f111773d6d81a22320aaa200e2a698ca8f1198070429b7aa021da6934de\""
Sep 9 23:48:05.694569 containerd[1495]: time="2025-09-09T23:48:05.694537375Z" level=info msg="connecting to shim eaab9f111773d6d81a22320aaa200e2a698ca8f1198070429b7aa021da6934de" address="unix:///run/containerd/s/19ee1b5ff578af579d996a9602d5f17014ffce9b1c2f6c57573f526668ed0f9f" protocol=ttrpc version=3
Sep 9 23:48:05.719044 systemd[1]: Started cri-containerd-eaab9f111773d6d81a22320aaa200e2a698ca8f1198070429b7aa021da6934de.scope - libcontainer container eaab9f111773d6d81a22320aaa200e2a698ca8f1198070429b7aa021da6934de.
Sep 9 23:48:05.750545 containerd[1495]: time="2025-09-09T23:48:05.750259084Z" level=info msg="StartContainer for \"eaab9f111773d6d81a22320aaa200e2a698ca8f1198070429b7aa021da6934de\" returns successfully"
Sep 9 23:48:06.764568 kubelet[2636]: I0909 23:48:06.764363 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-k4w62" podStartSLOduration=2.076127489 podStartE2EDuration="3.764345698s" podCreationTimestamp="2025-09-09 23:48:03 +0000 UTC" firstStartedPulling="2025-09-09 23:48:03.986820142 +0000 UTC m=+8.376123842" lastFinishedPulling="2025-09-09 23:48:05.675038311 +0000 UTC m=+10.064342051" observedRunningTime="2025-09-09 23:48:06.763580601 +0000 UTC m=+11.152884341" watchObservedRunningTime="2025-09-09 23:48:06.764345698 +0000 UTC m=+11.153649398"
Sep 9 23:48:11.095812 sudo[1704]: pam_unix(sudo:session): session closed for user root
Sep 9 23:48:11.097509 sshd[1703]: Connection closed by 10.0.0.1 port 55962
Sep 9 23:48:11.098136 sshd-session[1700]: pam_unix(sshd:session): session closed for user core
Sep 9 23:48:11.102661 systemd[1]: sshd@6-10.0.0.67:22-10.0.0.1:55962.service: Deactivated successfully.
Sep 9 23:48:11.105758 systemd[1]: session-7.scope: Deactivated successfully.
Sep 9 23:48:11.107905 systemd[1]: session-7.scope: Consumed 6.330s CPU time, 223.1M memory peak.
Sep 9 23:48:11.109954 systemd-logind[1477]: Session 7 logged out. Waiting for processes to exit.
Sep 9 23:48:11.112763 systemd-logind[1477]: Removed session 7.
Sep 9 23:48:11.538440 update_engine[1480]: I20250909 23:48:11.538296 1480 update_attempter.cc:509] Updating boot flags...
Sep 9 23:48:15.495975 systemd[1]: Created slice kubepods-besteffort-pod0ea53fae_9627_42aa_b7f8_910c626f0857.slice - libcontainer container kubepods-besteffort-pod0ea53fae_9627_42aa_b7f8_910c626f0857.slice.
Sep 9 23:48:15.528208 kubelet[2636]: I0909 23:48:15.528162 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znw8w\" (UniqueName: \"kubernetes.io/projected/0ea53fae-9627-42aa-b7f8-910c626f0857-kube-api-access-znw8w\") pod \"calico-typha-9f47566d-r42x4\" (UID: \"0ea53fae-9627-42aa-b7f8-910c626f0857\") " pod="calico-system/calico-typha-9f47566d-r42x4"
Sep 9 23:48:15.528673 kubelet[2636]: I0909 23:48:15.528274 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0ea53fae-9627-42aa-b7f8-910c626f0857-typha-certs\") pod \"calico-typha-9f47566d-r42x4\" (UID: \"0ea53fae-9627-42aa-b7f8-910c626f0857\") " pod="calico-system/calico-typha-9f47566d-r42x4"
Sep 9 23:48:15.528673 kubelet[2636]: I0909 23:48:15.528363 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea53fae-9627-42aa-b7f8-910c626f0857-tigera-ca-bundle\") pod \"calico-typha-9f47566d-r42x4\" (UID: \"0ea53fae-9627-42aa-b7f8-910c626f0857\") " pod="calico-system/calico-typha-9f47566d-r42x4"
Sep 9 23:48:15.801683 containerd[1495]: time="2025-09-09T23:48:15.801137710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9f47566d-r42x4,Uid:0ea53fae-9627-42aa-b7f8-910c626f0857,Namespace:calico-system,Attempt:0,}"
Sep 9 23:48:15.842978 containerd[1495]: time="2025-09-09T23:48:15.842038173Z" level=info msg="connecting to shim 42934fe5fb369729c157eb0daec773587fa79353e2149f70517265d0aed715d5" address="unix:///run/containerd/s/60280ebbd623f75b5a4007242efd4aab9bbd18c0cda767908d079dc9c109f27a" namespace=k8s.io protocol=ttrpc version=3
Sep 9 23:48:15.869367 systemd[1]: Created slice kubepods-besteffort-podd44a141c_8fe2_430a_9b36_1a9ccbe26aa9.slice - libcontainer container kubepods-besteffort-podd44a141c_8fe2_430a_9b36_1a9ccbe26aa9.slice.
Sep 9 23:48:15.907349 systemd[1]: Started cri-containerd-42934fe5fb369729c157eb0daec773587fa79353e2149f70517265d0aed715d5.scope - libcontainer container 42934fe5fb369729c157eb0daec773587fa79353e2149f70517265d0aed715d5.
Sep 9 23:48:15.931079 kubelet[2636]: I0909 23:48:15.931014 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d44a141c-8fe2-430a-9b36-1a9ccbe26aa9-lib-modules\") pod \"calico-node-hqsl6\" (UID: \"d44a141c-8fe2-430a-9b36-1a9ccbe26aa9\") " pod="calico-system/calico-node-hqsl6"
Sep 9 23:48:15.931079 kubelet[2636]: I0909 23:48:15.931056 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d44a141c-8fe2-430a-9b36-1a9ccbe26aa9-var-lib-calico\") pod \"calico-node-hqsl6\" (UID: \"d44a141c-8fe2-430a-9b36-1a9ccbe26aa9\") " pod="calico-system/calico-node-hqsl6"
Sep 9 23:48:15.931079 kubelet[2636]: I0909 23:48:15.931075 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d44a141c-8fe2-430a-9b36-1a9ccbe26aa9-cni-net-dir\") pod \"calico-node-hqsl6\" (UID: \"d44a141c-8fe2-430a-9b36-1a9ccbe26aa9\") " pod="calico-system/calico-node-hqsl6"
Sep 9 23:48:15.931403 kubelet[2636]: I0909 23:48:15.931091 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d44a141c-8fe2-430a-9b36-1a9ccbe26aa9-flexvol-driver-host\") pod \"calico-node-hqsl6\" (UID: \"d44a141c-8fe2-430a-9b36-1a9ccbe26aa9\") " pod="calico-system/calico-node-hqsl6"
Sep 9 23:48:15.931403 kubelet[2636]: I0909 23:48:15.931112 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d44a141c-8fe2-430a-9b36-1a9ccbe26aa9-policysync\") pod \"calico-node-hqsl6\" (UID: \"d44a141c-8fe2-430a-9b36-1a9ccbe26aa9\") " pod="calico-system/calico-node-hqsl6"
Sep 9 23:48:15.931403 kubelet[2636]: I0909 23:48:15.931126 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d44a141c-8fe2-430a-9b36-1a9ccbe26aa9-xtables-lock\") pod \"calico-node-hqsl6\" (UID: \"d44a141c-8fe2-430a-9b36-1a9ccbe26aa9\") " pod="calico-system/calico-node-hqsl6"
Sep 9 23:48:15.931403 kubelet[2636]: I0909 23:48:15.931146 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d44a141c-8fe2-430a-9b36-1a9ccbe26aa9-cni-bin-dir\") pod \"calico-node-hqsl6\" (UID: \"d44a141c-8fe2-430a-9b36-1a9ccbe26aa9\") " pod="calico-system/calico-node-hqsl6"
Sep 9 23:48:15.931403 kubelet[2636]: I0909 23:48:15.931163 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d44a141c-8fe2-430a-9b36-1a9ccbe26aa9-cni-log-dir\") pod \"calico-node-hqsl6\" (UID: \"d44a141c-8fe2-430a-9b36-1a9ccbe26aa9\") " pod="calico-system/calico-node-hqsl6"
Sep 9 23:48:15.931523 kubelet[2636]: I0909 23:48:15.931198 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d44a141c-8fe2-430a-9b36-1a9ccbe26aa9-node-certs\") pod \"calico-node-hqsl6\" (UID: \"d44a141c-8fe2-430a-9b36-1a9ccbe26aa9\") " pod="calico-system/calico-node-hqsl6"
Sep 9 23:48:15.931523 kubelet[2636]: I0909 23:48:15.931233 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d44a141c-8fe2-430a-9b36-1a9ccbe26aa9-tigera-ca-bundle\") pod \"calico-node-hqsl6\" (UID: \"d44a141c-8fe2-430a-9b36-1a9ccbe26aa9\") " pod="calico-system/calico-node-hqsl6"
Sep 9 23:48:15.931523 kubelet[2636]: I0909 23:48:15.931253 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x45sp\" (UniqueName: \"kubernetes.io/projected/d44a141c-8fe2-430a-9b36-1a9ccbe26aa9-kube-api-access-x45sp\") pod \"calico-node-hqsl6\" (UID: \"d44a141c-8fe2-430a-9b36-1a9ccbe26aa9\") " pod="calico-system/calico-node-hqsl6"
Sep 9 23:48:15.931523 kubelet[2636]: I0909 23:48:15.931286 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d44a141c-8fe2-430a-9b36-1a9ccbe26aa9-var-run-calico\") pod \"calico-node-hqsl6\" (UID: \"d44a141c-8fe2-430a-9b36-1a9ccbe26aa9\") " pod="calico-system/calico-node-hqsl6"
Sep 9 23:48:15.944684 containerd[1495]: time="2025-09-09T23:48:15.944638380Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9f47566d-r42x4,Uid:0ea53fae-9627-42aa-b7f8-910c626f0857,Namespace:calico-system,Attempt:0,} returns sandbox id \"42934fe5fb369729c157eb0daec773587fa79353e2149f70517265d0aed715d5\""
Sep 9 23:48:15.948722 containerd[1495]: time="2025-09-09T23:48:15.948567772Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 9 23:48:16.033085 kubelet[2636]: E0909 23:48:16.032910 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.033085 kubelet[2636]: W0909 23:48:16.032935 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.033085 kubelet[2636]: E0909 23:48:16.033055 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.033307 kubelet[2636]: E0909 23:48:16.033231 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.033307 kubelet[2636]: W0909 23:48:16.033240 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.033307 kubelet[2636]: E0909 23:48:16.033258 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.033410 kubelet[2636]: E0909 23:48:16.033399 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.033410 kubelet[2636]: W0909 23:48:16.033409 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.033471 kubelet[2636]: E0909 23:48:16.033419 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.033603 kubelet[2636]: E0909 23:48:16.033591 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.033603 kubelet[2636]: W0909 23:48:16.033603 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.033654 kubelet[2636]: E0909 23:48:16.033613 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.035003 kubelet[2636]: E0909 23:48:16.034740 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.035003 kubelet[2636]: W0909 23:48:16.034763 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.035003 kubelet[2636]: E0909 23:48:16.034784 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.035003 kubelet[2636]: E0909 23:48:16.034970 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.035003 kubelet[2636]: W0909 23:48:16.034979 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.035151 kubelet[2636]: E0909 23:48:16.035104 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.035151 kubelet[2636]: W0909 23:48:16.035111 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.035151 kubelet[2636]: E0909 23:48:16.035121 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.037891 kubelet[2636]: E0909 23:48:16.035235 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.037891 kubelet[2636]: W0909 23:48:16.035247 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.037891 kubelet[2636]: E0909 23:48:16.035254 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.037891 kubelet[2636]: E0909 23:48:16.035362 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.037891 kubelet[2636]: W0909 23:48:16.035369 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.037891 kubelet[2636]: E0909 23:48:16.035375 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.037891 kubelet[2636]: E0909 23:48:16.035531 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.037891 kubelet[2636]: W0909 23:48:16.035541 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.037891 kubelet[2636]: E0909 23:48:16.035550 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.037891 kubelet[2636]: E0909 23:48:16.035653 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.038995 kubelet[2636]: E0909 23:48:16.038850 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.038995 kubelet[2636]: W0909 23:48:16.038866 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.038995 kubelet[2636]: E0909 23:48:16.038884 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.040969 kubelet[2636]: E0909 23:48:16.040950 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.041086 kubelet[2636]: W0909 23:48:16.041043 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.041235 kubelet[2636]: E0909 23:48:16.041185 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.041457 kubelet[2636]: E0909 23:48:16.041400 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.041457 kubelet[2636]: W0909 23:48:16.041413 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.041457 kubelet[2636]: E0909 23:48:16.041442 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.041767 kubelet[2636]: E0909 23:48:16.041716 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.041767 kubelet[2636]: W0909 23:48:16.041728 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.041767 kubelet[2636]: E0909 23:48:16.041740 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.045779 kubelet[2636]: E0909 23:48:16.045710 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.045779 kubelet[2636]: W0909 23:48:16.045730 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.045779 kubelet[2636]: E0909 23:48:16.045744 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.154043 kubelet[2636]: E0909 23:48:16.153994 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9wzpb" podUID="22c99b86-d4b0-412a-a0e1-8a13f3d0c130" Sep 9 23:48:16.173894 containerd[1495]: time="2025-09-09T23:48:16.173855058Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hqsl6,Uid:d44a141c-8fe2-430a-9b36-1a9ccbe26aa9,Namespace:calico-system,Attempt:0,}" Sep 9 23:48:16.189856 containerd[1495]: time="2025-09-09T23:48:16.189249545Z" level=info msg="connecting to shim 0852db9e71fce8e98aa72e8d99eeca7179497cd07518a02cf8d83afe7d1b5925" address="unix:///run/containerd/s/32cc4237ebc24cfd6c4ea6f632479d7480db8fadeedcdbb53980314c5e6080aa" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:48:16.215902 kubelet[2636]: E0909 23:48:16.215855 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.215902 kubelet[2636]: W0909 23:48:16.215884 2636 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.216066 kubelet[2636]: E0909 23:48:16.215910 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.216616 kubelet[2636]: E0909 23:48:16.216084 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.216616 kubelet[2636]: W0909 23:48:16.216101 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.216616 kubelet[2636]: E0909 23:48:16.216151 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.216616 kubelet[2636]: E0909 23:48:16.216352 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.216616 kubelet[2636]: W0909 23:48:16.216362 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.216616 kubelet[2636]: E0909 23:48:16.216372 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.216616 kubelet[2636]: E0909 23:48:16.216561 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.216616 kubelet[2636]: W0909 23:48:16.216571 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.216616 kubelet[2636]: E0909 23:48:16.216580 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.217486 kubelet[2636]: E0909 23:48:16.216814 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.217486 kubelet[2636]: W0909 23:48:16.216823 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.217486 kubelet[2636]: E0909 23:48:16.216848 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.217735 kubelet[2636]: E0909 23:48:16.217690 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.217735 kubelet[2636]: W0909 23:48:16.217714 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.217735 kubelet[2636]: E0909 23:48:16.217728 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.218085 kubelet[2636]: E0909 23:48:16.217934 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.218085 kubelet[2636]: W0909 23:48:16.217944 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.218085 kubelet[2636]: E0909 23:48:16.217952 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.218178 kubelet[2636]: E0909 23:48:16.218088 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.218178 kubelet[2636]: W0909 23:48:16.218097 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.218178 kubelet[2636]: E0909 23:48:16.218105 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.218406 kubelet[2636]: E0909 23:48:16.218389 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.218406 kubelet[2636]: W0909 23:48:16.218401 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.218459 kubelet[2636]: E0909 23:48:16.218409 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.218546 kubelet[2636]: E0909 23:48:16.218532 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.218546 kubelet[2636]: W0909 23:48:16.218544 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.218643 kubelet[2636]: E0909 23:48:16.218552 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.218768 kubelet[2636]: E0909 23:48:16.218756 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.218768 kubelet[2636]: W0909 23:48:16.218766 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.218859 kubelet[2636]: E0909 23:48:16.218774 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.218968 kubelet[2636]: E0909 23:48:16.218954 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.218968 kubelet[2636]: W0909 23:48:16.218967 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.228687 kubelet[2636]: E0909 23:48:16.218976 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.230126 kubelet[2636]: E0909 23:48:16.229028 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.230126 kubelet[2636]: W0909 23:48:16.229047 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.230126 kubelet[2636]: E0909 23:48:16.229066 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.230126 kubelet[2636]: E0909 23:48:16.229340 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.230126 kubelet[2636]: W0909 23:48:16.229353 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.230126 kubelet[2636]: E0909 23:48:16.229363 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.230126 kubelet[2636]: E0909 23:48:16.229513 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.230126 kubelet[2636]: W0909 23:48:16.229536 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.230126 kubelet[2636]: E0909 23:48:16.229545 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.230126 kubelet[2636]: E0909 23:48:16.229685 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.230393 kubelet[2636]: W0909 23:48:16.229694 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.230393 kubelet[2636]: E0909 23:48:16.229701 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.230619 kubelet[2636]: E0909 23:48:16.230510 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.230619 kubelet[2636]: W0909 23:48:16.230578 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.230619 kubelet[2636]: E0909 23:48:16.230593 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.230823 kubelet[2636]: E0909 23:48:16.230768 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.231048 kubelet[2636]: W0909 23:48:16.231007 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.231048 kubelet[2636]: E0909 23:48:16.231037 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.231368 kubelet[2636]: E0909 23:48:16.231315 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.231368 kubelet[2636]: W0909 23:48:16.231331 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.231368 kubelet[2636]: E0909 23:48:16.231341 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.231631 kubelet[2636]: E0909 23:48:16.231590 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.231631 kubelet[2636]: W0909 23:48:16.231605 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.231631 kubelet[2636]: E0909 23:48:16.231616 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.234072 systemd[1]: Started cri-containerd-0852db9e71fce8e98aa72e8d99eeca7179497cd07518a02cf8d83afe7d1b5925.scope - libcontainer container 0852db9e71fce8e98aa72e8d99eeca7179497cd07518a02cf8d83afe7d1b5925. Sep 9 23:48:16.235310 kubelet[2636]: E0909 23:48:16.235269 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.235310 kubelet[2636]: W0909 23:48:16.235291 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.235310 kubelet[2636]: E0909 23:48:16.235306 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.237249 kubelet[2636]: I0909 23:48:16.237206 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/22c99b86-d4b0-412a-a0e1-8a13f3d0c130-varrun\") pod \"csi-node-driver-9wzpb\" (UID: \"22c99b86-d4b0-412a-a0e1-8a13f3d0c130\") " pod="calico-system/csi-node-driver-9wzpb" Sep 9 23:48:16.237886 kubelet[2636]: E0909 23:48:16.237860 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.237886 kubelet[2636]: W0909 23:48:16.237879 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.238131 kubelet[2636]: E0909 23:48:16.237921 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.238556 kubelet[2636]: I0909 23:48:16.237947 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/22c99b86-d4b0-412a-a0e1-8a13f3d0c130-socket-dir\") pod \"csi-node-driver-9wzpb\" (UID: \"22c99b86-d4b0-412a-a0e1-8a13f3d0c130\") " pod="calico-system/csi-node-driver-9wzpb" Sep 9 23:48:16.238713 kubelet[2636]: E0909 23:48:16.238644 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.238713 kubelet[2636]: W0909 23:48:16.238663 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.238713 kubelet[2636]: E0909 23:48:16.238683 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.241424 kubelet[2636]: E0909 23:48:16.240917 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.241424 kubelet[2636]: W0909 23:48:16.240939 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.241424 kubelet[2636]: E0909 23:48:16.240963 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.242983 kubelet[2636]: E0909 23:48:16.242739 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.243399 kubelet[2636]: W0909 23:48:16.243374 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.243731 kubelet[2636]: E0909 23:48:16.243711 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.243881 kubelet[2636]: I0909 23:48:16.243820 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/22c99b86-d4b0-412a-a0e1-8a13f3d0c130-registration-dir\") pod \"csi-node-driver-9wzpb\" (UID: \"22c99b86-d4b0-412a-a0e1-8a13f3d0c130\") " pod="calico-system/csi-node-driver-9wzpb" Sep 9 23:48:16.244198 kubelet[2636]: E0909 23:48:16.244148 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.244198 kubelet[2636]: W0909 23:48:16.244170 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.244198 kubelet[2636]: E0909 23:48:16.244189 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.244375 kubelet[2636]: E0909 23:48:16.244359 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.244422 kubelet[2636]: W0909 23:48:16.244374 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.244422 kubelet[2636]: E0909 23:48:16.244386 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.244710 kubelet[2636]: E0909 23:48:16.244690 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.244710 kubelet[2636]: W0909 23:48:16.244706 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.244801 kubelet[2636]: E0909 23:48:16.244719 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.245786 kubelet[2636]: E0909 23:48:16.245738 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.245786 kubelet[2636]: W0909 23:48:16.245758 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.246082 kubelet[2636]: E0909 23:48:16.245851 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.246589 kubelet[2636]: E0909 23:48:16.246084 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.246589 kubelet[2636]: W0909 23:48:16.246097 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.246589 kubelet[2636]: E0909 23:48:16.246109 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.246589 kubelet[2636]: I0909 23:48:16.246137 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22c99b86-d4b0-412a-a0e1-8a13f3d0c130-kubelet-dir\") pod \"csi-node-driver-9wzpb\" (UID: \"22c99b86-d4b0-412a-a0e1-8a13f3d0c130\") " pod="calico-system/csi-node-driver-9wzpb" Sep 9 23:48:16.246589 kubelet[2636]: E0909 23:48:16.246320 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.246589 kubelet[2636]: W0909 23:48:16.246330 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.246589 kubelet[2636]: E0909 23:48:16.246342 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:16.246589 kubelet[2636]: I0909 23:48:16.246357 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfkn7\" (UniqueName: \"kubernetes.io/projected/22c99b86-d4b0-412a-a0e1-8a13f3d0c130-kube-api-access-rfkn7\") pod \"csi-node-driver-9wzpb\" (UID: \"22c99b86-d4b0-412a-a0e1-8a13f3d0c130\") " pod="calico-system/csi-node-driver-9wzpb" Sep 9 23:48:16.247273 kubelet[2636]: E0909 23:48:16.247006 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.247273 kubelet[2636]: W0909 23:48:16.247023 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.247273 kubelet[2636]: E0909 23:48:16.247045 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:16.248962 kubelet[2636]: E0909 23:48:16.248932 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:16.248962 kubelet[2636]: W0909 23:48:16.248951 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:16.249066 kubelet[2636]: E0909 23:48:16.248969 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 23:48:16.249188 kubelet[2636]: E0909 23:48:16.249164 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.249188 kubelet[2636]: W0909 23:48:16.249178 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.249243 kubelet[2636]: E0909 23:48:16.249189 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.249430 kubelet[2636]: E0909 23:48:16.249397 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.249430 kubelet[2636]: W0909 23:48:16.249412 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.249430 kubelet[2636]: E0909 23:48:16.249423 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.272879 containerd[1495]: time="2025-09-09T23:48:16.272413714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hqsl6,Uid:d44a141c-8fe2-430a-9b36-1a9ccbe26aa9,Namespace:calico-system,Attempt:0,} returns sandbox id \"0852db9e71fce8e98aa72e8d99eeca7179497cd07518a02cf8d83afe7d1b5925\""
Sep 9 23:48:16.347693 kubelet[2636]: E0909 23:48:16.347660 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.347693 kubelet[2636]: W0909 23:48:16.347685 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.347825 kubelet[2636]: E0909 23:48:16.347706 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.348011 kubelet[2636]: E0909 23:48:16.347993 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.348011 kubelet[2636]: W0909 23:48:16.348008 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.348076 kubelet[2636]: E0909 23:48:16.348025 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.348304 kubelet[2636]: E0909 23:48:16.348287 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.348304 kubelet[2636]: W0909 23:48:16.348299 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.348474 kubelet[2636]: E0909 23:48:16.348315 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.349175 kubelet[2636]: E0909 23:48:16.349156 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.349175 kubelet[2636]: W0909 23:48:16.349172 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.349261 kubelet[2636]: E0909 23:48:16.349205 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.349412 kubelet[2636]: E0909 23:48:16.349399 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.349412 kubelet[2636]: W0909 23:48:16.349411 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.349485 kubelet[2636]: E0909 23:48:16.349461 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.349605 kubelet[2636]: E0909 23:48:16.349592 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.349605 kubelet[2636]: W0909 23:48:16.349603 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.349669 kubelet[2636]: E0909 23:48:16.349639 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.349756 kubelet[2636]: E0909 23:48:16.349746 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.349756 kubelet[2636]: W0909 23:48:16.349756 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.349807 kubelet[2636]: E0909 23:48:16.349776 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.349898 kubelet[2636]: E0909 23:48:16.349888 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.349898 kubelet[2636]: W0909 23:48:16.349898 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.349959 kubelet[2636]: E0909 23:48:16.349920 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.350026 kubelet[2636]: E0909 23:48:16.350016 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.350026 kubelet[2636]: W0909 23:48:16.350026 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.350078 kubelet[2636]: E0909 23:48:16.350040 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.350173 kubelet[2636]: E0909 23:48:16.350163 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.350204 kubelet[2636]: W0909 23:48:16.350173 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.350204 kubelet[2636]: E0909 23:48:16.350186 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.350327 kubelet[2636]: E0909 23:48:16.350313 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.350327 kubelet[2636]: W0909 23:48:16.350322 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.350370 kubelet[2636]: E0909 23:48:16.350331 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.350511 kubelet[2636]: E0909 23:48:16.350501 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.350511 kubelet[2636]: W0909 23:48:16.350511 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.350569 kubelet[2636]: E0909 23:48:16.350523 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.350759 kubelet[2636]: E0909 23:48:16.350745 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.350790 kubelet[2636]: W0909 23:48:16.350766 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.350790 kubelet[2636]: E0909 23:48:16.350784 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.351918 kubelet[2636]: E0909 23:48:16.351897 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.352011 kubelet[2636]: W0909 23:48:16.351921 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.352011 kubelet[2636]: E0909 23:48:16.351941 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.352259 kubelet[2636]: E0909 23:48:16.352237 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.352259 kubelet[2636]: W0909 23:48:16.352252 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.352360 kubelet[2636]: E0909 23:48:16.352285 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.352444 kubelet[2636]: E0909 23:48:16.352413 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.352444 kubelet[2636]: W0909 23:48:16.352421 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.352488 kubelet[2636]: E0909 23:48:16.352451 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.352795 kubelet[2636]: E0909 23:48:16.352565 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.352795 kubelet[2636]: W0909 23:48:16.352578 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.352795 kubelet[2636]: E0909 23:48:16.352610 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.352795 kubelet[2636]: E0909 23:48:16.352753 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.352795 kubelet[2636]: W0909 23:48:16.352763 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.352795 kubelet[2636]: E0909 23:48:16.352798 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.353123 kubelet[2636]: E0909 23:48:16.352945 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.353123 kubelet[2636]: W0909 23:48:16.352954 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.353123 kubelet[2636]: E0909 23:48:16.352970 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.353341 kubelet[2636]: E0909 23:48:16.353200 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.353341 kubelet[2636]: W0909 23:48:16.353209 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.353341 kubelet[2636]: E0909 23:48:16.353229 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.353548 kubelet[2636]: E0909 23:48:16.353522 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.353548 kubelet[2636]: W0909 23:48:16.353534 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.353614 kubelet[2636]: E0909 23:48:16.353566 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.353743 kubelet[2636]: E0909 23:48:16.353730 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.353743 kubelet[2636]: W0909 23:48:16.353742 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.353789 kubelet[2636]: E0909 23:48:16.353760 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.353967 kubelet[2636]: E0909 23:48:16.353954 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.353967 kubelet[2636]: W0909 23:48:16.353966 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.354028 kubelet[2636]: E0909 23:48:16.354004 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.354125 kubelet[2636]: E0909 23:48:16.354115 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.354125 kubelet[2636]: W0909 23:48:16.354125 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.354171 kubelet[2636]: E0909 23:48:16.354133 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.354319 kubelet[2636]: E0909 23:48:16.354309 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.354319 kubelet[2636]: W0909 23:48:16.354319 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.354366 kubelet[2636]: E0909 23:48:16.354326 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.364153 kubelet[2636]: E0909 23:48:16.364120 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:16.364153 kubelet[2636]: W0909 23:48:16.364141 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:16.364153 kubelet[2636]: E0909 23:48:16.364160 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:16.902388 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2544339753.mount: Deactivated successfully.
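[Editor's note on the repeated kubelet errors above: the kubelet's FlexVolume probe executes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument "init" and expects a JSON status object on stdout; because the binary is missing, the empty output fails JSON unmarshalling, producing the driver-call.go/plugins.go triplets. The sketch below is illustrative only (it is not the real nodeagent~uds driver, which was likely left behind by an Istio node-agent setup); it shows the minimal init response the FlexVolume convention expects. The path and driver name come from the log; everything else is an assumption.]

```shell
#!/bin/sh
# Hypothetical minimal FlexVolume driver stub. The kubelet invokes
# "<driver> init" when probing the plugin directory and expects a JSON
# status object on stdout; an empty reply yields the
# "unexpected end of JSON input" errors seen in the log above.
flexvol_main() {
  case "$1" in
    init)
      # Report success and declare that no attach/detach support is needed.
      echo '{"status":"Success","capabilities":{"attach":false}}'
      ;;
    *)
      # Decline every other FlexVolume call (mount, unmount, ...).
      echo '{"status":"Not supported"}'
      return 1
      ;;
  esac
}

# Simulate the kubelet's probe call:
flexvol_main init
# prints {"status":"Success","capabilities":{"attach":false}}
```

Installing an executable like this at the probed path, or removing the stale nodeagent~uds directory, would silence the probe errors; as logged they are repetitive but harmless noise.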
Sep 9 23:48:17.687265 containerd[1495]: time="2025-09-09T23:48:17.687209224Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:17.687847 containerd[1495]: time="2025-09-09T23:48:17.687807395Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775"
Sep 9 23:48:17.688561 containerd[1495]: time="2025-09-09T23:48:17.688533562Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:17.692861 containerd[1495]: time="2025-09-09T23:48:17.692041885Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:17.699799 containerd[1495]: time="2025-09-09T23:48:17.699745128Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.750811881s"
Sep 9 23:48:17.699799 containerd[1495]: time="2025-09-09T23:48:17.699786940Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\""
Sep 9 23:48:17.701809 containerd[1495]: time="2025-09-09T23:48:17.701773028Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 9 23:48:17.708654 kubelet[2636]: E0909 23:48:17.708568 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9wzpb" podUID="22c99b86-d4b0-412a-a0e1-8a13f3d0c130"
Sep 9 23:48:17.721314 containerd[1495]: time="2025-09-09T23:48:17.721275523Z" level=info msg="CreateContainer within sandbox \"42934fe5fb369729c157eb0daec773587fa79353e2149f70517265d0aed715d5\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 9 23:48:17.734610 containerd[1495]: time="2025-09-09T23:48:17.734293925Z" level=info msg="Container b82045d9422ec06d041ef10eca48575f6c56d7e7bd8ef0df359b77f8900f0bf4: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:48:17.741007 containerd[1495]: time="2025-09-09T23:48:17.740952429Z" level=info msg="CreateContainer within sandbox \"42934fe5fb369729c157eb0daec773587fa79353e2149f70517265d0aed715d5\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"b82045d9422ec06d041ef10eca48575f6c56d7e7bd8ef0df359b77f8900f0bf4\""
Sep 9 23:48:17.741667 containerd[1495]: time="2025-09-09T23:48:17.741632344Z" level=info msg="StartContainer for \"b82045d9422ec06d041ef10eca48575f6c56d7e7bd8ef0df359b77f8900f0bf4\""
Sep 9 23:48:17.743031 containerd[1495]: time="2025-09-09T23:48:17.742997814Z" level=info msg="connecting to shim b82045d9422ec06d041ef10eca48575f6c56d7e7bd8ef0df359b77f8900f0bf4" address="unix:///run/containerd/s/60280ebbd623f75b5a4007242efd4aab9bbd18c0cda767908d079dc9c109f27a" protocol=ttrpc version=3
Sep 9 23:48:17.772081 systemd[1]: Started cri-containerd-b82045d9422ec06d041ef10eca48575f6c56d7e7bd8ef0df359b77f8900f0bf4.scope - libcontainer container b82045d9422ec06d041ef10eca48575f6c56d7e7bd8ef0df359b77f8900f0bf4.
Sep 9 23:48:17.811363 containerd[1495]: time="2025-09-09T23:48:17.811324789Z" level=info msg="StartContainer for \"b82045d9422ec06d041ef10eca48575f6c56d7e7bd8ef0df359b77f8900f0bf4\" returns successfully"
Sep 9 23:48:18.756782 containerd[1495]: time="2025-09-09T23:48:18.756109926Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:18.756782 containerd[1495]: time="2025-09-09T23:48:18.756746700Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814"
Sep 9 23:48:18.758135 containerd[1495]: time="2025-09-09T23:48:18.758107792Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:18.760016 containerd[1495]: time="2025-09-09T23:48:18.759976063Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:18.760730 containerd[1495]: time="2025-09-09T23:48:18.760706143Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.058893544s"
Sep 9 23:48:18.760824 containerd[1495]: time="2025-09-09T23:48:18.760801249Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\""
Sep 9 23:48:18.764886 containerd[1495]: time="2025-09-09T23:48:18.764857558Z" level=info msg="CreateContainer within sandbox \"0852db9e71fce8e98aa72e8d99eeca7179497cd07518a02cf8d83afe7d1b5925\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Sep 9 23:48:18.773986 containerd[1495]: time="2025-09-09T23:48:18.773943562Z" level=info msg="Container d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:48:18.780906 containerd[1495]: time="2025-09-09T23:48:18.780860373Z" level=info msg="CreateContainer within sandbox \"0852db9e71fce8e98aa72e8d99eeca7179497cd07518a02cf8d83afe7d1b5925\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32\""
Sep 9 23:48:18.781496 containerd[1495]: time="2025-09-09T23:48:18.781454055Z" level=info msg="StartContainer for \"d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32\""
Sep 9 23:48:18.783054 containerd[1495]: time="2025-09-09T23:48:18.783026685Z" level=info msg="connecting to shim d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32" address="unix:///run/containerd/s/32cc4237ebc24cfd6c4ea6f632479d7480db8fadeedcdbb53980314c5e6080aa" protocol=ttrpc version=3
Sep 9 23:48:18.814089 systemd[1]: Started cri-containerd-d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32.scope - libcontainer container d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32.
Sep 9 23:48:18.850408 kubelet[2636]: E0909 23:48:18.850321 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.850978 kubelet[2636]: W0909 23:48:18.850443 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.850978 kubelet[2636]: E0909 23:48:18.850470 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.850978 kubelet[2636]: E0909 23:48:18.850684 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.850978 kubelet[2636]: W0909 23:48:18.850694 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.850978 kubelet[2636]: E0909 23:48:18.850737 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.850978 kubelet[2636]: E0909 23:48:18.850900 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.850978 kubelet[2636]: W0909 23:48:18.850909 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.850978 kubelet[2636]: E0909 23:48:18.850918 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.851343 kubelet[2636]: E0909 23:48:18.851060 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.851343 kubelet[2636]: W0909 23:48:18.851069 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.851343 kubelet[2636]: E0909 23:48:18.851077 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.851343 kubelet[2636]: E0909 23:48:18.851218 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.851343 kubelet[2636]: W0909 23:48:18.851226 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.851343 kubelet[2636]: E0909 23:48:18.851234 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.851489 kubelet[2636]: E0909 23:48:18.851386 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.851489 kubelet[2636]: W0909 23:48:18.851395 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.851489 kubelet[2636]: E0909 23:48:18.851404 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.852221 kubelet[2636]: E0909 23:48:18.851587 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.852221 kubelet[2636]: W0909 23:48:18.851692 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.852221 kubelet[2636]: E0909 23:48:18.851701 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.852221 kubelet[2636]: E0909 23:48:18.851895 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.852221 kubelet[2636]: W0909 23:48:18.851904 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.852221 kubelet[2636]: E0909 23:48:18.851913 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.852928 kubelet[2636]: E0909 23:48:18.852798 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.852928 kubelet[2636]: W0909 23:48:18.852816 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.852928 kubelet[2636]: E0909 23:48:18.852845 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.853140 kubelet[2636]: E0909 23:48:18.853043 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.853140 kubelet[2636]: W0909 23:48:18.853052 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.853140 kubelet[2636]: E0909 23:48:18.853064 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.853254 kubelet[2636]: E0909 23:48:18.853226 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.853254 kubelet[2636]: W0909 23:48:18.853239 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.853254 kubelet[2636]: E0909 23:48:18.853247 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.853432 kubelet[2636]: E0909 23:48:18.853422 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.853459 kubelet[2636]: W0909 23:48:18.853432 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.853459 kubelet[2636]: E0909 23:48:18.853442 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.853626 kubelet[2636]: E0909 23:48:18.853609 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.853626 kubelet[2636]: W0909 23:48:18.853619 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.853626 kubelet[2636]: E0909 23:48:18.853627 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.856139 kubelet[2636]: E0909 23:48:18.856058 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.856139 kubelet[2636]: W0909 23:48:18.856080 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.856139 kubelet[2636]: E0909 23:48:18.856093 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.856290 kubelet[2636]: E0909 23:48:18.856272 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.856290 kubelet[2636]: W0909 23:48:18.856286 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.856435 kubelet[2636]: E0909 23:48:18.856297 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.862560 containerd[1495]: time="2025-09-09T23:48:18.862501091Z" level=info msg="StartContainer for \"d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32\" returns successfully"
Sep 9 23:48:18.866664 kubelet[2636]: E0909 23:48:18.866636 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.866664 kubelet[2636]: W0909 23:48:18.866658 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.867239 kubelet[2636]: E0909 23:48:18.866677 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 23:48:18.867239 kubelet[2636]: E0909 23:48:18.867123 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 23:48:18.867239 kubelet[2636]: W0909 23:48:18.867132 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 23:48:18.867239 kubelet[2636]: E0909 23:48:18.867146 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:18.867466 kubelet[2636]: E0909 23:48:18.867438 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:18.867647 kubelet[2636]: W0909 23:48:18.867526 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:18.867647 kubelet[2636]: E0909 23:48:18.867584 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:18.867903 kubelet[2636]: E0909 23:48:18.867879 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:18.868058 kubelet[2636]: W0909 23:48:18.868005 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:18.868058 kubelet[2636]: E0909 23:48:18.868030 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:18.868354 kubelet[2636]: E0909 23:48:18.868330 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:18.868405 kubelet[2636]: W0909 23:48:18.868359 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:18.868405 kubelet[2636]: E0909 23:48:18.868380 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:18.868744 kubelet[2636]: E0909 23:48:18.868711 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:18.868744 kubelet[2636]: W0909 23:48:18.868728 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:18.868798 kubelet[2636]: E0909 23:48:18.868779 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:48:18.869045 kubelet[2636]: E0909 23:48:18.869028 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:18.869045 kubelet[2636]: W0909 23:48:18.869044 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:18.869124 kubelet[2636]: E0909 23:48:18.869073 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:18.869305 kubelet[2636]: E0909 23:48:18.869289 2636 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:48:18.869333 kubelet[2636]: W0909 23:48:18.869304 2636 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:48:18.869333 kubelet[2636]: E0909 23:48:18.869323 2636 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:48:18.878401 systemd[1]: cri-containerd-d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32.scope: Deactivated successfully. Sep 9 23:48:18.878717 systemd[1]: cri-containerd-d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32.scope: Consumed 32ms CPU time, 6.1M memory peak, 4.5M written to disk. 
Sep 9 23:48:18.905256 containerd[1495]: time="2025-09-09T23:48:18.905206286Z" level=info msg="received exit event container_id:\"d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32\" id:\"d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32\" pid:3304 exited_at:{seconds:1757461698 nanos:891519784}" Sep 9 23:48:18.905528 containerd[1495]: time="2025-09-09T23:48:18.905466357Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32\" id:\"d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32\" pid:3304 exited_at:{seconds:1757461698 nanos:891519784}" Sep 9 23:48:18.945365 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d15fb3bf6dd2dc9556ea321ac0bbfbab1c019943bdca493deb0c978128445e32-rootfs.mount: Deactivated successfully. Sep 9 23:48:19.708224 kubelet[2636]: E0909 23:48:19.708140 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9wzpb" podUID="22c99b86-d4b0-412a-a0e1-8a13f3d0c130" Sep 9 23:48:19.810788 kubelet[2636]: I0909 23:48:19.810735 2636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 23:48:19.812586 containerd[1495]: time="2025-09-09T23:48:19.812473752Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 23:48:19.840861 kubelet[2636]: I0909 23:48:19.840693 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-9f47566d-r42x4" podStartSLOduration=3.085089242 podStartE2EDuration="4.840668889s" podCreationTimestamp="2025-09-09 23:48:15 +0000 UTC" firstStartedPulling="2025-09-09 23:48:15.94594499 +0000 UTC m=+20.335248730" lastFinishedPulling="2025-09-09 23:48:17.701524637 +0000 UTC m=+22.090828377" 
observedRunningTime="2025-09-09 23:48:18.818969791 +0000 UTC m=+23.208273531" watchObservedRunningTime="2025-09-09 23:48:19.840668889 +0000 UTC m=+24.229972629" Sep 9 23:48:21.708217 kubelet[2636]: E0909 23:48:21.708148 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9wzpb" podUID="22c99b86-d4b0-412a-a0e1-8a13f3d0c130" Sep 9 23:48:23.229909 containerd[1495]: time="2025-09-09T23:48:23.229863781Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:23.230690 containerd[1495]: time="2025-09-09T23:48:23.230398819Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 9 23:48:23.231455 containerd[1495]: time="2025-09-09T23:48:23.231405482Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:23.234274 containerd[1495]: time="2025-09-09T23:48:23.234030864Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:23.234686 containerd[1495]: time="2025-09-09T23:48:23.234663484Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.422077502s" Sep 9 23:48:23.234767 containerd[1495]: time="2025-09-09T23:48:23.234753344Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 9 23:48:23.238905 containerd[1495]: time="2025-09-09T23:48:23.238873617Z" level=info msg="CreateContainer within sandbox \"0852db9e71fce8e98aa72e8d99eeca7179497cd07518a02cf8d83afe7d1b5925\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 23:48:23.250367 containerd[1495]: time="2025-09-09T23:48:23.248961172Z" level=info msg="Container 32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:48:23.270042 containerd[1495]: time="2025-09-09T23:48:23.269997632Z" level=info msg="CreateContainer within sandbox \"0852db9e71fce8e98aa72e8d99eeca7179497cd07518a02cf8d83afe7d1b5925\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01\"" Sep 9 23:48:23.271418 containerd[1495]: time="2025-09-09T23:48:23.271381699Z" level=info msg="StartContainer for \"32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01\"" Sep 9 23:48:23.272893 containerd[1495]: time="2025-09-09T23:48:23.272865708Z" level=info msg="connecting to shim 32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01" address="unix:///run/containerd/s/32cc4237ebc24cfd6c4ea6f632479d7480db8fadeedcdbb53980314c5e6080aa" protocol=ttrpc version=3 Sep 9 23:48:23.298053 systemd[1]: Started cri-containerd-32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01.scope - libcontainer container 32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01. 
Sep 9 23:48:23.374195 containerd[1495]: time="2025-09-09T23:48:23.374161270Z" level=info msg="StartContainer for \"32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01\" returns successfully" Sep 9 23:48:23.711072 kubelet[2636]: E0909 23:48:23.711005 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9wzpb" podUID="22c99b86-d4b0-412a-a0e1-8a13f3d0c130" Sep 9 23:48:24.078644 systemd[1]: cri-containerd-32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01.scope: Deactivated successfully. Sep 9 23:48:24.079287 systemd[1]: cri-containerd-32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01.scope: Consumed 473ms CPU time, 175.7M memory peak, 2.6M read from disk, 165.8M written to disk. Sep 9 23:48:24.083334 containerd[1495]: time="2025-09-09T23:48:24.083294396Z" level=info msg="received exit event container_id:\"32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01\" id:\"32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01\" pid:3394 exited_at:{seconds:1757461704 nanos:82966246}" Sep 9 23:48:24.083474 containerd[1495]: time="2025-09-09T23:48:24.083302958Z" level=info msg="TaskExit event in podsandbox handler container_id:\"32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01\" id:\"32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01\" pid:3394 exited_at:{seconds:1757461704 nanos:82966246}" Sep 9 23:48:24.112757 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-32d3d8507b808ea6626522c9df62296196a8c7368ea501a5b5e3130f5ec18d01-rootfs.mount: Deactivated successfully. 
Sep 9 23:48:24.170157 kubelet[2636]: I0909 23:48:24.170089 2636 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 9 23:48:24.236432 systemd[1]: Created slice kubepods-burstable-poda3348f5c_2b82_408a_934b_6f961e1301f3.slice - libcontainer container kubepods-burstable-poda3348f5c_2b82_408a_934b_6f961e1301f3.slice. Sep 9 23:48:24.252612 systemd[1]: Created slice kubepods-besteffort-pod0465cd3f_1694_44a6_a024_da1b1aa40cc9.slice - libcontainer container kubepods-besteffort-pod0465cd3f_1694_44a6_a024_da1b1aa40cc9.slice. Sep 9 23:48:24.262972 systemd[1]: Created slice kubepods-burstable-podc8a833a6_a0f5_4e14_93ab_d38231856daa.slice - libcontainer container kubepods-burstable-podc8a833a6_a0f5_4e14_93ab_d38231856daa.slice. Sep 9 23:48:24.267987 systemd[1]: Created slice kubepods-besteffort-pod3839dd2f_45e7_4d5c_bf16_5e360f2b81f3.slice - libcontainer container kubepods-besteffort-pod3839dd2f_45e7_4d5c_bf16_5e360f2b81f3.slice. Sep 9 23:48:24.278767 systemd[1]: Created slice kubepods-besteffort-pod2e1f7856_288c_49e6_9aa8_fd07a3af9e50.slice - libcontainer container kubepods-besteffort-pod2e1f7856_288c_49e6_9aa8_fd07a3af9e50.slice. Sep 9 23:48:24.282888 systemd[1]: Created slice kubepods-besteffort-pod6cf560ed_9d34_4085_8953_98c1c1f9b412.slice - libcontainer container kubepods-besteffort-pod6cf560ed_9d34_4085_8953_98c1c1f9b412.slice. Sep 9 23:48:24.289303 systemd[1]: Created slice kubepods-besteffort-pod895db007_8816_44f9_ad3c_3821dc38d527.slice - libcontainer container kubepods-besteffort-pod895db007_8816_44f9_ad3c_3821dc38d527.slice. 
Sep 9 23:48:24.315629 kubelet[2636]: I0909 23:48:24.315571 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6q6b\" (UniqueName: \"kubernetes.io/projected/6cf560ed-9d34-4085-8953-98c1c1f9b412-kube-api-access-q6q6b\") pod \"calico-apiserver-59f49d6c46-t45hn\" (UID: \"6cf560ed-9d34-4085-8953-98c1c1f9b412\") " pod="calico-apiserver/calico-apiserver-59f49d6c46-t45hn" Sep 9 23:48:24.315629 kubelet[2636]: I0909 23:48:24.315624 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhtbc\" (UniqueName: \"kubernetes.io/projected/a3348f5c-2b82-408a-934b-6f961e1301f3-kube-api-access-bhtbc\") pod \"coredns-668d6bf9bc-srgvd\" (UID: \"a3348f5c-2b82-408a-934b-6f961e1301f3\") " pod="kube-system/coredns-668d6bf9bc-srgvd" Sep 9 23:48:24.315866 kubelet[2636]: I0909 23:48:24.315645 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/2e1f7856-288c-49e6-9aa8-fd07a3af9e50-goldmane-key-pair\") pod \"goldmane-54d579b49d-8cmhs\" (UID: \"2e1f7856-288c-49e6-9aa8-fd07a3af9e50\") " pod="calico-system/goldmane-54d579b49d-8cmhs" Sep 9 23:48:24.315866 kubelet[2636]: I0909 23:48:24.315663 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/895db007-8816-44f9-ad3c-3821dc38d527-whisker-backend-key-pair\") pod \"whisker-5b874b9786-fcb9j\" (UID: \"895db007-8816-44f9-ad3c-3821dc38d527\") " pod="calico-system/whisker-5b874b9786-fcb9j" Sep 9 23:48:24.315866 kubelet[2636]: I0909 23:48:24.315682 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8a833a6-a0f5-4e14-93ab-d38231856daa-config-volume\") pod \"coredns-668d6bf9bc-hj5xg\" (UID: 
\"c8a833a6-a0f5-4e14-93ab-d38231856daa\") " pod="kube-system/coredns-668d6bf9bc-hj5xg" Sep 9 23:48:24.315866 kubelet[2636]: I0909 23:48:24.315701 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wgkp\" (UniqueName: \"kubernetes.io/projected/3839dd2f-45e7-4d5c-bf16-5e360f2b81f3-kube-api-access-7wgkp\") pod \"calico-apiserver-59f49d6c46-wrgjk\" (UID: \"3839dd2f-45e7-4d5c-bf16-5e360f2b81f3\") " pod="calico-apiserver/calico-apiserver-59f49d6c46-wrgjk" Sep 9 23:48:24.315866 kubelet[2636]: I0909 23:48:24.315719 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3348f5c-2b82-408a-934b-6f961e1301f3-config-volume\") pod \"coredns-668d6bf9bc-srgvd\" (UID: \"a3348f5c-2b82-408a-934b-6f961e1301f3\") " pod="kube-system/coredns-668d6bf9bc-srgvd" Sep 9 23:48:24.315995 kubelet[2636]: I0909 23:48:24.315738 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3839dd2f-45e7-4d5c-bf16-5e360f2b81f3-calico-apiserver-certs\") pod \"calico-apiserver-59f49d6c46-wrgjk\" (UID: \"3839dd2f-45e7-4d5c-bf16-5e360f2b81f3\") " pod="calico-apiserver/calico-apiserver-59f49d6c46-wrgjk" Sep 9 23:48:24.315995 kubelet[2636]: I0909 23:48:24.315755 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1f7856-288c-49e6-9aa8-fd07a3af9e50-config\") pod \"goldmane-54d579b49d-8cmhs\" (UID: \"2e1f7856-288c-49e6-9aa8-fd07a3af9e50\") " pod="calico-system/goldmane-54d579b49d-8cmhs" Sep 9 23:48:24.315995 kubelet[2636]: I0909 23:48:24.315773 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4stzs\" (UniqueName: 
\"kubernetes.io/projected/2e1f7856-288c-49e6-9aa8-fd07a3af9e50-kube-api-access-4stzs\") pod \"goldmane-54d579b49d-8cmhs\" (UID: \"2e1f7856-288c-49e6-9aa8-fd07a3af9e50\") " pod="calico-system/goldmane-54d579b49d-8cmhs" Sep 9 23:48:24.315995 kubelet[2636]: I0909 23:48:24.315789 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6cf560ed-9d34-4085-8953-98c1c1f9b412-calico-apiserver-certs\") pod \"calico-apiserver-59f49d6c46-t45hn\" (UID: \"6cf560ed-9d34-4085-8953-98c1c1f9b412\") " pod="calico-apiserver/calico-apiserver-59f49d6c46-t45hn" Sep 9 23:48:24.315995 kubelet[2636]: I0909 23:48:24.315805 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/895db007-8816-44f9-ad3c-3821dc38d527-whisker-ca-bundle\") pod \"whisker-5b874b9786-fcb9j\" (UID: \"895db007-8816-44f9-ad3c-3821dc38d527\") " pod="calico-system/whisker-5b874b9786-fcb9j" Sep 9 23:48:24.316101 kubelet[2636]: I0909 23:48:24.315821 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0465cd3f-1694-44a6-a024-da1b1aa40cc9-tigera-ca-bundle\") pod \"calico-kube-controllers-75db498865-vhqk2\" (UID: \"0465cd3f-1694-44a6-a024-da1b1aa40cc9\") " pod="calico-system/calico-kube-controllers-75db498865-vhqk2" Sep 9 23:48:24.316101 kubelet[2636]: I0909 23:48:24.315870 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnj79\" (UniqueName: \"kubernetes.io/projected/0465cd3f-1694-44a6-a024-da1b1aa40cc9-kube-api-access-jnj79\") pod \"calico-kube-controllers-75db498865-vhqk2\" (UID: \"0465cd3f-1694-44a6-a024-da1b1aa40cc9\") " pod="calico-system/calico-kube-controllers-75db498865-vhqk2" Sep 9 23:48:24.316101 kubelet[2636]: I0909 23:48:24.315891 
2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e1f7856-288c-49e6-9aa8-fd07a3af9e50-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-8cmhs\" (UID: \"2e1f7856-288c-49e6-9aa8-fd07a3af9e50\") " pod="calico-system/goldmane-54d579b49d-8cmhs" Sep 9 23:48:24.316101 kubelet[2636]: I0909 23:48:24.315907 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srcfb\" (UniqueName: \"kubernetes.io/projected/895db007-8816-44f9-ad3c-3821dc38d527-kube-api-access-srcfb\") pod \"whisker-5b874b9786-fcb9j\" (UID: \"895db007-8816-44f9-ad3c-3821dc38d527\") " pod="calico-system/whisker-5b874b9786-fcb9j" Sep 9 23:48:24.316101 kubelet[2636]: I0909 23:48:24.316005 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znctz\" (UniqueName: \"kubernetes.io/projected/c8a833a6-a0f5-4e14-93ab-d38231856daa-kube-api-access-znctz\") pod \"coredns-668d6bf9bc-hj5xg\" (UID: \"c8a833a6-a0f5-4e14-93ab-d38231856daa\") " pod="kube-system/coredns-668d6bf9bc-hj5xg" Sep 9 23:48:24.542280 containerd[1495]: time="2025-09-09T23:48:24.541588913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-srgvd,Uid:a3348f5c-2b82-408a-934b-6f961e1301f3,Namespace:kube-system,Attempt:0,}" Sep 9 23:48:24.558537 containerd[1495]: time="2025-09-09T23:48:24.558473390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75db498865-vhqk2,Uid:0465cd3f-1694-44a6-a024-da1b1aa40cc9,Namespace:calico-system,Attempt:0,}" Sep 9 23:48:24.566451 containerd[1495]: time="2025-09-09T23:48:24.566413682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hj5xg,Uid:c8a833a6-a0f5-4e14-93ab-d38231856daa,Namespace:kube-system,Attempt:0,}" Sep 9 23:48:24.574406 containerd[1495]: time="2025-09-09T23:48:24.574303322Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f49d6c46-wrgjk,Uid:3839dd2f-45e7-4d5c-bf16-5e360f2b81f3,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:48:24.583168 containerd[1495]: time="2025-09-09T23:48:24.583126242Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-8cmhs,Uid:2e1f7856-288c-49e6-9aa8-fd07a3af9e50,Namespace:calico-system,Attempt:0,}" Sep 9 23:48:24.587114 containerd[1495]: time="2025-09-09T23:48:24.587076204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f49d6c46-t45hn,Uid:6cf560ed-9d34-4085-8953-98c1c1f9b412,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:48:24.597471 containerd[1495]: time="2025-09-09T23:48:24.597415486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b874b9786-fcb9j,Uid:895db007-8816-44f9-ad3c-3821dc38d527,Namespace:calico-system,Attempt:0,}" Sep 9 23:48:24.733228 containerd[1495]: time="2025-09-09T23:48:24.733182691Z" level=error msg="Failed to destroy network for sandbox \"3a388dc150fb65333d2c37e55894c779e03e77069d841411b494270baa84451d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:24.759727 containerd[1495]: time="2025-09-09T23:48:24.759505779Z" level=error msg="Failed to destroy network for sandbox \"4a864377dd952cee57bd2363b2c0b5786214dba2a14495a7eb22b786c245bd2a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:24.801616 containerd[1495]: time="2025-09-09T23:48:24.801214344Z" level=error msg="Failed to destroy network for sandbox \"032de99580c4a3f560cc2471906273b6c7af928fe85c2ad3957230de5743d563\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:24.845084 containerd[1495]: time="2025-09-09T23:48:24.845045042Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 23:48:24.898037 containerd[1495]: time="2025-09-09T23:48:24.897973758Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-srgvd,Uid:a3348f5c-2b82-408a-934b-6f961e1301f3,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a388dc150fb65333d2c37e55894c779e03e77069d841411b494270baa84451d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:24.904424 kubelet[2636]: E0909 23:48:24.904334 2636 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a388dc150fb65333d2c37e55894c779e03e77069d841411b494270baa84451d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:24.904424 kubelet[2636]: E0909 23:48:24.904431 2636 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a388dc150fb65333d2c37e55894c779e03e77069d841411b494270baa84451d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-srgvd" Sep 9 23:48:24.904816 kubelet[2636]: E0909 23:48:24.904452 2636 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a388dc150fb65333d2c37e55894c779e03e77069d841411b494270baa84451d\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-srgvd" Sep 9 23:48:24.904816 kubelet[2636]: E0909 23:48:24.904509 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-srgvd_kube-system(a3348f5c-2b82-408a-934b-6f961e1301f3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-srgvd_kube-system(a3348f5c-2b82-408a-934b-6f961e1301f3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3a388dc150fb65333d2c37e55894c779e03e77069d841411b494270baa84451d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-srgvd" podUID="a3348f5c-2b82-408a-934b-6f961e1301f3" Sep 9 23:48:24.915021 containerd[1495]: time="2025-09-09T23:48:24.914961697Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hj5xg,Uid:c8a833a6-a0f5-4e14-93ab-d38231856daa,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a864377dd952cee57bd2363b2c0b5786214dba2a14495a7eb22b786c245bd2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:24.915498 kubelet[2636]: E0909 23:48:24.915384 2636 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a864377dd952cee57bd2363b2c0b5786214dba2a14495a7eb22b786c245bd2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Sep 9 23:48:24.915498 kubelet[2636]: E0909 23:48:24.915470 2636 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a864377dd952cee57bd2363b2c0b5786214dba2a14495a7eb22b786c245bd2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-hj5xg" Sep 9 23:48:24.915715 kubelet[2636]: E0909 23:48:24.915593 2636 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a864377dd952cee57bd2363b2c0b5786214dba2a14495a7eb22b786c245bd2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-hj5xg" Sep 9 23:48:24.915802 kubelet[2636]: E0909 23:48:24.915775 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-hj5xg_kube-system(c8a833a6-a0f5-4e14-93ab-d38231856daa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-hj5xg_kube-system(c8a833a6-a0f5-4e14-93ab-d38231856daa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4a864377dd952cee57bd2363b2c0b5786214dba2a14495a7eb22b786c245bd2a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-hj5xg" podUID="c8a833a6-a0f5-4e14-93ab-d38231856daa" Sep 9 23:48:24.932845 containerd[1495]: time="2025-09-09T23:48:24.932786215Z" level=error msg="Failed to destroy network for sandbox 
\"0ad0e955bc09da7f9d6c328849dc132058b3f30913f7aa84567bb910c04e0ad8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:24.964459 containerd[1495]: time="2025-09-09T23:48:24.964410512Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75db498865-vhqk2,Uid:0465cd3f-1694-44a6-a024-da1b1aa40cc9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"032de99580c4a3f560cc2471906273b6c7af928fe85c2ad3957230de5743d563\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:24.966047 kubelet[2636]: E0909 23:48:24.965981 2636 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"032de99580c4a3f560cc2471906273b6c7af928fe85c2ad3957230de5743d563\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:24.966047 kubelet[2636]: E0909 23:48:24.966041 2636 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"032de99580c4a3f560cc2471906273b6c7af928fe85c2ad3957230de5743d563\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-75db498865-vhqk2" Sep 9 23:48:24.966381 kubelet[2636]: E0909 23:48:24.966061 2636 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"032de99580c4a3f560cc2471906273b6c7af928fe85c2ad3957230de5743d563\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-75db498865-vhqk2" Sep 9 23:48:24.966381 kubelet[2636]: E0909 23:48:24.966096 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-75db498865-vhqk2_calico-system(0465cd3f-1694-44a6-a024-da1b1aa40cc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-75db498865-vhqk2_calico-system(0465cd3f-1694-44a6-a024-da1b1aa40cc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"032de99580c4a3f560cc2471906273b6c7af928fe85c2ad3957230de5743d563\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-75db498865-vhqk2" podUID="0465cd3f-1694-44a6-a024-da1b1aa40cc9" Sep 9 23:48:24.982135 containerd[1495]: time="2025-09-09T23:48:24.982088038Z" level=error msg="Failed to destroy network for sandbox \"1a731aff3d1f511f5f9471ecce5809ccd320bf50ea0e8f33b2aa8fbf7776b582\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.027096 containerd[1495]: time="2025-09-09T23:48:25.027046367Z" level=error msg="Failed to destroy network for sandbox \"c655e4c77d8598e643725dc1dc6dc6a5c5f6faa2653c00ff128f3b83a5f05e6e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.053725 containerd[1495]: 
time="2025-09-09T23:48:25.053274346Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f49d6c46-wrgjk,Uid:3839dd2f-45e7-4d5c-bf16-5e360f2b81f3,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ad0e955bc09da7f9d6c328849dc132058b3f30913f7aa84567bb910c04e0ad8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.054325 kubelet[2636]: E0909 23:48:25.053744 2636 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ad0e955bc09da7f9d6c328849dc132058b3f30913f7aa84567bb910c04e0ad8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.054325 kubelet[2636]: E0909 23:48:25.053803 2636 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ad0e955bc09da7f9d6c328849dc132058b3f30913f7aa84567bb910c04e0ad8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59f49d6c46-wrgjk" Sep 9 23:48:25.054325 kubelet[2636]: E0909 23:48:25.053821 2636 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ad0e955bc09da7f9d6c328849dc132058b3f30913f7aa84567bb910c04e0ad8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59f49d6c46-wrgjk" 
Sep 9 23:48:25.054590 kubelet[2636]: E0909 23:48:25.053887 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59f49d6c46-wrgjk_calico-apiserver(3839dd2f-45e7-4d5c-bf16-5e360f2b81f3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59f49d6c46-wrgjk_calico-apiserver(3839dd2f-45e7-4d5c-bf16-5e360f2b81f3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0ad0e955bc09da7f9d6c328849dc132058b3f30913f7aa84567bb910c04e0ad8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-59f49d6c46-wrgjk" podUID="3839dd2f-45e7-4d5c-bf16-5e360f2b81f3" Sep 9 23:48:25.058469 containerd[1495]: time="2025-09-09T23:48:25.058214599Z" level=error msg="Failed to destroy network for sandbox \"d4e59b88b57bcb0adbe2f41696c29f7a2062fdd2072fffd147e23c505448a1a8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.081067 containerd[1495]: time="2025-09-09T23:48:25.081012794Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-8cmhs,Uid:2e1f7856-288c-49e6-9aa8-fd07a3af9e50,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a731aff3d1f511f5f9471ecce5809ccd320bf50ea0e8f33b2aa8fbf7776b582\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.081549 kubelet[2636]: E0909 23:48:25.081511 2636 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"1a731aff3d1f511f5f9471ecce5809ccd320bf50ea0e8f33b2aa8fbf7776b582\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.081624 kubelet[2636]: E0909 23:48:25.081575 2636 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a731aff3d1f511f5f9471ecce5809ccd320bf50ea0e8f33b2aa8fbf7776b582\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-8cmhs" Sep 9 23:48:25.081624 kubelet[2636]: E0909 23:48:25.081602 2636 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a731aff3d1f511f5f9471ecce5809ccd320bf50ea0e8f33b2aa8fbf7776b582\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-8cmhs" Sep 9 23:48:25.081683 kubelet[2636]: E0909 23:48:25.081642 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-8cmhs_calico-system(2e1f7856-288c-49e6-9aa8-fd07a3af9e50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-8cmhs_calico-system(2e1f7856-288c-49e6-9aa8-fd07a3af9e50)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1a731aff3d1f511f5f9471ecce5809ccd320bf50ea0e8f33b2aa8fbf7776b582\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-8cmhs" 
podUID="2e1f7856-288c-49e6-9aa8-fd07a3af9e50" Sep 9 23:48:25.099109 containerd[1495]: time="2025-09-09T23:48:25.099001923Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f49d6c46-t45hn,Uid:6cf560ed-9d34-4085-8953-98c1c1f9b412,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c655e4c77d8598e643725dc1dc6dc6a5c5f6faa2653c00ff128f3b83a5f05e6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.099354 kubelet[2636]: E0909 23:48:25.099285 2636 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c655e4c77d8598e643725dc1dc6dc6a5c5f6faa2653c00ff128f3b83a5f05e6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.099397 kubelet[2636]: E0909 23:48:25.099342 2636 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c655e4c77d8598e643725dc1dc6dc6a5c5f6faa2653c00ff128f3b83a5f05e6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59f49d6c46-t45hn" Sep 9 23:48:25.099397 kubelet[2636]: E0909 23:48:25.099378 2636 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c655e4c77d8598e643725dc1dc6dc6a5c5f6faa2653c00ff128f3b83a5f05e6e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59f49d6c46-t45hn" Sep 9 23:48:25.099485 kubelet[2636]: E0909 23:48:25.099447 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59f49d6c46-t45hn_calico-apiserver(6cf560ed-9d34-4085-8953-98c1c1f9b412)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59f49d6c46-t45hn_calico-apiserver(6cf560ed-9d34-4085-8953-98c1c1f9b412)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c655e4c77d8598e643725dc1dc6dc6a5c5f6faa2653c00ff128f3b83a5f05e6e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-59f49d6c46-t45hn" podUID="6cf560ed-9d34-4085-8953-98c1c1f9b412" Sep 9 23:48:25.127342 containerd[1495]: time="2025-09-09T23:48:25.127200946Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b874b9786-fcb9j,Uid:895db007-8816-44f9-ad3c-3821dc38d527,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4e59b88b57bcb0adbe2f41696c29f7a2062fdd2072fffd147e23c505448a1a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.127525 kubelet[2636]: E0909 23:48:25.127475 2636 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4e59b88b57bcb0adbe2f41696c29f7a2062fdd2072fffd147e23c505448a1a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.127571 kubelet[2636]: E0909 
23:48:25.127536 2636 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4e59b88b57bcb0adbe2f41696c29f7a2062fdd2072fffd147e23c505448a1a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5b874b9786-fcb9j" Sep 9 23:48:25.127571 kubelet[2636]: E0909 23:48:25.127557 2636 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4e59b88b57bcb0adbe2f41696c29f7a2062fdd2072fffd147e23c505448a1a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5b874b9786-fcb9j" Sep 9 23:48:25.127622 kubelet[2636]: E0909 23:48:25.127596 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5b874b9786-fcb9j_calico-system(895db007-8816-44f9-ad3c-3821dc38d527)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5b874b9786-fcb9j_calico-system(895db007-8816-44f9-ad3c-3821dc38d527)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d4e59b88b57bcb0adbe2f41696c29f7a2062fdd2072fffd147e23c505448a1a8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5b874b9786-fcb9j" podUID="895db007-8816-44f9-ad3c-3821dc38d527" Sep 9 23:48:25.718374 systemd[1]: Created slice kubepods-besteffort-pod22c99b86_d4b0_412a_a0e1_8a13f3d0c130.slice - libcontainer container kubepods-besteffort-pod22c99b86_d4b0_412a_a0e1_8a13f3d0c130.slice. 
Sep 9 23:48:25.724307 containerd[1495]: time="2025-09-09T23:48:25.724259584Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9wzpb,Uid:22c99b86-d4b0-412a-a0e1-8a13f3d0c130,Namespace:calico-system,Attempt:0,}" Sep 9 23:48:25.892940 containerd[1495]: time="2025-09-09T23:48:25.892815469Z" level=error msg="Failed to destroy network for sandbox \"92688063b1e894cda8257d69055328e4052a135eeccb793390aeb8dfbb222ca6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.895372 systemd[1]: run-netns-cni\x2dbb0429c2\x2d50f2\x2dd127\x2dead4\x2d90de2c6c8233.mount: Deactivated successfully. Sep 9 23:48:25.898093 containerd[1495]: time="2025-09-09T23:48:25.898029499Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9wzpb,Uid:22c99b86-d4b0-412a-a0e1-8a13f3d0c130,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"92688063b1e894cda8257d69055328e4052a135eeccb793390aeb8dfbb222ca6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.898709 kubelet[2636]: E0909 23:48:25.898596 2636 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"92688063b1e894cda8257d69055328e4052a135eeccb793390aeb8dfbb222ca6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:48:25.898709 kubelet[2636]: E0909 23:48:25.898677 2636 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"92688063b1e894cda8257d69055328e4052a135eeccb793390aeb8dfbb222ca6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9wzpb" Sep 9 23:48:25.898709 kubelet[2636]: E0909 23:48:25.898708 2636 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"92688063b1e894cda8257d69055328e4052a135eeccb793390aeb8dfbb222ca6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9wzpb" Sep 9 23:48:25.900057 kubelet[2636]: E0909 23:48:25.898754 2636 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9wzpb_calico-system(22c99b86-d4b0-412a-a0e1-8a13f3d0c130)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9wzpb_calico-system(22c99b86-d4b0-412a-a0e1-8a13f3d0c130)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"92688063b1e894cda8257d69055328e4052a135eeccb793390aeb8dfbb222ca6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9wzpb" podUID="22c99b86-d4b0-412a-a0e1-8a13f3d0c130" Sep 9 23:48:29.205056 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount774358069.mount: Deactivated successfully. 
Sep 9 23:48:29.456044 containerd[1495]: time="2025-09-09T23:48:29.439220542Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 9 23:48:29.456044 containerd[1495]: time="2025-09-09T23:48:29.443087829Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:29.456044 containerd[1495]: time="2025-09-09T23:48:29.443882331Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.598610481s" Sep 9 23:48:29.456044 containerd[1495]: time="2025-09-09T23:48:29.456000446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 9 23:48:29.456702 containerd[1495]: time="2025-09-09T23:48:29.456666604Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:29.457180 containerd[1495]: time="2025-09-09T23:48:29.457140369Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:29.466643 containerd[1495]: time="2025-09-09T23:48:29.466583608Z" level=info msg="CreateContainer within sandbox \"0852db9e71fce8e98aa72e8d99eeca7179497cd07518a02cf8d83afe7d1b5925\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 23:48:29.488670 containerd[1495]: time="2025-09-09T23:48:29.488337197Z" level=info msg="Container 
2ed11ab839a77dc8705fedaf74ed5f5abce6d53815bdbc635a11f41ea13c95f5: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:48:29.512764 containerd[1495]: time="2025-09-09T23:48:29.512711572Z" level=info msg="CreateContainer within sandbox \"0852db9e71fce8e98aa72e8d99eeca7179497cd07518a02cf8d83afe7d1b5925\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"2ed11ab839a77dc8705fedaf74ed5f5abce6d53815bdbc635a11f41ea13c95f5\"" Sep 9 23:48:29.513585 containerd[1495]: time="2025-09-09T23:48:29.513461905Z" level=info msg="StartContainer for \"2ed11ab839a77dc8705fedaf74ed5f5abce6d53815bdbc635a11f41ea13c95f5\"" Sep 9 23:48:29.515656 containerd[1495]: time="2025-09-09T23:48:29.515610528Z" level=info msg="connecting to shim 2ed11ab839a77dc8705fedaf74ed5f5abce6d53815bdbc635a11f41ea13c95f5" address="unix:///run/containerd/s/32cc4237ebc24cfd6c4ea6f632479d7480db8fadeedcdbb53980314c5e6080aa" protocol=ttrpc version=3 Sep 9 23:48:29.546097 systemd[1]: Started cri-containerd-2ed11ab839a77dc8705fedaf74ed5f5abce6d53815bdbc635a11f41ea13c95f5.scope - libcontainer container 2ed11ab839a77dc8705fedaf74ed5f5abce6d53815bdbc635a11f41ea13c95f5. Sep 9 23:48:29.595361 containerd[1495]: time="2025-09-09T23:48:29.595228488Z" level=info msg="StartContainer for \"2ed11ab839a77dc8705fedaf74ed5f5abce6d53815bdbc635a11f41ea13c95f5\" returns successfully" Sep 9 23:48:29.732856 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 23:48:29.733015 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 23:48:29.891530 kubelet[2636]: I0909 23:48:29.891275 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-hqsl6" podStartSLOduration=1.70783819 podStartE2EDuration="14.891181884s" podCreationTimestamp="2025-09-09 23:48:15 +0000 UTC" firstStartedPulling="2025-09-09 23:48:16.274015193 +0000 UTC m=+20.663318933" lastFinishedPulling="2025-09-09 23:48:29.457358887 +0000 UTC m=+33.846662627" observedRunningTime="2025-09-09 23:48:29.888297811 +0000 UTC m=+34.277601551" watchObservedRunningTime="2025-09-09 23:48:29.891181884 +0000 UTC m=+34.280485624" Sep 9 23:48:29.953582 kubelet[2636]: I0909 23:48:29.953535 2636 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srcfb\" (UniqueName: \"kubernetes.io/projected/895db007-8816-44f9-ad3c-3821dc38d527-kube-api-access-srcfb\") pod \"895db007-8816-44f9-ad3c-3821dc38d527\" (UID: \"895db007-8816-44f9-ad3c-3821dc38d527\") " Sep 9 23:48:29.953582 kubelet[2636]: I0909 23:48:29.953592 2636 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/895db007-8816-44f9-ad3c-3821dc38d527-whisker-backend-key-pair\") pod \"895db007-8816-44f9-ad3c-3821dc38d527\" (UID: \"895db007-8816-44f9-ad3c-3821dc38d527\") " Sep 9 23:48:29.953769 kubelet[2636]: I0909 23:48:29.953625 2636 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/895db007-8816-44f9-ad3c-3821dc38d527-whisker-ca-bundle\") pod \"895db007-8816-44f9-ad3c-3821dc38d527\" (UID: \"895db007-8816-44f9-ad3c-3821dc38d527\") " Sep 9 23:48:29.963328 kubelet[2636]: I0909 23:48:29.963257 2636 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/895db007-8816-44f9-ad3c-3821dc38d527-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "895db007-8816-44f9-ad3c-3821dc38d527" 
(UID: "895db007-8816-44f9-ad3c-3821dc38d527"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 23:48:29.965981 kubelet[2636]: I0909 23:48:29.965918 2636 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/895db007-8816-44f9-ad3c-3821dc38d527-kube-api-access-srcfb" (OuterVolumeSpecName: "kube-api-access-srcfb") pod "895db007-8816-44f9-ad3c-3821dc38d527" (UID: "895db007-8816-44f9-ad3c-3821dc38d527"). InnerVolumeSpecName "kube-api-access-srcfb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 23:48:29.974143 kubelet[2636]: I0909 23:48:29.974092 2636 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/895db007-8816-44f9-ad3c-3821dc38d527-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "895db007-8816-44f9-ad3c-3821dc38d527" (UID: "895db007-8816-44f9-ad3c-3821dc38d527"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 23:48:30.054140 kubelet[2636]: I0909 23:48:30.054086 2636 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-srcfb\" (UniqueName: \"kubernetes.io/projected/895db007-8816-44f9-ad3c-3821dc38d527-kube-api-access-srcfb\") on node \"localhost\" DevicePath \"\"" Sep 9 23:48:30.054140 kubelet[2636]: I0909 23:48:30.054125 2636 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/895db007-8816-44f9-ad3c-3821dc38d527-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 9 23:48:30.054140 kubelet[2636]: I0909 23:48:30.054135 2636 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/895db007-8816-44f9-ad3c-3821dc38d527-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 9 23:48:30.206201 systemd[1]: var-lib-kubelet-pods-895db007\x2d8816\x2d44f9\x2dad3c\x2d3821dc38d527-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dsrcfb.mount: Deactivated successfully. Sep 9 23:48:30.206297 systemd[1]: var-lib-kubelet-pods-895db007\x2d8816\x2d44f9\x2dad3c\x2d3821dc38d527-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 23:48:30.868092 kubelet[2636]: I0909 23:48:30.866920 2636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 23:48:30.879949 systemd[1]: Removed slice kubepods-besteffort-pod895db007_8816_44f9_ad3c_3821dc38d527.slice - libcontainer container kubepods-besteffort-pod895db007_8816_44f9_ad3c_3821dc38d527.slice. Sep 9 23:48:30.987235 systemd[1]: Created slice kubepods-besteffort-podddb775db_f136_4a84_829f_4c410bd5eea3.slice - libcontainer container kubepods-besteffort-podddb775db_f136_4a84_829f_4c410bd5eea3.slice. 
Sep 9 23:48:31.063874 kubelet[2636]: I0909 23:48:31.063191 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srfg7\" (UniqueName: \"kubernetes.io/projected/ddb775db-f136-4a84-829f-4c410bd5eea3-kube-api-access-srfg7\") pod \"whisker-5ffdcf7949-lsgxd\" (UID: \"ddb775db-f136-4a84-829f-4c410bd5eea3\") " pod="calico-system/whisker-5ffdcf7949-lsgxd" Sep 9 23:48:31.063874 kubelet[2636]: I0909 23:48:31.063248 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ddb775db-f136-4a84-829f-4c410bd5eea3-whisker-backend-key-pair\") pod \"whisker-5ffdcf7949-lsgxd\" (UID: \"ddb775db-f136-4a84-829f-4c410bd5eea3\") " pod="calico-system/whisker-5ffdcf7949-lsgxd" Sep 9 23:48:31.063874 kubelet[2636]: I0909 23:48:31.063274 2636 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddb775db-f136-4a84-829f-4c410bd5eea3-whisker-ca-bundle\") pod \"whisker-5ffdcf7949-lsgxd\" (UID: \"ddb775db-f136-4a84-829f-4c410bd5eea3\") " pod="calico-system/whisker-5ffdcf7949-lsgxd" Sep 9 23:48:31.291175 containerd[1495]: time="2025-09-09T23:48:31.290779984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5ffdcf7949-lsgxd,Uid:ddb775db-f136-4a84-829f-4c410bd5eea3,Namespace:calico-system,Attempt:0,}" Sep 9 23:48:31.513531 systemd-networkd[1418]: cali2d15d9cb7ac: Link UP Sep 9 23:48:31.514217 systemd-networkd[1418]: cali2d15d9cb7ac: Gained carrier Sep 9 23:48:31.529611 containerd[1495]: 2025-09-09 23:48:31.337 [INFO][3870] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:48:31.529611 containerd[1495]: 2025-09-09 23:48:31.383 [INFO][3870] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0 
whisker-5ffdcf7949- calico-system ddb775db-f136-4a84-829f-4c410bd5eea3 853 0 2025-09-09 23:48:30 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5ffdcf7949 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-5ffdcf7949-lsgxd eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali2d15d9cb7ac [] [] }} ContainerID="d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" Namespace="calico-system" Pod="whisker-5ffdcf7949-lsgxd" WorkloadEndpoint="localhost-k8s-whisker--5ffdcf7949--lsgxd-" Sep 9 23:48:31.529611 containerd[1495]: 2025-09-09 23:48:31.384 [INFO][3870] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" Namespace="calico-system" Pod="whisker-5ffdcf7949-lsgxd" WorkloadEndpoint="localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0" Sep 9 23:48:31.529611 containerd[1495]: 2025-09-09 23:48:31.456 [INFO][3887] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" HandleID="k8s-pod-network.d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" Workload="localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0" Sep 9 23:48:31.529851 containerd[1495]: 2025-09-09 23:48:31.456 [INFO][3887] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" HandleID="k8s-pod-network.d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" Workload="localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3360), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-5ffdcf7949-lsgxd", "timestamp":"2025-09-09 23:48:31.456145584 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:48:31.529851 containerd[1495]: 2025-09-09 23:48:31.456 [INFO][3887] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:48:31.529851 containerd[1495]: 2025-09-09 23:48:31.456 [INFO][3887] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 23:48:31.529851 containerd[1495]: 2025-09-09 23:48:31.456 [INFO][3887] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 23:48:31.529851 containerd[1495]: 2025-09-09 23:48:31.469 [INFO][3887] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" host="localhost" Sep 9 23:48:31.529851 containerd[1495]: 2025-09-09 23:48:31.476 [INFO][3887] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 23:48:31.529851 containerd[1495]: 2025-09-09 23:48:31.483 [INFO][3887] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 23:48:31.529851 containerd[1495]: 2025-09-09 23:48:31.485 [INFO][3887] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:31.529851 containerd[1495]: 2025-09-09 23:48:31.488 [INFO][3887] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:31.529851 containerd[1495]: 2025-09-09 23:48:31.488 [INFO][3887] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" host="localhost" Sep 9 23:48:31.530131 containerd[1495]: 2025-09-09 23:48:31.490 [INFO][3887] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f Sep 9 23:48:31.530131 containerd[1495]: 
2025-09-09 23:48:31.494 [INFO][3887] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" host="localhost" Sep 9 23:48:31.530131 containerd[1495]: 2025-09-09 23:48:31.503 [INFO][3887] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" host="localhost" Sep 9 23:48:31.530131 containerd[1495]: 2025-09-09 23:48:31.503 [INFO][3887] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" host="localhost" Sep 9 23:48:31.530131 containerd[1495]: 2025-09-09 23:48:31.503 [INFO][3887] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:48:31.530131 containerd[1495]: 2025-09-09 23:48:31.503 [INFO][3887] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" HandleID="k8s-pod-network.d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" Workload="localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0" Sep 9 23:48:31.530417 containerd[1495]: 2025-09-09 23:48:31.506 [INFO][3870] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" Namespace="calico-system" Pod="whisker-5ffdcf7949-lsgxd" WorkloadEndpoint="localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0", GenerateName:"whisker-5ffdcf7949-", Namespace:"calico-system", SelfLink:"", UID:"ddb775db-f136-4a84-829f-4c410bd5eea3", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.September, 
9, 23, 48, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5ffdcf7949", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-5ffdcf7949-lsgxd", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2d15d9cb7ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:31.530417 containerd[1495]: 2025-09-09 23:48:31.506 [INFO][3870] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" Namespace="calico-system" Pod="whisker-5ffdcf7949-lsgxd" WorkloadEndpoint="localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0" Sep 9 23:48:31.530514 containerd[1495]: 2025-09-09 23:48:31.506 [INFO][3870] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2d15d9cb7ac ContainerID="d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" Namespace="calico-system" Pod="whisker-5ffdcf7949-lsgxd" WorkloadEndpoint="localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0" Sep 9 23:48:31.530514 containerd[1495]: 2025-09-09 23:48:31.514 [INFO][3870] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" Namespace="calico-system" Pod="whisker-5ffdcf7949-lsgxd" 
WorkloadEndpoint="localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0" Sep 9 23:48:31.530670 containerd[1495]: 2025-09-09 23:48:31.514 [INFO][3870] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" Namespace="calico-system" Pod="whisker-5ffdcf7949-lsgxd" WorkloadEndpoint="localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0", GenerateName:"whisker-5ffdcf7949-", Namespace:"calico-system", SelfLink:"", UID:"ddb775db-f136-4a84-829f-4c410bd5eea3", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5ffdcf7949", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f", Pod:"whisker-5ffdcf7949-lsgxd", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2d15d9cb7ac", MAC:"56:ae:f7:b7:da:f2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:31.530747 containerd[1495]: 2025-09-09 23:48:31.526 [INFO][3870] cni-plugin/k8s.go 532: 
Wrote updated endpoint to datastore ContainerID="d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" Namespace="calico-system" Pod="whisker-5ffdcf7949-lsgxd" WorkloadEndpoint="localhost-k8s-whisker--5ffdcf7949--lsgxd-eth0" Sep 9 23:48:31.561883 containerd[1495]: time="2025-09-09T23:48:31.561633885Z" level=info msg="connecting to shim d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f" address="unix:///run/containerd/s/549a1464b59282a99492ffbc9bcb6251ceff4e9d9da7ed0ae5845039f9ee082b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:48:31.595102 systemd[1]: Started cri-containerd-d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f.scope - libcontainer container d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f. Sep 9 23:48:31.610282 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 23:48:31.632700 containerd[1495]: time="2025-09-09T23:48:31.632584269Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5ffdcf7949-lsgxd,Uid:ddb775db-f136-4a84-829f-4c410bd5eea3,Namespace:calico-system,Attempt:0,} returns sandbox id \"d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f\"" Sep 9 23:48:31.639038 containerd[1495]: time="2025-09-09T23:48:31.638983216Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 23:48:31.715464 kubelet[2636]: I0909 23:48:31.712997 2636 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="895db007-8816-44f9-ad3c-3821dc38d527" path="/var/lib/kubelet/pods/895db007-8816-44f9-ad3c-3821dc38d527/volumes" Sep 9 23:48:32.698538 containerd[1495]: time="2025-09-09T23:48:32.698484453Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:32.699291 containerd[1495]: time="2025-09-09T23:48:32.699267780Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 9 23:48:32.700113 containerd[1495]: time="2025-09-09T23:48:32.700085272Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:32.702957 containerd[1495]: time="2025-09-09T23:48:32.702925251Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:32.703656 containerd[1495]: time="2025-09-09T23:48:32.703625324Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.064205636s" Sep 9 23:48:32.703728 containerd[1495]: time="2025-09-09T23:48:32.703658809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 9 23:48:32.705783 containerd[1495]: time="2025-09-09T23:48:32.705742026Z" level=info msg="CreateContainer within sandbox \"d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 23:48:32.717497 containerd[1495]: time="2025-09-09T23:48:32.717455039Z" level=info msg="Container 0b7738c3abad51ecd5247a056481539d17c86e1e528bb1e29ef68bedb7bf9d66: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:48:32.726866 containerd[1495]: time="2025-09-09T23:48:32.726789307Z" level=info msg="CreateContainer within sandbox \"d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f\" for 
&ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"0b7738c3abad51ecd5247a056481539d17c86e1e528bb1e29ef68bedb7bf9d66\"" Sep 9 23:48:32.727864 containerd[1495]: time="2025-09-09T23:48:32.727600238Z" level=info msg="StartContainer for \"0b7738c3abad51ecd5247a056481539d17c86e1e528bb1e29ef68bedb7bf9d66\"" Sep 9 23:48:32.742917 containerd[1495]: time="2025-09-09T23:48:32.742796613Z" level=info msg="connecting to shim 0b7738c3abad51ecd5247a056481539d17c86e1e528bb1e29ef68bedb7bf9d66" address="unix:///run/containerd/s/549a1464b59282a99492ffbc9bcb6251ceff4e9d9da7ed0ae5845039f9ee082b" protocol=ttrpc version=3 Sep 9 23:48:32.766047 systemd[1]: Started cri-containerd-0b7738c3abad51ecd5247a056481539d17c86e1e528bb1e29ef68bedb7bf9d66.scope - libcontainer container 0b7738c3abad51ecd5247a056481539d17c86e1e528bb1e29ef68bedb7bf9d66. Sep 9 23:48:32.812708 containerd[1495]: time="2025-09-09T23:48:32.812672584Z" level=info msg="StartContainer for \"0b7738c3abad51ecd5247a056481539d17c86e1e528bb1e29ef68bedb7bf9d66\" returns successfully" Sep 9 23:48:32.814494 containerd[1495]: time="2025-09-09T23:48:32.814456873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 23:48:32.857006 systemd-networkd[1418]: cali2d15d9cb7ac: Gained IPv6LL Sep 9 23:48:34.263663 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3140999806.mount: Deactivated successfully. 
Sep 9 23:48:34.504254 containerd[1495]: time="2025-09-09T23:48:34.504133184Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:34.544155 containerd[1495]: time="2025-09-09T23:48:34.543172492Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 9 23:48:34.584652 containerd[1495]: time="2025-09-09T23:48:34.584503549Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:34.602801 containerd[1495]: time="2025-09-09T23:48:34.602720685Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:34.603983 containerd[1495]: time="2025-09-09T23:48:34.603949792Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.789408467s" Sep 9 23:48:34.604046 containerd[1495]: time="2025-09-09T23:48:34.603989038Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 9 23:48:34.611435 containerd[1495]: time="2025-09-09T23:48:34.611186535Z" level=info msg="CreateContainer within sandbox \"d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 23:48:34.707241 
containerd[1495]: time="2025-09-09T23:48:34.707200764Z" level=info msg="Container 066577d905af5572422e5c97df776c0c1dbc4624b742ede96e516ef73e163380: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:48:34.798143 containerd[1495]: time="2025-09-09T23:48:34.797784606Z" level=info msg="CreateContainer within sandbox \"d4152257a9e25aaccf0717705556db955058dba6245a29dddb467591ff3cea6f\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"066577d905af5572422e5c97df776c0c1dbc4624b742ede96e516ef73e163380\"" Sep 9 23:48:34.798855 containerd[1495]: time="2025-09-09T23:48:34.798601450Z" level=info msg="StartContainer for \"066577d905af5572422e5c97df776c0c1dbc4624b742ede96e516ef73e163380\"" Sep 9 23:48:34.799761 containerd[1495]: time="2025-09-09T23:48:34.799730542Z" level=info msg="connecting to shim 066577d905af5572422e5c97df776c0c1dbc4624b742ede96e516ef73e163380" address="unix:///run/containerd/s/549a1464b59282a99492ffbc9bcb6251ceff4e9d9da7ed0ae5845039f9ee082b" protocol=ttrpc version=3 Sep 9 23:48:34.819144 systemd[1]: Started cri-containerd-066577d905af5572422e5c97df776c0c1dbc4624b742ede96e516ef73e163380.scope - libcontainer container 066577d905af5572422e5c97df776c0c1dbc4624b742ede96e516ef73e163380. 
Sep 9 23:48:34.909069 containerd[1495]: time="2025-09-09T23:48:34.908954144Z" level=info msg="StartContainer for \"066577d905af5572422e5c97df776c0c1dbc4624b742ede96e516ef73e163380\" returns successfully" Sep 9 23:48:35.722487 containerd[1495]: time="2025-09-09T23:48:35.722392351Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75db498865-vhqk2,Uid:0465cd3f-1694-44a6-a024-da1b1aa40cc9,Namespace:calico-system,Attempt:0,}" Sep 9 23:48:35.723233 containerd[1495]: time="2025-09-09T23:48:35.722853419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-srgvd,Uid:a3348f5c-2b82-408a-934b-6f961e1301f3,Namespace:kube-system,Attempt:0,}" Sep 9 23:48:35.896705 systemd-networkd[1418]: calic3492cc9ef6: Link UP Sep 9 23:48:35.896925 systemd-networkd[1418]: calic3492cc9ef6: Gained carrier Sep 9 23:48:35.913955 containerd[1495]: 2025-09-09 23:48:35.774 [INFO][4131] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:48:35.913955 containerd[1495]: 2025-09-09 23:48:35.794 [INFO][4131] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0 calico-kube-controllers-75db498865- calico-system 0465cd3f-1694-44a6-a024-da1b1aa40cc9 789 0 2025-09-09 23:48:16 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:75db498865 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-75db498865-vhqk2 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic3492cc9ef6 [] [] }} ContainerID="ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" Namespace="calico-system" Pod="calico-kube-controllers-75db498865-vhqk2" 
WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75db498865--vhqk2-" Sep 9 23:48:35.913955 containerd[1495]: 2025-09-09 23:48:35.794 [INFO][4131] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" Namespace="calico-system" Pod="calico-kube-controllers-75db498865-vhqk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0" Sep 9 23:48:35.913955 containerd[1495]: 2025-09-09 23:48:35.831 [INFO][4164] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" HandleID="k8s-pod-network.ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" Workload="localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0" Sep 9 23:48:35.914195 containerd[1495]: 2025-09-09 23:48:35.831 [INFO][4164] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" HandleID="k8s-pod-network.ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" Workload="localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dd600), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-75db498865-vhqk2", "timestamp":"2025-09-09 23:48:35.831786281 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:48:35.914195 containerd[1495]: 2025-09-09 23:48:35.832 [INFO][4164] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:48:35.914195 containerd[1495]: 2025-09-09 23:48:35.832 [INFO][4164] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:48:35.914195 containerd[1495]: 2025-09-09 23:48:35.832 [INFO][4164] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 23:48:35.914195 containerd[1495]: 2025-09-09 23:48:35.847 [INFO][4164] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" host="localhost" Sep 9 23:48:35.914195 containerd[1495]: 2025-09-09 23:48:35.858 [INFO][4164] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 23:48:35.914195 containerd[1495]: 2025-09-09 23:48:35.867 [INFO][4164] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 23:48:35.914195 containerd[1495]: 2025-09-09 23:48:35.872 [INFO][4164] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:35.914195 containerd[1495]: 2025-09-09 23:48:35.875 [INFO][4164] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:35.914195 containerd[1495]: 2025-09-09 23:48:35.875 [INFO][4164] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" host="localhost" Sep 9 23:48:35.914398 containerd[1495]: 2025-09-09 23:48:35.877 [INFO][4164] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1 Sep 9 23:48:35.914398 containerd[1495]: 2025-09-09 23:48:35.883 [INFO][4164] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" host="localhost" Sep 9 23:48:35.914398 containerd[1495]: 2025-09-09 23:48:35.892 [INFO][4164] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" host="localhost" Sep 9 23:48:35.914398 containerd[1495]: 2025-09-09 23:48:35.892 [INFO][4164] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" host="localhost" Sep 9 23:48:35.914398 containerd[1495]: 2025-09-09 23:48:35.892 [INFO][4164] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:48:35.914398 containerd[1495]: 2025-09-09 23:48:35.892 [INFO][4164] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" HandleID="k8s-pod-network.ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" Workload="localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0" Sep 9 23:48:35.914525 containerd[1495]: 2025-09-09 23:48:35.894 [INFO][4131] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" Namespace="calico-system" Pod="calico-kube-controllers-75db498865-vhqk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0", GenerateName:"calico-kube-controllers-75db498865-", Namespace:"calico-system", SelfLink:"", UID:"0465cd3f-1694-44a6-a024-da1b1aa40cc9", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"75db498865", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-75db498865-vhqk2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic3492cc9ef6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:35.914599 containerd[1495]: 2025-09-09 23:48:35.895 [INFO][4131] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" Namespace="calico-system" Pod="calico-kube-controllers-75db498865-vhqk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0" Sep 9 23:48:35.914599 containerd[1495]: 2025-09-09 23:48:35.895 [INFO][4131] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic3492cc9ef6 ContainerID="ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" Namespace="calico-system" Pod="calico-kube-controllers-75db498865-vhqk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0" Sep 9 23:48:35.914599 containerd[1495]: 2025-09-09 23:48:35.897 [INFO][4131] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" Namespace="calico-system" Pod="calico-kube-controllers-75db498865-vhqk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0" Sep 9 23:48:35.914676 containerd[1495]: 2025-09-09 
23:48:35.897 [INFO][4131] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" Namespace="calico-system" Pod="calico-kube-controllers-75db498865-vhqk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0", GenerateName:"calico-kube-controllers-75db498865-", Namespace:"calico-system", SelfLink:"", UID:"0465cd3f-1694-44a6-a024-da1b1aa40cc9", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"75db498865", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1", Pod:"calico-kube-controllers-75db498865-vhqk2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic3492cc9ef6", MAC:"2a:da:d5:a2:d6:ae", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:35.914731 containerd[1495]: 2025-09-09 
23:48:35.909 [INFO][4131] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" Namespace="calico-system" Pod="calico-kube-controllers-75db498865-vhqk2" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75db498865--vhqk2-eth0" Sep 9 23:48:35.947896 kubelet[2636]: I0909 23:48:35.945464 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5ffdcf7949-lsgxd" podStartSLOduration=2.971152627 podStartE2EDuration="5.940282318s" podCreationTimestamp="2025-09-09 23:48:30 +0000 UTC" firstStartedPulling="2025-09-09 23:48:31.636207513 +0000 UTC m=+36.025511253" lastFinishedPulling="2025-09-09 23:48:34.605337204 +0000 UTC m=+38.994640944" observedRunningTime="2025-09-09 23:48:35.93928313 +0000 UTC m=+40.328586870" watchObservedRunningTime="2025-09-09 23:48:35.940282318 +0000 UTC m=+40.329586058" Sep 9 23:48:35.968814 containerd[1495]: time="2025-09-09T23:48:35.968520862Z" level=info msg="connecting to shim ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1" address="unix:///run/containerd/s/788cc4976cbf64e1a216c25f60298ee342a565dbbd0f770c49f3140d339cd08b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:48:36.010106 systemd[1]: Started cri-containerd-ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1.scope - libcontainer container ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1. 
Sep 9 23:48:36.024034 systemd-networkd[1418]: cali3f52e8d7568: Link UP Sep 9 23:48:36.024630 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 23:48:36.025307 systemd-networkd[1418]: cali3f52e8d7568: Gained carrier Sep 9 23:48:36.048872 containerd[1495]: 2025-09-09 23:48:35.780 [INFO][4141] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:48:36.048872 containerd[1495]: 2025-09-09 23:48:35.799 [INFO][4141] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--srgvd-eth0 coredns-668d6bf9bc- kube-system a3348f5c-2b82-408a-934b-6f961e1301f3 780 0 2025-09-09 23:48:03 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-srgvd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3f52e8d7568 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" Namespace="kube-system" Pod="coredns-668d6bf9bc-srgvd" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--srgvd-" Sep 9 23:48:36.048872 containerd[1495]: 2025-09-09 23:48:35.800 [INFO][4141] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" Namespace="kube-system" Pod="coredns-668d6bf9bc-srgvd" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--srgvd-eth0" Sep 9 23:48:36.048872 containerd[1495]: 2025-09-09 23:48:35.834 [INFO][4158] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" HandleID="k8s-pod-network.3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" 
Workload="localhost-k8s-coredns--668d6bf9bc--srgvd-eth0" Sep 9 23:48:36.049121 containerd[1495]: 2025-09-09 23:48:35.834 [INFO][4158] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" HandleID="k8s-pod-network.3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" Workload="localhost-k8s-coredns--668d6bf9bc--srgvd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3600), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-srgvd", "timestamp":"2025-09-09 23:48:35.834542809 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:48:36.049121 containerd[1495]: 2025-09-09 23:48:35.834 [INFO][4158] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:48:36.049121 containerd[1495]: 2025-09-09 23:48:35.892 [INFO][4158] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:48:36.049121 containerd[1495]: 2025-09-09 23:48:35.892 [INFO][4158] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 23:48:36.049121 containerd[1495]: 2025-09-09 23:48:35.952 [INFO][4158] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" host="localhost" Sep 9 23:48:36.049121 containerd[1495]: 2025-09-09 23:48:35.971 [INFO][4158] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 23:48:36.049121 containerd[1495]: 2025-09-09 23:48:35.982 [INFO][4158] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 23:48:36.049121 containerd[1495]: 2025-09-09 23:48:35.989 [INFO][4158] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:36.049121 containerd[1495]: 2025-09-09 23:48:35.994 [INFO][4158] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:36.049121 containerd[1495]: 2025-09-09 23:48:35.994 [INFO][4158] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" host="localhost" Sep 9 23:48:36.049438 containerd[1495]: 2025-09-09 23:48:35.998 [INFO][4158] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03 Sep 9 23:48:36.049438 containerd[1495]: 2025-09-09 23:48:36.007 [INFO][4158] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" host="localhost" Sep 9 23:48:36.049438 containerd[1495]: 2025-09-09 23:48:36.017 [INFO][4158] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" host="localhost" Sep 9 23:48:36.049438 containerd[1495]: 2025-09-09 23:48:36.017 [INFO][4158] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" host="localhost" Sep 9 23:48:36.049438 containerd[1495]: 2025-09-09 23:48:36.017 [INFO][4158] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:48:36.049438 containerd[1495]: 2025-09-09 23:48:36.017 [INFO][4158] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" HandleID="k8s-pod-network.3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" Workload="localhost-k8s-coredns--668d6bf9bc--srgvd-eth0" Sep 9 23:48:36.049573 containerd[1495]: 2025-09-09 23:48:36.021 [INFO][4141] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" Namespace="kube-system" Pod="coredns-668d6bf9bc-srgvd" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--srgvd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--srgvd-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a3348f5c-2b82-408a-934b-6f961e1301f3", ResourceVersion:"780", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-srgvd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3f52e8d7568", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:36.049703 containerd[1495]: 2025-09-09 23:48:36.021 [INFO][4141] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" Namespace="kube-system" Pod="coredns-668d6bf9bc-srgvd" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--srgvd-eth0" Sep 9 23:48:36.049703 containerd[1495]: 2025-09-09 23:48:36.021 [INFO][4141] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3f52e8d7568 ContainerID="3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" Namespace="kube-system" Pod="coredns-668d6bf9bc-srgvd" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--srgvd-eth0" Sep 9 23:48:36.049703 containerd[1495]: 2025-09-09 23:48:36.026 [INFO][4141] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" Namespace="kube-system" Pod="coredns-668d6bf9bc-srgvd" 
WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--srgvd-eth0" Sep 9 23:48:36.049767 containerd[1495]: 2025-09-09 23:48:36.026 [INFO][4141] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" Namespace="kube-system" Pod="coredns-668d6bf9bc-srgvd" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--srgvd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--srgvd-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a3348f5c-2b82-408a-934b-6f961e1301f3", ResourceVersion:"780", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03", Pod:"coredns-668d6bf9bc-srgvd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3f52e8d7568", MAC:"9a:39:90:8a:70:9a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:36.049767 containerd[1495]: 2025-09-09 23:48:36.043 [INFO][4141] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" Namespace="kube-system" Pod="coredns-668d6bf9bc-srgvd" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--srgvd-eth0" Sep 9 23:48:36.089595 containerd[1495]: time="2025-09-09T23:48:36.089042776Z" level=info msg="connecting to shim 3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03" address="unix:///run/containerd/s/428bb439bfe0175f1c5ede14a0089dd92e16734e155b6933f82d86f1cce0cc9c" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:48:36.096841 containerd[1495]: time="2025-09-09T23:48:36.096779812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75db498865-vhqk2,Uid:0465cd3f-1694-44a6-a024-da1b1aa40cc9,Namespace:calico-system,Attempt:0,} returns sandbox id \"ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1\"" Sep 9 23:48:36.098419 containerd[1495]: time="2025-09-09T23:48:36.098387004Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 23:48:36.122057 systemd[1]: Started cri-containerd-3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03.scope - libcontainer container 3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03. 
Sep 9 23:48:36.135470 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 23:48:36.159863 containerd[1495]: time="2025-09-09T23:48:36.159798143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-srgvd,Uid:a3348f5c-2b82-408a-934b-6f961e1301f3,Namespace:kube-system,Attempt:0,} returns sandbox id \"3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03\"" Sep 9 23:48:36.164372 containerd[1495]: time="2025-09-09T23:48:36.164327156Z" level=info msg="CreateContainer within sandbox \"3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 23:48:36.175892 containerd[1495]: time="2025-09-09T23:48:36.175846098Z" level=info msg="Container 8e1c1eba3861ef2379b2ca1564750713136a2501ce897af75c0faef65ad6045c: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:48:36.185323 containerd[1495]: time="2025-09-09T23:48:36.185269137Z" level=info msg="CreateContainer within sandbox \"3b5f075d9c7f9e72de936300f6e1e0624a94de8e3b62f255b1c84dda14481c03\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8e1c1eba3861ef2379b2ca1564750713136a2501ce897af75c0faef65ad6045c\"" Sep 9 23:48:36.187546 containerd[1495]: time="2025-09-09T23:48:36.186175508Z" level=info msg="StartContainer for \"8e1c1eba3861ef2379b2ca1564750713136a2501ce897af75c0faef65ad6045c\"" Sep 9 23:48:36.187546 containerd[1495]: time="2025-09-09T23:48:36.187360639Z" level=info msg="connecting to shim 8e1c1eba3861ef2379b2ca1564750713136a2501ce897af75c0faef65ad6045c" address="unix:///run/containerd/s/428bb439bfe0175f1c5ede14a0089dd92e16734e155b6933f82d86f1cce0cc9c" protocol=ttrpc version=3 Sep 9 23:48:36.218048 systemd[1]: Started cri-containerd-8e1c1eba3861ef2379b2ca1564750713136a2501ce897af75c0faef65ad6045c.scope - libcontainer container 8e1c1eba3861ef2379b2ca1564750713136a2501ce897af75c0faef65ad6045c. 
Sep 9 23:48:36.252591 containerd[1495]: time="2025-09-09T23:48:36.252536081Z" level=info msg="StartContainer for \"8e1c1eba3861ef2379b2ca1564750713136a2501ce897af75c0faef65ad6045c\" returns successfully" Sep 9 23:48:36.708632 containerd[1495]: time="2025-09-09T23:48:36.708587271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hj5xg,Uid:c8a833a6-a0f5-4e14-93ab-d38231856daa,Namespace:kube-system,Attempt:0,}" Sep 9 23:48:36.709237 containerd[1495]: time="2025-09-09T23:48:36.709089864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-8cmhs,Uid:2e1f7856-288c-49e6-9aa8-fd07a3af9e50,Namespace:calico-system,Attempt:0,}" Sep 9 23:48:36.909640 systemd-networkd[1418]: calie8f5055e48a: Link UP Sep 9 23:48:36.910055 systemd-networkd[1418]: calie8f5055e48a: Gained carrier Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.768 [INFO][4338] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.796 [INFO][4338] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--8cmhs-eth0 goldmane-54d579b49d- calico-system 2e1f7856-288c-49e6-9aa8-fd07a3af9e50 788 0 2025-09-09 23:48:15 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-8cmhs eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calie8f5055e48a [] [] }} ContainerID="5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" Namespace="calico-system" Pod="goldmane-54d579b49d-8cmhs" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--8cmhs-" Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.796 [INFO][4338] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" Namespace="calico-system" Pod="goldmane-54d579b49d-8cmhs" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--8cmhs-eth0" Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.845 [INFO][4369] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" HandleID="k8s-pod-network.5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" Workload="localhost-k8s-goldmane--54d579b49d--8cmhs-eth0" Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.845 [INFO][4369] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" HandleID="k8s-pod-network.5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" Workload="localhost-k8s-goldmane--54d579b49d--8cmhs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3630), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-8cmhs", "timestamp":"2025-09-09 23:48:36.845462217 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.845 [INFO][4369] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.845 [INFO][4369] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.845 [INFO][4369] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.858 [INFO][4369] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" host="localhost" Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.867 [INFO][4369] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.873 [INFO][4369] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.876 [INFO][4369] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.879 [INFO][4369] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.879 [INFO][4369] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" host="localhost" Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.886 [INFO][4369] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.892 [INFO][4369] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" host="localhost" Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.902 [INFO][4369] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" host="localhost" Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.902 [INFO][4369] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" host="localhost" Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.902 [INFO][4369] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:48:36.934303 containerd[1495]: 2025-09-09 23:48:36.902 [INFO][4369] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" HandleID="k8s-pod-network.5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" Workload="localhost-k8s-goldmane--54d579b49d--8cmhs-eth0" Sep 9 23:48:36.937270 containerd[1495]: 2025-09-09 23:48:36.907 [INFO][4338] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" Namespace="calico-system" Pod="goldmane-54d579b49d-8cmhs" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--8cmhs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--8cmhs-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"2e1f7856-288c-49e6-9aa8-fd07a3af9e50", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-8cmhs", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie8f5055e48a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:36.937270 containerd[1495]: 2025-09-09 23:48:36.907 [INFO][4338] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" Namespace="calico-system" Pod="goldmane-54d579b49d-8cmhs" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--8cmhs-eth0" Sep 9 23:48:36.937270 containerd[1495]: 2025-09-09 23:48:36.907 [INFO][4338] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie8f5055e48a ContainerID="5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" Namespace="calico-system" Pod="goldmane-54d579b49d-8cmhs" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--8cmhs-eth0" Sep 9 23:48:36.937270 containerd[1495]: 2025-09-09 23:48:36.910 [INFO][4338] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" Namespace="calico-system" Pod="goldmane-54d579b49d-8cmhs" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--8cmhs-eth0" Sep 9 23:48:36.937270 containerd[1495]: 2025-09-09 23:48:36.910 [INFO][4338] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" Namespace="calico-system" Pod="goldmane-54d579b49d-8cmhs" 
WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--8cmhs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--8cmhs-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"2e1f7856-288c-49e6-9aa8-fd07a3af9e50", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d", Pod:"goldmane-54d579b49d-8cmhs", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie8f5055e48a", MAC:"ea:bb:1f:28:d1:e4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:36.937270 containerd[1495]: 2025-09-09 23:48:36.928 [INFO][4338] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" Namespace="calico-system" Pod="goldmane-54d579b49d-8cmhs" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--8cmhs-eth0" Sep 9 23:48:36.975587 containerd[1495]: time="2025-09-09T23:48:36.975470772Z" level=info msg="connecting to shim 
5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d" address="unix:///run/containerd/s/900b76a20109b32c925dc614926a5e011922564d93ac80c643adb272379362e0" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:48:36.987170 kubelet[2636]: I0909 23:48:36.986606 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-srgvd" podStartSLOduration=33.986585135 podStartE2EDuration="33.986585135s" podCreationTimestamp="2025-09-09 23:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:48:36.960355111 +0000 UTC m=+41.349658891" watchObservedRunningTime="2025-09-09 23:48:36.986585135 +0000 UTC m=+41.375888875" Sep 9 23:48:37.013268 systemd[1]: Started cri-containerd-5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d.scope - libcontainer container 5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d. Sep 9 23:48:37.033063 systemd-networkd[1418]: cali021e3e85bc3: Link UP Sep 9 23:48:37.034378 systemd-networkd[1418]: cali021e3e85bc3: Gained carrier Sep 9 23:48:37.036395 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:36.761 [INFO][4328] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:36.792 [INFO][4328] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0 coredns-668d6bf9bc- kube-system c8a833a6-a0f5-4e14-93ab-d38231856daa 785 0 2025-09-09 23:48:03 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-hj5xg eth0 coredns [] [] 
[kns.kube-system ksa.kube-system.coredns] cali021e3e85bc3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hj5xg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hj5xg-" Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:36.792 [INFO][4328] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hj5xg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0" Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:36.860 [INFO][4375] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" HandleID="k8s-pod-network.da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" Workload="localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0" Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:36.861 [INFO][4375] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" HandleID="k8s-pod-network.da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" Workload="localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000500be0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-hj5xg", "timestamp":"2025-09-09 23:48:36.8600436 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:36.861 [INFO][4375] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:36.902 [INFO][4375] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:36.903 [INFO][4375] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:36.960 [INFO][4375] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" host="localhost" Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:36.976 [INFO][4375] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:36.989 [INFO][4375] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:36.993 [INFO][4375] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:37.002 [INFO][4375] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:37.002 [INFO][4375] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" host="localhost" Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:37.006 [INFO][4375] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:37.012 [INFO][4375] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" host="localhost" Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:37.022 [INFO][4375] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" host="localhost" Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:37.022 [INFO][4375] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" host="localhost" Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:37.022 [INFO][4375] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:48:37.049994 containerd[1495]: 2025-09-09 23:48:37.022 [INFO][4375] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" HandleID="k8s-pod-network.da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" Workload="localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0" Sep 9 23:48:37.050811 containerd[1495]: 2025-09-09 23:48:37.027 [INFO][4328] cni-plugin/k8s.go 418: Populated endpoint ContainerID="da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hj5xg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"c8a833a6-a0f5-4e14-93ab-d38231856daa", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, 
Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-hj5xg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali021e3e85bc3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:37.050811 containerd[1495]: 2025-09-09 23:48:37.027 [INFO][4328] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hj5xg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0" Sep 9 23:48:37.050811 containerd[1495]: 2025-09-09 23:48:37.027 [INFO][4328] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali021e3e85bc3 ContainerID="da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hj5xg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0" Sep 9 23:48:37.050811 containerd[1495]: 2025-09-09 23:48:37.035 [INFO][4328] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" 
Namespace="kube-system" Pod="coredns-668d6bf9bc-hj5xg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0" Sep 9 23:48:37.050811 containerd[1495]: 2025-09-09 23:48:37.036 [INFO][4328] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hj5xg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"c8a833a6-a0f5-4e14-93ab-d38231856daa", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d", Pod:"coredns-668d6bf9bc-hj5xg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali021e3e85bc3", MAC:"ea:a7:35:e5:6b:ed", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:37.050811 containerd[1495]: 2025-09-09 23:48:37.047 [INFO][4328] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hj5xg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hj5xg-eth0" Sep 9 23:48:37.079899 containerd[1495]: time="2025-09-09T23:48:37.079822055Z" level=info msg="connecting to shim da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d" address="unix:///run/containerd/s/b449b77191fb1c23416571b2f1e0789d414196dec053a41c3a73e547f7be517f" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:48:37.089258 containerd[1495]: time="2025-09-09T23:48:37.089213495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-8cmhs,Uid:2e1f7856-288c-49e6-9aa8-fd07a3af9e50,Namespace:calico-system,Attempt:0,} returns sandbox id \"5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d\"" Sep 9 23:48:37.111074 systemd[1]: Started cri-containerd-da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d.scope - libcontainer container da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d. 
Sep 9 23:48:37.129140 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 23:48:37.157804 containerd[1495]: time="2025-09-09T23:48:37.157761812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hj5xg,Uid:c8a833a6-a0f5-4e14-93ab-d38231856daa,Namespace:kube-system,Attempt:0,} returns sandbox id \"da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d\"" Sep 9 23:48:37.163759 containerd[1495]: time="2025-09-09T23:48:37.163615435Z" level=info msg="CreateContainer within sandbox \"da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 23:48:37.231148 containerd[1495]: time="2025-09-09T23:48:37.230777477Z" level=info msg="Container 9ff4fb84a69ec570055eed38c4d4e4359f98125ba414ee105a930b7a983941d9: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:48:37.239184 containerd[1495]: time="2025-09-09T23:48:37.239142573Z" level=info msg="CreateContainer within sandbox \"da6c54241d92006694ca7a12b68ddabbd26599baceb176bde55b38f1642c383d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9ff4fb84a69ec570055eed38c4d4e4359f98125ba414ee105a930b7a983941d9\"" Sep 9 23:48:37.249075 containerd[1495]: time="2025-09-09T23:48:37.249037644Z" level=info msg="StartContainer for \"9ff4fb84a69ec570055eed38c4d4e4359f98125ba414ee105a930b7a983941d9\"" Sep 9 23:48:37.250212 containerd[1495]: time="2025-09-09T23:48:37.250179804Z" level=info msg="connecting to shim 9ff4fb84a69ec570055eed38c4d4e4359f98125ba414ee105a930b7a983941d9" address="unix:///run/containerd/s/b449b77191fb1c23416571b2f1e0789d414196dec053a41c3a73e547f7be517f" protocol=ttrpc version=3 Sep 9 23:48:37.282138 systemd[1]: Started cri-containerd-9ff4fb84a69ec570055eed38c4d4e4359f98125ba414ee105a930b7a983941d9.scope - libcontainer container 9ff4fb84a69ec570055eed38c4d4e4359f98125ba414ee105a930b7a983941d9. 
Sep 9 23:48:37.376972 containerd[1495]: time="2025-09-09T23:48:37.376933224Z" level=info msg="StartContainer for \"9ff4fb84a69ec570055eed38c4d4e4359f98125ba414ee105a930b7a983941d9\" returns successfully" Sep 9 23:48:37.657394 systemd-networkd[1418]: calic3492cc9ef6: Gained IPv6LL Sep 9 23:48:37.922976 containerd[1495]: time="2025-09-09T23:48:37.921968367Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:37.925019 containerd[1495]: time="2025-09-09T23:48:37.924966308Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 9 23:48:37.926419 containerd[1495]: time="2025-09-09T23:48:37.926374026Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:37.931357 containerd[1495]: time="2025-09-09T23:48:37.931321081Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:37.933736 containerd[1495]: time="2025-09-09T23:48:37.933575838Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 1.835048095s" Sep 9 23:48:37.933736 containerd[1495]: time="2025-09-09T23:48:37.933621805Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 9 
23:48:37.937502 containerd[1495]: time="2025-09-09T23:48:37.937053687Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 23:48:37.954206 containerd[1495]: time="2025-09-09T23:48:37.954006271Z" level=info msg="CreateContainer within sandbox \"ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 23:48:37.977283 systemd-networkd[1418]: cali3f52e8d7568: Gained IPv6LL Sep 9 23:48:38.003496 containerd[1495]: time="2025-09-09T23:48:38.003444852Z" level=info msg="Container 0b75b8625a8d162f815d4cea523aadba3d1657208f4aec64a9cb3492b401e194: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:48:38.016569 containerd[1495]: time="2025-09-09T23:48:38.016514924Z" level=info msg="CreateContainer within sandbox \"ca0ea17281275db4a062c954bef07afb3369ee51d608f67afc43efd000a0eae1\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"0b75b8625a8d162f815d4cea523aadba3d1657208f4aec64a9cb3492b401e194\"" Sep 9 23:48:38.017253 containerd[1495]: time="2025-09-09T23:48:38.017200898Z" level=info msg="StartContainer for \"0b75b8625a8d162f815d4cea523aadba3d1657208f4aec64a9cb3492b401e194\"" Sep 9 23:48:38.019433 containerd[1495]: time="2025-09-09T23:48:38.019383438Z" level=info msg="connecting to shim 0b75b8625a8d162f815d4cea523aadba3d1657208f4aec64a9cb3492b401e194" address="unix:///run/containerd/s/788cc4976cbf64e1a216c25f60298ee342a565dbbd0f770c49f3140d339cd08b" protocol=ttrpc version=3 Sep 9 23:48:38.049068 systemd[1]: Started cri-containerd-0b75b8625a8d162f815d4cea523aadba3d1657208f4aec64a9cb3492b401e194.scope - libcontainer container 0b75b8625a8d162f815d4cea523aadba3d1657208f4aec64a9cb3492b401e194. 
Sep 9 23:48:38.118996 containerd[1495]: time="2025-09-09T23:48:38.118957373Z" level=info msg="StartContainer for \"0b75b8625a8d162f815d4cea523aadba3d1657208f4aec64a9cb3492b401e194\" returns successfully" Sep 9 23:48:38.169355 systemd-networkd[1418]: calie8f5055e48a: Gained IPv6LL Sep 9 23:48:38.525919 systemd[1]: Started sshd@7-10.0.0.67:22-10.0.0.1:52810.service - OpenSSH per-connection server daemon (10.0.0.1:52810). Sep 9 23:48:38.590707 sshd[4608]: Accepted publickey for core from 10.0.0.1 port 52810 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY Sep 9 23:48:38.592298 sshd-session[4608]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:48:38.596618 systemd-logind[1477]: New session 8 of user core. Sep 9 23:48:38.605033 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 9 23:48:38.708526 containerd[1495]: time="2025-09-09T23:48:38.708478138Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f49d6c46-wrgjk,Uid:3839dd2f-45e7-4d5c-bf16-5e360f2b81f3,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:48:38.808954 systemd-networkd[1418]: cali021e3e85bc3: Gained IPv6LL Sep 9 23:48:38.830861 sshd[4611]: Connection closed by 10.0.0.1 port 52810 Sep 9 23:48:38.831061 sshd-session[4608]: pam_unix(sshd:session): session closed for user core Sep 9 23:48:38.835488 systemd[1]: sshd@7-10.0.0.67:22-10.0.0.1:52810.service: Deactivated successfully. Sep 9 23:48:38.837309 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 23:48:38.838573 systemd-logind[1477]: Session 8 logged out. Waiting for processes to exit. Sep 9 23:48:38.840695 systemd-logind[1477]: Removed session 8. 
Sep 9 23:48:38.862874 systemd-networkd[1418]: calib7a4af29618: Link UP Sep 9 23:48:38.863551 systemd-networkd[1418]: calib7a4af29618: Gained carrier Sep 9 23:48:38.877021 kubelet[2636]: I0909 23:48:38.876934 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-hj5xg" podStartSLOduration=35.876911636 podStartE2EDuration="35.876911636s" podCreationTimestamp="2025-09-09 23:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:48:37.98003817 +0000 UTC m=+42.369341990" watchObservedRunningTime="2025-09-09 23:48:38.876911636 +0000 UTC m=+43.266215376" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.742 [INFO][4622] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.765 [INFO][4622] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0 calico-apiserver-59f49d6c46- calico-apiserver 3839dd2f-45e7-4d5c-bf16-5e360f2b81f3 792 0 2025-09-09 23:48:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59f49d6c46 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-59f49d6c46-wrgjk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib7a4af29618 [] [] }} ContainerID="ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-wrgjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.765 [INFO][4622] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-wrgjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.798 [INFO][4636] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" HandleID="k8s-pod-network.ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" Workload="localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.799 [INFO][4636] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" HandleID="k8s-pod-network.ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" Workload="localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d790), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-59f49d6c46-wrgjk", "timestamp":"2025-09-09 23:48:38.79869507 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.799 [INFO][4636] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.799 [INFO][4636] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.799 [INFO][4636] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.811 [INFO][4636] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" host="localhost" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.816 [INFO][4636] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.823 [INFO][4636] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.826 [INFO][4636] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.830 [INFO][4636] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.830 [INFO][4636] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" host="localhost" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.832 [INFO][4636] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36 Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.840 [INFO][4636] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" host="localhost" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.852 [INFO][4636] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" host="localhost" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.853 [INFO][4636] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" host="localhost" Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.853 [INFO][4636] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:48:38.879736 containerd[1495]: 2025-09-09 23:48:38.853 [INFO][4636] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" HandleID="k8s-pod-network.ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" Workload="localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0" Sep 9 23:48:38.880277 containerd[1495]: 2025-09-09 23:48:38.860 [INFO][4622] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-wrgjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0", GenerateName:"calico-apiserver-59f49d6c46-", Namespace:"calico-apiserver", SelfLink:"", UID:"3839dd2f-45e7-4d5c-bf16-5e360f2b81f3", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59f49d6c46", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-59f49d6c46-wrgjk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib7a4af29618", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:38.880277 containerd[1495]: 2025-09-09 23:48:38.860 [INFO][4622] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-wrgjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0" Sep 9 23:48:38.880277 containerd[1495]: 2025-09-09 23:48:38.860 [INFO][4622] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib7a4af29618 ContainerID="ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-wrgjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0" Sep 9 23:48:38.880277 containerd[1495]: 2025-09-09 23:48:38.864 [INFO][4622] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-wrgjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0" Sep 9 23:48:38.880277 containerd[1495]: 2025-09-09 23:48:38.864 [INFO][4622] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-wrgjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0", GenerateName:"calico-apiserver-59f49d6c46-", Namespace:"calico-apiserver", SelfLink:"", UID:"3839dd2f-45e7-4d5c-bf16-5e360f2b81f3", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59f49d6c46", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36", Pod:"calico-apiserver-59f49d6c46-wrgjk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib7a4af29618", MAC:"b6:90:bb:63:78:ec", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:38.880277 containerd[1495]: 2025-09-09 23:48:38.876 [INFO][4622] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-wrgjk" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--wrgjk-eth0" Sep 9 23:48:38.953028 containerd[1495]: time="2025-09-09T23:48:38.952962786Z" level=info msg="connecting to shim ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36" address="unix:///run/containerd/s/dffc07f8f05822e1fc5a84d07c988ed2ae991fcb8630661a5639586981402003" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:48:38.966233 kubelet[2636]: I0909 23:48:38.965329 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-75db498865-vhqk2" podStartSLOduration=21.127542321 podStartE2EDuration="22.965302318s" podCreationTimestamp="2025-09-09 23:48:16 +0000 UTC" firstStartedPulling="2025-09-09 23:48:36.098161931 +0000 UTC m=+40.487465631" lastFinishedPulling="2025-09-09 23:48:37.935921888 +0000 UTC m=+42.325225628" observedRunningTime="2025-09-09 23:48:38.964746042 +0000 UTC m=+43.354049782" watchObservedRunningTime="2025-09-09 23:48:38.965302318 +0000 UTC m=+43.354606058" Sep 9 23:48:39.019100 systemd[1]: Started cri-containerd-ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36.scope - libcontainer container ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36. 
Sep 9 23:48:39.058124 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 23:48:39.292343 containerd[1495]: time="2025-09-09T23:48:39.292296459Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0b75b8625a8d162f815d4cea523aadba3d1657208f4aec64a9cb3492b401e194\" id:\"45b6ddba78f10a0509a4354685cc0d67e68e25718b8070b310acd92f4d5af9aa\" pid:4740 exited_at:{seconds:1757461719 nanos:291737704}" Sep 9 23:48:39.299217 containerd[1495]: time="2025-09-09T23:48:39.299080928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f49d6c46-wrgjk,Uid:3839dd2f-45e7-4d5c-bf16-5e360f2b81f3,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36\"" Sep 9 23:48:39.643142 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2626752202.mount: Deactivated successfully. Sep 9 23:48:39.709287 containerd[1495]: time="2025-09-09T23:48:39.709008939Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9wzpb,Uid:22c99b86-d4b0-412a-a0e1-8a13f3d0c130,Namespace:calico-system,Attempt:0,}" Sep 9 23:48:39.709588 containerd[1495]: time="2025-09-09T23:48:39.709550452Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f49d6c46-t45hn,Uid:6cf560ed-9d34-4085-8953-98c1c1f9b412,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:48:39.907307 systemd-networkd[1418]: cali5e6e8980b5a: Link UP Sep 9 23:48:39.907469 systemd-networkd[1418]: cali5e6e8980b5a: Gained carrier Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.777 [INFO][4760] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.796 [INFO][4760] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0 calico-apiserver-59f49d6c46- 
calico-apiserver 6cf560ed-9d34-4085-8953-98c1c1f9b412 790 0 2025-09-09 23:48:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59f49d6c46 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-59f49d6c46-t45hn eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5e6e8980b5a [] [] }} ContainerID="128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-t45hn" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--t45hn-" Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.796 [INFO][4760] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-t45hn" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0" Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.825 [INFO][4791] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" HandleID="k8s-pod-network.128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" Workload="localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0" Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.825 [INFO][4791] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" HandleID="k8s-pod-network.128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" Workload="localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c680), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", 
"pod":"calico-apiserver-59f49d6c46-t45hn", "timestamp":"2025-09-09 23:48:39.825052518 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.825 [INFO][4791] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.825 [INFO][4791] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.825 [INFO][4791] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.843 [INFO][4791] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" host="localhost" Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.848 [INFO][4791] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.853 [INFO][4791] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.856 [INFO][4791] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.862 [INFO][4791] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.862 [INFO][4791] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" host="localhost" Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.863 [INFO][4791] ipam/ipam.go 1764: 
Creating new handle: k8s-pod-network.128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.879 [INFO][4791] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" host="localhost" Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.900 [INFO][4791] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" host="localhost" Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.900 [INFO][4791] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" host="localhost" Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.900 [INFO][4791] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 23:48:39.950596 containerd[1495]: 2025-09-09 23:48:39.900 [INFO][4791] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" HandleID="k8s-pod-network.128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" Workload="localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0" Sep 9 23:48:39.951368 containerd[1495]: 2025-09-09 23:48:39.905 [INFO][4760] cni-plugin/k8s.go 418: Populated endpoint ContainerID="128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-t45hn" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0", GenerateName:"calico-apiserver-59f49d6c46-", Namespace:"calico-apiserver", SelfLink:"", UID:"6cf560ed-9d34-4085-8953-98c1c1f9b412", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59f49d6c46", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-59f49d6c46-t45hn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5e6e8980b5a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:39.951368 containerd[1495]: 2025-09-09 23:48:39.905 [INFO][4760] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-t45hn" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0" Sep 9 23:48:39.951368 containerd[1495]: 2025-09-09 23:48:39.905 [INFO][4760] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5e6e8980b5a ContainerID="128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-t45hn" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0" Sep 9 23:48:39.951368 containerd[1495]: 2025-09-09 23:48:39.907 [INFO][4760] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-t45hn" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0" Sep 9 23:48:39.951368 containerd[1495]: 2025-09-09 23:48:39.908 [INFO][4760] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-t45hn" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0", 
GenerateName:"calico-apiserver-59f49d6c46-", Namespace:"calico-apiserver", SelfLink:"", UID:"6cf560ed-9d34-4085-8953-98c1c1f9b412", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59f49d6c46", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b", Pod:"calico-apiserver-59f49d6c46-t45hn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5e6e8980b5a", MAC:"e2:01:a2:be:76:36", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:39.951368 containerd[1495]: 2025-09-09 23:48:39.944 [INFO][4760] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" Namespace="calico-apiserver" Pod="calico-apiserver-59f49d6c46-t45hn" WorkloadEndpoint="localhost-k8s-calico--apiserver--59f49d6c46--t45hn-eth0" Sep 9 23:48:40.038315 systemd-networkd[1418]: cali7794bd863a9: Link UP Sep 9 23:48:40.038728 systemd-networkd[1418]: cali7794bd863a9: Gained carrier Sep 9 23:48:40.047840 containerd[1495]: time="2025-09-09T23:48:40.046984256Z" level=info msg="connecting to shim 
128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b" address="unix:///run/containerd/s/eba9d0fea44ebbbbcd9aed8d7f3d6707f008d7c607924e5b63a20caa77d00d1f" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:39.777 [INFO][4775] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:39.796 [INFO][4775] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--9wzpb-eth0 csi-node-driver- calico-system 22c99b86-d4b0-412a-a0e1-8a13f3d0c130 686 0 2025-09-09 23:48:16 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-9wzpb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali7794bd863a9 [] [] }} ContainerID="8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" Namespace="calico-system" Pod="csi-node-driver-9wzpb" WorkloadEndpoint="localhost-k8s-csi--node--driver--9wzpb-" Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:39.796 [INFO][4775] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" Namespace="calico-system" Pod="csi-node-driver-9wzpb" WorkloadEndpoint="localhost-k8s-csi--node--driver--9wzpb-eth0" Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:39.835 [INFO][4797] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" HandleID="k8s-pod-network.8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" 
Workload="localhost-k8s-csi--node--driver--9wzpb-eth0" Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:39.836 [INFO][4797] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" HandleID="k8s-pod-network.8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" Workload="localhost-k8s-csi--node--driver--9wzpb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003bf510), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-9wzpb", "timestamp":"2025-09-09 23:48:39.835907252 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:39.836 [INFO][4797] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:39.900 [INFO][4797] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:39.901 [INFO][4797] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:39.943 [INFO][4797] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" host="localhost" Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:39.952 [INFO][4797] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:39.959 [INFO][4797] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:40.004 [INFO][4797] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:40.007 [INFO][4797] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:40.007 [INFO][4797] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" host="localhost" Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:40.009 [INFO][4797] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0 Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:40.017 [INFO][4797] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" host="localhost" Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:40.033 [INFO][4797] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" host="localhost" Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:40.033 [INFO][4797] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" host="localhost" Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:40.033 [INFO][4797] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:48:40.061669 containerd[1495]: 2025-09-09 23:48:40.033 [INFO][4797] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" HandleID="k8s-pod-network.8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" Workload="localhost-k8s-csi--node--driver--9wzpb-eth0" Sep 9 23:48:40.062233 containerd[1495]: 2025-09-09 23:48:40.036 [INFO][4775] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" Namespace="calico-system" Pod="csi-node-driver-9wzpb" WorkloadEndpoint="localhost-k8s-csi--node--driver--9wzpb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--9wzpb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"22c99b86-d4b0-412a-a0e1-8a13f3d0c130", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-9wzpb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7794bd863a9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:40.062233 containerd[1495]: 2025-09-09 23:48:40.036 [INFO][4775] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" Namespace="calico-system" Pod="csi-node-driver-9wzpb" WorkloadEndpoint="localhost-k8s-csi--node--driver--9wzpb-eth0" Sep 9 23:48:40.062233 containerd[1495]: 2025-09-09 23:48:40.037 [INFO][4775] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7794bd863a9 ContainerID="8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" Namespace="calico-system" Pod="csi-node-driver-9wzpb" WorkloadEndpoint="localhost-k8s-csi--node--driver--9wzpb-eth0" Sep 9 23:48:40.062233 containerd[1495]: 2025-09-09 23:48:40.038 [INFO][4775] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" Namespace="calico-system" Pod="csi-node-driver-9wzpb" WorkloadEndpoint="localhost-k8s-csi--node--driver--9wzpb-eth0" Sep 9 23:48:40.062233 containerd[1495]: 2025-09-09 23:48:40.039 [INFO][4775] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" 
Namespace="calico-system" Pod="csi-node-driver-9wzpb" WorkloadEndpoint="localhost-k8s-csi--node--driver--9wzpb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--9wzpb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"22c99b86-d4b0-412a-a0e1-8a13f3d0c130", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 48, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0", Pod:"csi-node-driver-9wzpb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7794bd863a9", MAC:"2e:a6:b5:47:06:e1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:48:40.062233 containerd[1495]: 2025-09-09 23:48:40.054 [INFO][4775] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" Namespace="calico-system" Pod="csi-node-driver-9wzpb" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--9wzpb-eth0" Sep 9 23:48:40.077980 systemd[1]: Started cri-containerd-128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b.scope - libcontainer container 128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b. Sep 9 23:48:40.089568 containerd[1495]: time="2025-09-09T23:48:40.089523744Z" level=info msg="connecting to shim 8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0" address="unix:///run/containerd/s/63abc4756e62ed94ab1800124a239f23cedf85a1d2b729119b5acc5d1aa59c7e" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:48:40.099025 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 23:48:40.132194 systemd[1]: Started cri-containerd-8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0.scope - libcontainer container 8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0. Sep 9 23:48:40.150373 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 23:48:40.308876 containerd[1495]: time="2025-09-09T23:48:40.308592934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59f49d6c46-t45hn,Uid:6cf560ed-9d34-4085-8953-98c1c1f9b412,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b\"" Sep 9 23:48:40.334451 containerd[1495]: time="2025-09-09T23:48:40.334375229Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9wzpb,Uid:22c99b86-d4b0-412a-a0e1-8a13f3d0c130,Namespace:calico-system,Attempt:0,} returns sandbox id \"8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0\"" Sep 9 23:48:40.401401 containerd[1495]: time="2025-09-09T23:48:40.401313189Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 
23:48:40.408783 containerd[1495]: time="2025-09-09T23:48:40.408732840Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 9 23:48:40.416905 containerd[1495]: time="2025-09-09T23:48:40.416861504Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:40.429813 containerd[1495]: time="2025-09-09T23:48:40.429772594Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:40.430597 containerd[1495]: time="2025-09-09T23:48:40.430555616Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 2.493456803s" Sep 9 23:48:40.430654 containerd[1495]: time="2025-09-09T23:48:40.430598342Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 9 23:48:40.431839 containerd[1495]: time="2025-09-09T23:48:40.431795699Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 23:48:40.435381 containerd[1495]: time="2025-09-09T23:48:40.435309559Z" level=info msg="CreateContainer within sandbox \"5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 23:48:40.509399 containerd[1495]: time="2025-09-09T23:48:40.509345768Z" level=info msg="Container 
4e5198a9bfe6bbed6753d10a06f87775a2c8b314b8c2b196e855c8800e458de0: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:48:40.547918 containerd[1495]: time="2025-09-09T23:48:40.547875891Z" level=info msg="CreateContainer within sandbox \"5047cc678bef1f9d8b91a71efffac2568bd39a370b9dd2dbf264e02f011ca97d\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"4e5198a9bfe6bbed6753d10a06f87775a2c8b314b8c2b196e855c8800e458de0\"" Sep 9 23:48:40.548527 containerd[1495]: time="2025-09-09T23:48:40.548475449Z" level=info msg="StartContainer for \"4e5198a9bfe6bbed6753d10a06f87775a2c8b314b8c2b196e855c8800e458de0\"" Sep 9 23:48:40.549756 containerd[1495]: time="2025-09-09T23:48:40.549732334Z" level=info msg="connecting to shim 4e5198a9bfe6bbed6753d10a06f87775a2c8b314b8c2b196e855c8800e458de0" address="unix:///run/containerd/s/900b76a20109b32c925dc614926a5e011922564d93ac80c643adb272379362e0" protocol=ttrpc version=3 Sep 9 23:48:40.579133 systemd[1]: Started cri-containerd-4e5198a9bfe6bbed6753d10a06f87775a2c8b314b8c2b196e855c8800e458de0.scope - libcontainer container 4e5198a9bfe6bbed6753d10a06f87775a2c8b314b8c2b196e855c8800e458de0. 
Sep 9 23:48:40.628247 containerd[1495]: time="2025-09-09T23:48:40.628198323Z" level=info msg="StartContainer for \"4e5198a9bfe6bbed6753d10a06f87775a2c8b314b8c2b196e855c8800e458de0\" returns successfully" Sep 9 23:48:40.792974 systemd-networkd[1418]: calib7a4af29618: Gained IPv6LL Sep 9 23:48:41.047552 containerd[1495]: time="2025-09-09T23:48:41.047372614Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e5198a9bfe6bbed6753d10a06f87775a2c8b314b8c2b196e855c8800e458de0\" id:\"fdd3a9569a237a52927fcfe69acefec7dd1910f293c6da6e1eade3c13814a50f\" pid:4986 exit_status:1 exited_at:{seconds:1757461721 nanos:47026769}" Sep 9 23:48:41.113977 systemd-networkd[1418]: cali7794bd863a9: Gained IPv6LL Sep 9 23:48:41.881000 systemd-networkd[1418]: cali5e6e8980b5a: Gained IPv6LL Sep 9 23:48:42.049585 containerd[1495]: time="2025-09-09T23:48:42.049540358Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e5198a9bfe6bbed6753d10a06f87775a2c8b314b8c2b196e855c8800e458de0\" id:\"c88d797254604760402d8ee13f8017f9db37527c3b554ebe5c10f21b74e01cd8\" pid:5040 exit_status:1 exited_at:{seconds:1757461722 nanos:49184153}" Sep 9 23:48:43.068941 containerd[1495]: time="2025-09-09T23:48:43.068812138Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:43.078782 containerd[1495]: time="2025-09-09T23:48:43.078727316Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 9 23:48:43.093605 containerd[1495]: time="2025-09-09T23:48:43.093526015Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:43.105705 containerd[1495]: time="2025-09-09T23:48:43.105651985Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:43.106869 containerd[1495]: time="2025-09-09T23:48:43.106796965Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 2.674968382s" Sep 9 23:48:43.107059 containerd[1495]: time="2025-09-09T23:48:43.106961186Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 9 23:48:43.107986 containerd[1495]: time="2025-09-09T23:48:43.107952747Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 23:48:43.109394 containerd[1495]: time="2025-09-09T23:48:43.109353079Z" level=info msg="CreateContainer within sandbox \"ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 23:48:43.178720 containerd[1495]: time="2025-09-09T23:48:43.178677598Z" level=info msg="Container 8b696620dd04fb4cf58414682e5c46cacb07c9ca3876f990c19349bf4e103990: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:48:43.229291 containerd[1495]: time="2025-09-09T23:48:43.229235210Z" level=info msg="CreateContainer within sandbox \"ed5d47bbccb1fcbca32556a0fd277d84f6ae5af4074eecd9c118f0c89bcbbb36\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8b696620dd04fb4cf58414682e5c46cacb07c9ca3876f990c19349bf4e103990\"" Sep 9 23:48:43.229844 containerd[1495]: time="2025-09-09T23:48:43.229797159Z" level=info msg="StartContainer for 
\"8b696620dd04fb4cf58414682e5c46cacb07c9ca3876f990c19349bf4e103990\"" Sep 9 23:48:43.231744 containerd[1495]: time="2025-09-09T23:48:43.231719075Z" level=info msg="connecting to shim 8b696620dd04fb4cf58414682e5c46cacb07c9ca3876f990c19349bf4e103990" address="unix:///run/containerd/s/dffc07f8f05822e1fc5a84d07c988ed2ae991fcb8630661a5639586981402003" protocol=ttrpc version=3 Sep 9 23:48:43.265065 systemd[1]: Started cri-containerd-8b696620dd04fb4cf58414682e5c46cacb07c9ca3876f990c19349bf4e103990.scope - libcontainer container 8b696620dd04fb4cf58414682e5c46cacb07c9ca3876f990c19349bf4e103990. Sep 9 23:48:43.338170 containerd[1495]: time="2025-09-09T23:48:43.338120790Z" level=info msg="StartContainer for \"8b696620dd04fb4cf58414682e5c46cacb07c9ca3876f990c19349bf4e103990\" returns successfully" Sep 9 23:48:43.409094 containerd[1495]: time="2025-09-09T23:48:43.409034464Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:48:43.412182 containerd[1495]: time="2025-09-09T23:48:43.412131004Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 23:48:43.414208 containerd[1495]: time="2025-09-09T23:48:43.414169735Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 306.173822ms" Sep 9 23:48:43.414208 containerd[1495]: time="2025-09-09T23:48:43.414210900Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 9 23:48:43.415197 containerd[1495]: time="2025-09-09T23:48:43.415149095Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 23:48:43.417128 containerd[1495]: time="2025-09-09T23:48:43.417097614Z" level=info msg="CreateContainer within sandbox \"128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 23:48:43.431856 containerd[1495]: time="2025-09-09T23:48:43.431441057Z" level=info msg="Container b4b875fdc7e8b4b8df4b3dbf35b1799ddb4cfd319b123bca38577b3f6ed2f187: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:48:43.461806 containerd[1495]: time="2025-09-09T23:48:43.461743740Z" level=info msg="CreateContainer within sandbox \"128d944eb73f45320aaa02cd54c04424ecafcd72c09b2ecda97f2332641af85b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b4b875fdc7e8b4b8df4b3dbf35b1799ddb4cfd319b123bca38577b3f6ed2f187\"" Sep 9 23:48:43.462371 containerd[1495]: time="2025-09-09T23:48:43.462328772Z" level=info msg="StartContainer for \"b4b875fdc7e8b4b8df4b3dbf35b1799ddb4cfd319b123bca38577b3f6ed2f187\"" Sep 9 23:48:43.463508 containerd[1495]: time="2025-09-09T23:48:43.463477313Z" level=info msg="connecting to shim b4b875fdc7e8b4b8df4b3dbf35b1799ddb4cfd319b123bca38577b3f6ed2f187" address="unix:///run/containerd/s/eba9d0fea44ebbbbcd9aed8d7f3d6707f008d7c607924e5b63a20caa77d00d1f" protocol=ttrpc version=3 Sep 9 23:48:43.489044 systemd[1]: Started cri-containerd-b4b875fdc7e8b4b8df4b3dbf35b1799ddb4cfd319b123bca38577b3f6ed2f187.scope - libcontainer container b4b875fdc7e8b4b8df4b3dbf35b1799ddb4cfd319b123bca38577b3f6ed2f187. 
Sep 9 23:48:43.533380 containerd[1495]: time="2025-09-09T23:48:43.533271850Z" level=info msg="StartContainer for \"b4b875fdc7e8b4b8df4b3dbf35b1799ddb4cfd319b123bca38577b3f6ed2f187\" returns successfully" Sep 9 23:48:43.643981 kubelet[2636]: I0909 23:48:43.643437 2636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 23:48:43.719221 containerd[1495]: time="2025-09-09T23:48:43.719067480Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2ed11ab839a77dc8705fedaf74ed5f5abce6d53815bdbc635a11f41ea13c95f5\" id:\"6f3d021e2004c05e69d8fe2154ec9523ab28e0e5e8f42edcce170fd70b415863\" pid:5166 exit_status:1 exited_at:{seconds:1757461723 nanos:718734679}" Sep 9 23:48:43.807596 containerd[1495]: time="2025-09-09T23:48:43.807553353Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2ed11ab839a77dc8705fedaf74ed5f5abce6d53815bdbc635a11f41ea13c95f5\" id:\"c69eacc10f0d3bb546d2f7e9d6bd6a9d5e69d5ae5cd887dd6a0b44d65f96d796\" pid:5192 exit_status:1 exited_at:{seconds:1757461723 nanos:806356966}" Sep 9 23:48:43.847534 systemd[1]: Started sshd@8-10.0.0.67:22-10.0.0.1:33522.service - OpenSSH per-connection server daemon (10.0.0.1:33522). 
Sep 9 23:48:43.991518 kubelet[2636]: I0909 23:48:43.991098 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-8cmhs" podStartSLOduration=25.651307694 podStartE2EDuration="28.991082184s" podCreationTimestamp="2025-09-09 23:48:15 +0000 UTC" firstStartedPulling="2025-09-09 23:48:37.09187827 +0000 UTC m=+41.481182010" lastFinishedPulling="2025-09-09 23:48:40.43165276 +0000 UTC m=+44.820956500" observedRunningTime="2025-09-09 23:48:40.977453712 +0000 UTC m=+45.366757452" watchObservedRunningTime="2025-09-09 23:48:43.991082184 +0000 UTC m=+48.380385884" Sep 9 23:48:43.992975 kubelet[2636]: I0909 23:48:43.992590 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-59f49d6c46-t45hn" podStartSLOduration=29.887735111 podStartE2EDuration="32.992578208s" podCreationTimestamp="2025-09-09 23:48:11 +0000 UTC" firstStartedPulling="2025-09-09 23:48:40.310154059 +0000 UTC m=+44.699457799" lastFinishedPulling="2025-09-09 23:48:43.414997156 +0000 UTC m=+47.804300896" observedRunningTime="2025-09-09 23:48:43.992422469 +0000 UTC m=+48.381726209" watchObservedRunningTime="2025-09-09 23:48:43.992578208 +0000 UTC m=+48.381881948" Sep 9 23:48:44.004705 sshd[5219]: Accepted publickey for core from 10.0.0.1 port 33522 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY Sep 9 23:48:44.009060 sshd-session[5219]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:48:44.016508 systemd-logind[1477]: New session 9 of user core. Sep 9 23:48:44.024029 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 9 23:48:44.030776 kubelet[2636]: I0909 23:48:44.030219 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-59f49d6c46-wrgjk" podStartSLOduration=29.223917327 podStartE2EDuration="33.030199962s" podCreationTimestamp="2025-09-09 23:48:11 +0000 UTC" firstStartedPulling="2025-09-09 23:48:39.301508573 +0000 UTC m=+43.690812313" lastFinishedPulling="2025-09-09 23:48:43.107791208 +0000 UTC m=+47.497094948" observedRunningTime="2025-09-09 23:48:44.030008299 +0000 UTC m=+48.419312079" watchObservedRunningTime="2025-09-09 23:48:44.030199962 +0000 UTC m=+48.419503702" Sep 9 23:48:44.242924 sshd[5235]: Connection closed by 10.0.0.1 port 33522 Sep 9 23:48:44.243223 sshd-session[5219]: pam_unix(sshd:session): session closed for user core Sep 9 23:48:44.249202 systemd[1]: sshd@8-10.0.0.67:22-10.0.0.1:33522.service: Deactivated successfully. Sep 9 23:48:44.251199 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 23:48:44.253198 systemd-logind[1477]: Session 9 logged out. Waiting for processes to exit. Sep 9 23:48:44.254273 systemd-logind[1477]: Removed session 9. 
Sep 9 23:48:44.970548 kubelet[2636]: I0909 23:48:44.970485 2636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 23:48:44.982749 kubelet[2636]: I0909 23:48:44.982450 2636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 23:48:45.339760 containerd[1495]: time="2025-09-09T23:48:45.339708383Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:45.342516 containerd[1495]: time="2025-09-09T23:48:45.342433746Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489"
Sep 9 23:48:45.346280 containerd[1495]: time="2025-09-09T23:48:45.346231675Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:45.354890 containerd[1495]: time="2025-09-09T23:48:45.354823492Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:45.356236 containerd[1495]: time="2025-09-09T23:48:45.356187694Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.941004435s"
Sep 9 23:48:45.356236 containerd[1495]: time="2025-09-09T23:48:45.356224178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\""
Sep 9 23:48:45.360960 containerd[1495]: time="2025-09-09T23:48:45.360913373Z" level=info msg="CreateContainer within sandbox \"8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Sep 9 23:48:45.414345 containerd[1495]: time="2025-09-09T23:48:45.414230483Z" level=info msg="Container e8752d1fb9810cb413fa6a34226331b394aa82f3ed56abfb73a1b7f6dd6187ec: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:48:45.417486 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2963803580.mount: Deactivated successfully.
Sep 9 23:48:45.453759 containerd[1495]: time="2025-09-09T23:48:45.453694273Z" level=info msg="CreateContainer within sandbox \"8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"e8752d1fb9810cb413fa6a34226331b394aa82f3ed56abfb73a1b7f6dd6187ec\""
Sep 9 23:48:45.455401 containerd[1495]: time="2025-09-09T23:48:45.455356510Z" level=info msg="StartContainer for \"e8752d1fb9810cb413fa6a34226331b394aa82f3ed56abfb73a1b7f6dd6187ec\""
Sep 9 23:48:45.456999 containerd[1495]: time="2025-09-09T23:48:45.456964620Z" level=info msg="connecting to shim e8752d1fb9810cb413fa6a34226331b394aa82f3ed56abfb73a1b7f6dd6187ec" address="unix:///run/containerd/s/63abc4756e62ed94ab1800124a239f23cedf85a1d2b729119b5acc5d1aa59c7e" protocol=ttrpc version=3
Sep 9 23:48:45.488037 systemd[1]: Started cri-containerd-e8752d1fb9810cb413fa6a34226331b394aa82f3ed56abfb73a1b7f6dd6187ec.scope - libcontainer container e8752d1fb9810cb413fa6a34226331b394aa82f3ed56abfb73a1b7f6dd6187ec.
Sep 9 23:48:45.552865 containerd[1495]: time="2025-09-09T23:48:45.552729553Z" level=info msg="StartContainer for \"e8752d1fb9810cb413fa6a34226331b394aa82f3ed56abfb73a1b7f6dd6187ec\" returns successfully"
Sep 9 23:48:45.554929 containerd[1495]: time="2025-09-09T23:48:45.554889409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 9 23:48:45.982854 kubelet[2636]: I0909 23:48:45.982495 2636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 23:48:46.142559 systemd-networkd[1418]: vxlan.calico: Link UP
Sep 9 23:48:46.142567 systemd-networkd[1418]: vxlan.calico: Gained carrier
Sep 9 23:48:46.934924 containerd[1495]: time="2025-09-09T23:48:46.934863798Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:46.949297 containerd[1495]: time="2025-09-09T23:48:46.949246871Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208"
Sep 9 23:48:46.964130 containerd[1495]: time="2025-09-09T23:48:46.964078956Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:46.981106 containerd[1495]: time="2025-09-09T23:48:46.981057810Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:48:46.981819 containerd[1495]: time="2025-09-09T23:48:46.981790935Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.426860763s"
Sep 9 23:48:46.981901 containerd[1495]: time="2025-09-09T23:48:46.981826140Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\""
Sep 9 23:48:46.984385 containerd[1495]: time="2025-09-09T23:48:46.984343632Z" level=info msg="CreateContainer within sandbox \"8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 9 23:48:47.017946 containerd[1495]: time="2025-09-09T23:48:47.017152296Z" level=info msg="Container a05bb4edbeabb483828373099549b45a57e83397738deaf79434dd8658b41000: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:48:47.020201 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1746383953.mount: Deactivated successfully.
Sep 9 23:48:47.037461 containerd[1495]: time="2025-09-09T23:48:47.037395131Z" level=info msg="CreateContainer within sandbox \"8b8d678c5788bb5bf41e15713ec10bba204b49e556a75a97a248a585289470a0\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"a05bb4edbeabb483828373099549b45a57e83397738deaf79434dd8658b41000\""
Sep 9 23:48:47.038135 containerd[1495]: time="2025-09-09T23:48:47.038083170Z" level=info msg="StartContainer for \"a05bb4edbeabb483828373099549b45a57e83397738deaf79434dd8658b41000\""
Sep 9 23:48:47.039828 containerd[1495]: time="2025-09-09T23:48:47.039763762Z" level=info msg="connecting to shim a05bb4edbeabb483828373099549b45a57e83397738deaf79434dd8658b41000" address="unix:///run/containerd/s/63abc4756e62ed94ab1800124a239f23cedf85a1d2b729119b5acc5d1aa59c7e" protocol=ttrpc version=3
Sep 9 23:48:47.064059 systemd[1]: Started cri-containerd-a05bb4edbeabb483828373099549b45a57e83397738deaf79434dd8658b41000.scope - libcontainer container a05bb4edbeabb483828373099549b45a57e83397738deaf79434dd8658b41000.
Sep 9 23:48:47.187866 containerd[1495]: time="2025-09-09T23:48:47.187384684Z" level=info msg="StartContainer for \"a05bb4edbeabb483828373099549b45a57e83397738deaf79434dd8658b41000\" returns successfully"
Sep 9 23:48:47.193028 systemd-networkd[1418]: vxlan.calico: Gained IPv6LL
Sep 9 23:48:47.774200 kubelet[2636]: I0909 23:48:47.774134 2636 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 9 23:48:47.776686 kubelet[2636]: I0909 23:48:47.776632 2636 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 9 23:48:49.268019 systemd[1]: Started sshd@9-10.0.0.67:22-10.0.0.1:33538.service - OpenSSH per-connection server daemon (10.0.0.1:33538).
Sep 9 23:48:49.347524 sshd[5508]: Accepted publickey for core from 10.0.0.1 port 33538 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:48:49.349393 sshd-session[5508]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:48:49.354801 systemd-logind[1477]: New session 10 of user core.
Sep 9 23:48:49.369053 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 9 23:48:49.647323 sshd[5511]: Connection closed by 10.0.0.1 port 33538
Sep 9 23:48:49.647566 sshd-session[5508]: pam_unix(sshd:session): session closed for user core
Sep 9 23:48:49.656652 systemd[1]: sshd@9-10.0.0.67:22-10.0.0.1:33538.service: Deactivated successfully.
Sep 9 23:48:49.659090 systemd[1]: session-10.scope: Deactivated successfully.
Sep 9 23:48:49.663235 systemd-logind[1477]: Session 10 logged out. Waiting for processes to exit.
Sep 9 23:48:49.667230 systemd[1]: Started sshd@10-10.0.0.67:22-10.0.0.1:33540.service - OpenSSH per-connection server daemon (10.0.0.1:33540).
Sep 9 23:48:49.667854 systemd-logind[1477]: Removed session 10.
Sep 9 23:48:49.746235 sshd[5525]: Accepted publickey for core from 10.0.0.1 port 33540 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:48:49.747991 sshd-session[5525]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:48:49.759286 systemd-logind[1477]: New session 11 of user core.
Sep 9 23:48:49.773110 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 9 23:48:50.112760 sshd[5528]: Connection closed by 10.0.0.1 port 33540
Sep 9 23:48:50.113375 sshd-session[5525]: pam_unix(sshd:session): session closed for user core
Sep 9 23:48:50.127507 systemd[1]: sshd@10-10.0.0.67:22-10.0.0.1:33540.service: Deactivated successfully.
Sep 9 23:48:50.131092 systemd[1]: session-11.scope: Deactivated successfully.
Sep 9 23:48:50.135061 systemd-logind[1477]: Session 11 logged out. Waiting for processes to exit.
Sep 9 23:48:50.139313 systemd[1]: Started sshd@11-10.0.0.67:22-10.0.0.1:39824.service - OpenSSH per-connection server daemon (10.0.0.1:39824).
Sep 9 23:48:50.141124 systemd-logind[1477]: Removed session 11.
Sep 9 23:48:50.205198 sshd[5539]: Accepted publickey for core from 10.0.0.1 port 39824 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:48:50.207744 sshd-session[5539]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:48:50.212695 systemd-logind[1477]: New session 12 of user core.
Sep 9 23:48:50.225094 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 9 23:48:50.424151 sshd[5542]: Connection closed by 10.0.0.1 port 39824
Sep 9 23:48:50.424458 sshd-session[5539]: pam_unix(sshd:session): session closed for user core
Sep 9 23:48:50.428726 systemd[1]: sshd@11-10.0.0.67:22-10.0.0.1:39824.service: Deactivated successfully.
Sep 9 23:48:50.432709 systemd[1]: session-12.scope: Deactivated successfully.
Sep 9 23:48:50.434090 systemd-logind[1477]: Session 12 logged out. Waiting for processes to exit.
Sep 9 23:48:50.436942 systemd-logind[1477]: Removed session 12.
Sep 9 23:48:55.437683 systemd[1]: Started sshd@12-10.0.0.67:22-10.0.0.1:39830.service - OpenSSH per-connection server daemon (10.0.0.1:39830).
Sep 9 23:48:55.496573 sshd[5568]: Accepted publickey for core from 10.0.0.1 port 39830 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:48:55.498087 sshd-session[5568]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:48:55.503272 systemd-logind[1477]: New session 13 of user core.
Sep 9 23:48:55.511054 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 9 23:48:55.688739 sshd[5571]: Connection closed by 10.0.0.1 port 39830
Sep 9 23:48:55.689011 sshd-session[5568]: pam_unix(sshd:session): session closed for user core
Sep 9 23:48:55.694665 systemd-logind[1477]: Session 13 logged out. Waiting for processes to exit.
Sep 9 23:48:55.695128 systemd[1]: sshd@12-10.0.0.67:22-10.0.0.1:39830.service: Deactivated successfully.
Sep 9 23:48:55.697685 systemd[1]: session-13.scope: Deactivated successfully.
Sep 9 23:48:55.705786 systemd-logind[1477]: Removed session 13.
Sep 9 23:49:00.712445 systemd[1]: Started sshd@13-10.0.0.67:22-10.0.0.1:50436.service - OpenSSH per-connection server daemon (10.0.0.1:50436).
Sep 9 23:49:00.807895 sshd[5596]: Accepted publickey for core from 10.0.0.1 port 50436 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:49:00.809533 sshd-session[5596]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:49:00.818776 systemd-logind[1477]: New session 14 of user core.
Sep 9 23:49:00.827082 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 9 23:49:01.004106 sshd[5599]: Connection closed by 10.0.0.1 port 50436
Sep 9 23:49:01.004670 sshd-session[5596]: pam_unix(sshd:session): session closed for user core
Sep 9 23:49:01.009121 systemd[1]: sshd@13-10.0.0.67:22-10.0.0.1:50436.service: Deactivated successfully.
Sep 9 23:49:01.013912 systemd[1]: session-14.scope: Deactivated successfully.
Sep 9 23:49:01.015202 systemd-logind[1477]: Session 14 logged out. Waiting for processes to exit.
Sep 9 23:49:01.016884 systemd-logind[1477]: Removed session 14.
Sep 9 23:49:04.557730 kubelet[2636]: I0909 23:49:04.557514 2636 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 23:49:04.593741 kubelet[2636]: I0909 23:49:04.593670 2636 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-9wzpb" podStartSLOduration=41.947799895 podStartE2EDuration="48.593652986s" podCreationTimestamp="2025-09-09 23:48:16 +0000 UTC" firstStartedPulling="2025-09-09 23:48:40.336693132 +0000 UTC m=+44.725996872" lastFinishedPulling="2025-09-09 23:48:46.982546223 +0000 UTC m=+51.371849963" observedRunningTime="2025-09-09 23:48:48.014959458 +0000 UTC m=+52.404263198" watchObservedRunningTime="2025-09-09 23:49:04.593652986 +0000 UTC m=+68.982956726"
Sep 9 23:49:06.023086 systemd[1]: Started sshd@14-10.0.0.67:22-10.0.0.1:50438.service - OpenSSH per-connection server daemon (10.0.0.1:50438).
Sep 9 23:49:06.099096 sshd[5619]: Accepted publickey for core from 10.0.0.1 port 50438 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:49:06.100827 sshd-session[5619]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:49:06.106727 systemd-logind[1477]: New session 15 of user core.
Sep 9 23:49:06.117021 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 9 23:49:06.275165 sshd[5622]: Connection closed by 10.0.0.1 port 50438
Sep 9 23:49:06.276067 sshd-session[5619]: pam_unix(sshd:session): session closed for user core
Sep 9 23:49:06.285435 systemd[1]: sshd@14-10.0.0.67:22-10.0.0.1:50438.service: Deactivated successfully.
Sep 9 23:49:06.287232 systemd[1]: session-15.scope: Deactivated successfully.
Sep 9 23:49:06.288755 systemd-logind[1477]: Session 15 logged out. Waiting for processes to exit.
Sep 9 23:49:06.292131 systemd[1]: Started sshd@15-10.0.0.67:22-10.0.0.1:50442.service - OpenSSH per-connection server daemon (10.0.0.1:50442).
Sep 9 23:49:06.292965 systemd-logind[1477]: Removed session 15.
Sep 9 23:49:06.360052 sshd[5635]: Accepted publickey for core from 10.0.0.1 port 50442 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:49:06.361693 sshd-session[5635]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:49:06.367717 systemd-logind[1477]: New session 16 of user core.
Sep 9 23:49:06.375092 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 9 23:49:06.631189 sshd[5638]: Connection closed by 10.0.0.1 port 50442
Sep 9 23:49:06.631548 sshd-session[5635]: pam_unix(sshd:session): session closed for user core
Sep 9 23:49:06.643508 systemd[1]: sshd@15-10.0.0.67:22-10.0.0.1:50442.service: Deactivated successfully.
Sep 9 23:49:06.647193 systemd[1]: session-16.scope: Deactivated successfully.
Sep 9 23:49:06.648664 systemd-logind[1477]: Session 16 logged out. Waiting for processes to exit.
Sep 9 23:49:06.650844 systemd-logind[1477]: Removed session 16.
Sep 9 23:49:06.653228 systemd[1]: Started sshd@16-10.0.0.67:22-10.0.0.1:50448.service - OpenSSH per-connection server daemon (10.0.0.1:50448).
Sep 9 23:49:06.717575 sshd[5657]: Accepted publickey for core from 10.0.0.1 port 50448 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:49:06.718999 sshd-session[5657]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:49:06.724232 systemd-logind[1477]: New session 17 of user core.
Sep 9 23:49:06.732044 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 9 23:49:07.423506 sshd[5660]: Connection closed by 10.0.0.1 port 50448
Sep 9 23:49:07.423990 sshd-session[5657]: pam_unix(sshd:session): session closed for user core
Sep 9 23:49:07.434686 systemd[1]: sshd@16-10.0.0.67:22-10.0.0.1:50448.service: Deactivated successfully.
Sep 9 23:49:07.440765 systemd[1]: session-17.scope: Deactivated successfully.
Sep 9 23:49:07.442089 systemd-logind[1477]: Session 17 logged out. Waiting for processes to exit.
Sep 9 23:49:07.444257 systemd-logind[1477]: Removed session 17.
Sep 9 23:49:07.447516 systemd[1]: Started sshd@17-10.0.0.67:22-10.0.0.1:50458.service - OpenSSH per-connection server daemon (10.0.0.1:50458).
Sep 9 23:49:07.512059 sshd[5681]: Accepted publickey for core from 10.0.0.1 port 50458 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:49:07.513449 sshd-session[5681]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:49:07.518649 systemd-logind[1477]: New session 18 of user core.
Sep 9 23:49:07.533049 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 23:49:07.906506 sshd[5684]: Connection closed by 10.0.0.1 port 50458
Sep 9 23:49:07.907732 sshd-session[5681]: pam_unix(sshd:session): session closed for user core
Sep 9 23:49:07.917219 systemd[1]: sshd@17-10.0.0.67:22-10.0.0.1:50458.service: Deactivated successfully.
Sep 9 23:49:07.920683 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 23:49:07.921656 systemd-logind[1477]: Session 18 logged out. Waiting for processes to exit.
Sep 9 23:49:07.926985 systemd[1]: Started sshd@18-10.0.0.67:22-10.0.0.1:50474.service - OpenSSH per-connection server daemon (10.0.0.1:50474).
Sep 9 23:49:07.932239 systemd-logind[1477]: Removed session 18.
Sep 9 23:49:07.994124 sshd[5696]: Accepted publickey for core from 10.0.0.1 port 50474 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:49:07.995576 sshd-session[5696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:49:08.000614 systemd-logind[1477]: New session 19 of user core.
Sep 9 23:49:08.007081 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 23:49:08.158629 sshd[5699]: Connection closed by 10.0.0.1 port 50474
Sep 9 23:49:08.158415 sshd-session[5696]: pam_unix(sshd:session): session closed for user core
Sep 9 23:49:08.162977 systemd[1]: sshd@18-10.0.0.67:22-10.0.0.1:50474.service: Deactivated successfully.
Sep 9 23:49:08.164748 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 23:49:08.167183 systemd-logind[1477]: Session 19 logged out. Waiting for processes to exit.
Sep 9 23:49:08.168733 systemd-logind[1477]: Removed session 19.
Sep 9 23:49:08.998112 containerd[1495]: time="2025-09-09T23:49:08.997997659Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0b75b8625a8d162f815d4cea523aadba3d1657208f4aec64a9cb3492b401e194\" id:\"c7a2bb089c1126f5058b5ed8444662f70f30983178dd45afcce60aff68096dd9\" pid:5724 exited_at:{seconds:1757461748 nanos:997586688}"
Sep 9 23:49:12.054001 containerd[1495]: time="2025-09-09T23:49:12.053956184Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e5198a9bfe6bbed6753d10a06f87775a2c8b314b8c2b196e855c8800e458de0\" id:\"6f00ca532cb3baaea9434b9a210a05334d2e2959f7b37d0d153c8d67e0526a6d\" pid:5746 exited_at:{seconds:1757461752 nanos:53179304}"
Sep 9 23:49:12.499953 containerd[1495]: time="2025-09-09T23:49:12.499770815Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0b75b8625a8d162f815d4cea523aadba3d1657208f4aec64a9cb3492b401e194\" id:\"7ca3c7b5b67108879ec2c140717b146700ce907afdb62be5ff598ff63974eefb\" pid:5772 exited_at:{seconds:1757461752 nanos:499522988}"
Sep 9 23:49:13.171911 systemd[1]: Started sshd@19-10.0.0.67:22-10.0.0.1:53096.service - OpenSSH per-connection server daemon (10.0.0.1:53096).
Sep 9 23:49:13.236064 sshd[5784]: Accepted publickey for core from 10.0.0.1 port 53096 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:49:13.237480 sshd-session[5784]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:49:13.242169 systemd-logind[1477]: New session 20 of user core.
Sep 9 23:49:13.255077 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 23:49:13.394402 sshd[5787]: Connection closed by 10.0.0.1 port 53096
Sep 9 23:49:13.395278 sshd-session[5784]: pam_unix(sshd:session): session closed for user core
Sep 9 23:49:13.399558 systemd[1]: sshd@19-10.0.0.67:22-10.0.0.1:53096.service: Deactivated successfully.
Sep 9 23:49:13.403631 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 23:49:13.405483 systemd-logind[1477]: Session 20 logged out. Waiting for processes to exit.
Sep 9 23:49:13.407322 systemd-logind[1477]: Removed session 20.
Sep 9 23:49:13.829478 containerd[1495]: time="2025-09-09T23:49:13.829429136Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2ed11ab839a77dc8705fedaf74ed5f5abce6d53815bdbc635a11f41ea13c95f5\" id:\"edd52cd65fdea1841eb7b286d6a4964cf535437d5807c085ac2188fd5594c3f6\" pid:5812 exited_at:{seconds:1757461753 nanos:828727251}"
Sep 9 23:49:17.739640 containerd[1495]: time="2025-09-09T23:49:17.739580296Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4e5198a9bfe6bbed6753d10a06f87775a2c8b314b8c2b196e855c8800e458de0\" id:\"f834a9b7b34543bfdd7230c5d3d24d4ffc2ca3071285d4dbc37b3bdbdf22ac4a\" pid:5837 exited_at:{seconds:1757461757 nanos:739295426}"
Sep 9 23:49:18.411902 systemd[1]: Started sshd@20-10.0.0.67:22-10.0.0.1:53106.service - OpenSSH per-connection server daemon (10.0.0.1:53106).
Sep 9 23:49:18.495852 sshd[5849]: Accepted publickey for core from 10.0.0.1 port 53106 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:49:18.497705 sshd-session[5849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:49:18.502978 systemd-logind[1477]: New session 21 of user core.
Sep 9 23:49:18.512046 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 9 23:49:18.764661 sshd[5852]: Connection closed by 10.0.0.1 port 53106
Sep 9 23:49:18.765075 sshd-session[5849]: pam_unix(sshd:session): session closed for user core
Sep 9 23:49:18.769038 systemd-logind[1477]: Session 21 logged out. Waiting for processes to exit.
Sep 9 23:49:18.769345 systemd[1]: sshd@20-10.0.0.67:22-10.0.0.1:53106.service: Deactivated successfully.
Sep 9 23:49:18.771306 systemd[1]: session-21.scope: Deactivated successfully.
Sep 9 23:49:18.774334 systemd-logind[1477]: Removed session 21.
Sep 9 23:49:23.776903 systemd[1]: Started sshd@21-10.0.0.67:22-10.0.0.1:51732.service - OpenSSH per-connection server daemon (10.0.0.1:51732).
Sep 9 23:49:23.838803 sshd[5865]: Accepted publickey for core from 10.0.0.1 port 51732 ssh2: RSA SHA256:BIipJKfG3sr4zTNTEUz0SDDjJtEzBqbnZB4/ga6/CtY
Sep 9 23:49:23.841583 sshd-session[5865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:49:23.848612 systemd-logind[1477]: New session 22 of user core.
Sep 9 23:49:23.858120 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 9 23:49:24.046189 sshd[5868]: Connection closed by 10.0.0.1 port 51732
Sep 9 23:49:24.046472 sshd-session[5865]: pam_unix(sshd:session): session closed for user core
Sep 9 23:49:24.054241 systemd[1]: sshd@21-10.0.0.67:22-10.0.0.1:51732.service: Deactivated successfully.
Sep 9 23:49:24.056353 systemd[1]: session-22.scope: Deactivated successfully.
Sep 9 23:49:24.057557 systemd-logind[1477]: Session 22 logged out. Waiting for processes to exit.
Sep 9 23:49:24.058926 systemd-logind[1477]: Removed session 22.