Sep 16 04:37:09.775943 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Sep 16 04:37:09.776018 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 16 03:05:48 -00 2025 Sep 16 04:37:09.776046 kernel: KASLR enabled Sep 16 04:37:09.776052 kernel: efi: EFI v2.7 by EDK II Sep 16 04:37:09.776058 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb228018 ACPI 2.0=0xdb9b8018 RNG=0xdb9b8a18 MEMRESERVE=0xdb221f18 Sep 16 04:37:09.776076 kernel: random: crng init done Sep 16 04:37:09.776084 kernel: Kernel is locked down from EFI Secure Boot; see man kernel_lockdown.7 Sep 16 04:37:09.776090 kernel: secureboot: Secure boot enabled Sep 16 04:37:09.776095 kernel: ACPI: Early table checksum verification disabled Sep 16 04:37:09.776104 kernel: ACPI: RSDP 0x00000000DB9B8018 000024 (v02 BOCHS ) Sep 16 04:37:09.776111 kernel: ACPI: XSDT 0x00000000DB9B8F18 000064 (v01 BOCHS BXPC 00000001 01000013) Sep 16 04:37:09.776116 kernel: ACPI: FACP 0x00000000DB9B8B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Sep 16 04:37:09.776122 kernel: ACPI: DSDT 0x00000000DB904018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) Sep 16 04:37:09.776128 kernel: ACPI: APIC 0x00000000DB9B8C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) Sep 16 04:37:09.776135 kernel: ACPI: PPTT 0x00000000DB9B8098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001) Sep 16 04:37:09.776143 kernel: ACPI: GTDT 0x00000000DB9B8818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Sep 16 04:37:09.776149 kernel: ACPI: MCFG 0x00000000DB9B8A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 16 04:37:09.776155 kernel: ACPI: SPCR 0x00000000DB9B8918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Sep 16 04:37:09.776162 kernel: ACPI: DBG2 0x00000000DB9B8998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Sep 16 04:37:09.776168 kernel: ACPI: IORT 0x00000000DB9B8198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Sep 16 04:37:09.776174 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600 Sep 16 04:37:09.776180 kernel: ACPI: Use ACPI SPCR as default console: No Sep 16 04:37:09.776186 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff] Sep 16 04:37:09.776192 kernel: NODE_DATA(0) allocated [mem 0xdc737a00-0xdc73efff] Sep 16 04:37:09.776198 kernel: Zone ranges: Sep 16 04:37:09.776206 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff] Sep 16 04:37:09.776212 kernel: DMA32 empty Sep 16 04:37:09.776218 kernel: Normal empty Sep 16 04:37:09.776224 kernel: Device empty Sep 16 04:37:09.776231 kernel: Movable zone start for each node Sep 16 04:37:09.776237 kernel: Early memory node ranges Sep 16 04:37:09.776243 kernel: node 0: [mem 0x0000000040000000-0x00000000dbb4ffff] Sep 16 04:37:09.776250 kernel: node 0: [mem 0x00000000dbb50000-0x00000000dbe7ffff] Sep 16 04:37:09.776256 kernel: node 0: [mem 0x00000000dbe80000-0x00000000dbe9ffff] Sep 16 04:37:09.776262 kernel: node 0: [mem 0x00000000dbea0000-0x00000000dbedffff] Sep 16 04:37:09.776269 kernel: node 0: [mem 0x00000000dbee0000-0x00000000dbf1ffff] Sep 16 04:37:09.776275 kernel: node 0: [mem 0x00000000dbf20000-0x00000000dbf6ffff] Sep 16 04:37:09.776282 kernel: node 0: [mem 0x00000000dbf70000-0x00000000dcbfffff] Sep 16 04:37:09.776288 kernel: node 0: [mem 0x00000000dcc00000-0x00000000dcfdffff] Sep 16 04:37:09.776294 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff] Sep 16 04:37:09.776303 kernel: Initmem setup node 0 [mem 
0x0000000040000000-0x00000000dcffffff] Sep 16 04:37:09.776310 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges Sep 16 04:37:09.776317 kernel: cma: Reserved 16 MiB at 0x00000000d7a00000 on node -1 Sep 16 04:37:09.776323 kernel: psci: probing for conduit method from ACPI. Sep 16 04:37:09.776352 kernel: psci: PSCIv1.1 detected in firmware. Sep 16 04:37:09.776359 kernel: psci: Using standard PSCI v0.2 function IDs Sep 16 04:37:09.776368 kernel: psci: Trusted OS migration not required Sep 16 04:37:09.776377 kernel: psci: SMC Calling Convention v1.1 Sep 16 04:37:09.776386 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Sep 16 04:37:09.776395 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Sep 16 04:37:09.776403 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Sep 16 04:37:09.776410 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 Sep 16 04:37:09.776417 kernel: Detected PIPT I-cache on CPU0 Sep 16 04:37:09.776426 kernel: CPU features: detected: GIC system register CPU interface Sep 16 04:37:09.776433 kernel: CPU features: detected: Spectre-v4 Sep 16 04:37:09.776439 kernel: CPU features: detected: Spectre-BHB Sep 16 04:37:09.776446 kernel: CPU features: kernel page table isolation forced ON by KASLR Sep 16 04:37:09.776452 kernel: CPU features: detected: Kernel page table isolation (KPTI) Sep 16 04:37:09.776459 kernel: CPU features: detected: ARM erratum 1418040 Sep 16 04:37:09.776466 kernel: CPU features: detected: SSBS not fully self-synchronizing Sep 16 04:37:09.776472 kernel: alternatives: applying boot alternatives Sep 16 04:37:09.776480 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=eff5cc3c399cf6fc52e3071751a09276871b099078da6d1b1a498405d04a9313 Sep 16 04:37:09.776487 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 16 04:37:09.776493 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Sep 16 04:37:09.776501 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 16 04:37:09.776508 kernel: Fallback order for Node 0: 0 Sep 16 04:37:09.776514 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072 Sep 16 04:37:09.776521 kernel: Policy zone: DMA Sep 16 04:37:09.776527 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 16 04:37:09.776534 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB Sep 16 04:37:09.776541 kernel: software IO TLB: area num 4. Sep 16 04:37:09.776547 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB Sep 16 04:37:09.776554 kernel: software IO TLB: mapped [mem 0x00000000db504000-0x00000000db904000] (4MB) Sep 16 04:37:09.776560 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Sep 16 04:37:09.776567 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 16 04:37:09.776574 kernel: rcu: RCU event tracing is enabled. Sep 16 04:37:09.776582 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Sep 16 04:37:09.776589 kernel: Trampoline variant of Tasks RCU enabled. Sep 16 04:37:09.776595 kernel: Tracing variant of Tasks RCU enabled. Sep 16 04:37:09.776602 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Sep 16 04:37:09.776608 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Sep 16 04:37:09.776615 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 16 04:37:09.776621 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 16 04:37:09.776628 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Sep 16 04:37:09.776634 kernel: GICv3: 256 SPIs implemented Sep 16 04:37:09.776641 kernel: GICv3: 0 Extended SPIs implemented Sep 16 04:37:09.776647 kernel: Root IRQ handler: gic_handle_irq Sep 16 04:37:09.776655 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Sep 16 04:37:09.776661 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Sep 16 04:37:09.776668 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Sep 16 04:37:09.776674 kernel: ITS [mem 0x08080000-0x0809ffff] Sep 16 04:37:09.776681 kernel: ITS@0x0000000008080000: allocated 8192 Devices @40110000 (indirect, esz 8, psz 64K, shr 1) Sep 16 04:37:09.776688 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @40120000 (flat, esz 8, psz 64K, shr 1) Sep 16 04:37:09.776694 kernel: GICv3: using LPI property table @0x0000000040130000 Sep 16 04:37:09.776701 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040140000 Sep 16 04:37:09.776707 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 16 04:37:09.776714 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 16 04:37:09.776720 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Sep 16 04:37:09.776727 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Sep 16 04:37:09.776735 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Sep 16 04:37:09.776741 kernel: arm-pv: using stolen time PV Sep 16 04:37:09.776748 kernel: Console: colour dummy device 80x25 Sep 16 04:37:09.776755 kernel: ACPI: Core revision 20240827 Sep 16 04:37:09.776761 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Sep 16 04:37:09.776768 kernel: pid_max: default: 32768 minimum: 301 Sep 16 04:37:09.776775 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 16 04:37:09.776781 kernel: landlock: Up and running. Sep 16 04:37:09.776788 kernel: SELinux: Initializing. Sep 16 04:37:09.776796 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 16 04:37:09.776803 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 16 04:37:09.776809 kernel: rcu: Hierarchical SRCU implementation. Sep 16 04:37:09.776816 kernel: rcu: Max phase no-delay instances is 400. Sep 16 04:37:09.776823 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Sep 16 04:37:09.776830 kernel: Remapping and enabling EFI services. Sep 16 04:37:09.776843 kernel: smp: Bringing up secondary CPUs ... 
Sep 16 04:37:09.776850 kernel: Detected PIPT I-cache on CPU1 Sep 16 04:37:09.776856 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Sep 16 04:37:09.776865 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040150000 Sep 16 04:37:09.776877 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 16 04:37:09.776884 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Sep 16 04:37:09.776893 kernel: Detected PIPT I-cache on CPU2 Sep 16 04:37:09.776900 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Sep 16 04:37:09.776907 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040160000 Sep 16 04:37:09.776914 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 16 04:37:09.776921 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Sep 16 04:37:09.776929 kernel: Detected PIPT I-cache on CPU3 Sep 16 04:37:09.776937 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Sep 16 04:37:09.776944 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040170000 Sep 16 04:37:09.776951 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Sep 16 04:37:09.776963 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Sep 16 04:37:09.776970 kernel: smp: Brought up 1 node, 4 CPUs Sep 16 04:37:09.776977 kernel: SMP: Total of 4 processors activated. Sep 16 04:37:09.776986 kernel: CPU: All CPU(s) started at EL1 Sep 16 04:37:09.776995 kernel: CPU features: detected: 32-bit EL0 Support Sep 16 04:37:09.777004 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Sep 16 04:37:09.777013 kernel: CPU features: detected: Common not Private translations Sep 16 04:37:09.777021 kernel: CPU features: detected: CRC32 instructions Sep 16 04:37:09.777028 kernel: CPU features: detected: Enhanced Virtualization Traps Sep 16 04:37:09.777036 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Sep 16 04:37:09.777044 kernel: CPU features: detected: LSE atomic instructions Sep 16 04:37:09.777052 kernel: CPU features: detected: Privileged Access Never Sep 16 04:37:09.777059 kernel: CPU features: detected: RAS Extension Support Sep 16 04:37:09.777066 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Sep 16 04:37:09.777073 kernel: alternatives: applying system-wide alternatives Sep 16 04:37:09.777082 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3 Sep 16 04:37:09.777090 kernel: Memory: 2422372K/2572288K available (11136K kernel code, 2440K rwdata, 9068K rodata, 38976K init, 1038K bss, 127580K reserved, 16384K cma-reserved) Sep 16 04:37:09.777097 kernel: devtmpfs: initialized Sep 16 04:37:09.777105 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 16 04:37:09.777112 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Sep 16 04:37:09.777119 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Sep 16 04:37:09.777127 kernel: 0 pages in range for non-PLT usage Sep 16 04:37:09.777134 kernel: 508560 pages in range for PLT usage Sep 16 04:37:09.777141 kernel: pinctrl core: initialized pinctrl subsystem Sep 16 04:37:09.777150 kernel: SMBIOS 3.0.0 present. 
Sep 16 04:37:09.777157 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022 Sep 16 04:37:09.777164 kernel: DMI: Memory slots populated: 1/1 Sep 16 04:37:09.777171 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 16 04:37:09.777178 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Sep 16 04:37:09.777185 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Sep 16 04:37:09.777192 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Sep 16 04:37:09.777200 kernel: audit: initializing netlink subsys (disabled) Sep 16 04:37:09.777207 kernel: audit: type=2000 audit(0.024:1): state=initialized audit_enabled=0 res=1 Sep 16 04:37:09.777215 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 16 04:37:09.777222 kernel: cpuidle: using governor menu Sep 16 04:37:09.777229 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Sep 16 04:37:09.777236 kernel: ASID allocator initialised with 32768 entries Sep 16 04:37:09.777243 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 16 04:37:09.777250 kernel: Serial: AMBA PL011 UART driver Sep 16 04:37:09.777257 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 16 04:37:09.777264 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Sep 16 04:37:09.777271 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Sep 16 04:37:09.777280 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Sep 16 04:37:09.777287 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 16 04:37:09.777294 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Sep 16 04:37:09.777301 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Sep 16 04:37:09.777308 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Sep 16 04:37:09.777315 kernel: ACPI: Added _OSI(Module Device) Sep 16 04:37:09.777321 kernel: ACPI: Added _OSI(Processor Device) Sep 16 04:37:09.777411 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 16 04:37:09.777419 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 16 04:37:09.777429 kernel: ACPI: Interpreter enabled Sep 16 04:37:09.777436 kernel: ACPI: Using GIC for interrupt routing Sep 16 04:37:09.777443 kernel: ACPI: MCFG table detected, 1 entries Sep 16 04:37:09.777450 kernel: ACPI: CPU0 has been hot-added Sep 16 04:37:09.777457 kernel: ACPI: CPU1 has been hot-added Sep 16 04:37:09.777464 kernel: ACPI: CPU2 has been hot-added Sep 16 04:37:09.777471 kernel: ACPI: CPU3 has been hot-added Sep 16 04:37:09.777478 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Sep 16 04:37:09.777485 kernel: printk: legacy console [ttyAMA0] enabled Sep 16 04:37:09.777494 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 16 04:37:09.777659 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 16 04:37:09.777724 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Sep 16 04:37:09.777782 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Sep 16 04:37:09.777854 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Sep 16 04:37:09.777917 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Sep 16 04:37:09.777926 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Sep 16 04:37:09.777937 
kernel: PCI host bridge to bus 0000:00 Sep 16 04:37:09.778004 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Sep 16 04:37:09.778058 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Sep 16 04:37:09.778111 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Sep 16 04:37:09.778163 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 16 04:37:09.778247 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Sep 16 04:37:09.778319 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint Sep 16 04:37:09.778400 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f] Sep 16 04:37:09.778460 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff] Sep 16 04:37:09.778519 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Sep 16 04:37:09.778577 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Sep 16 04:37:09.778636 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned Sep 16 04:37:09.778695 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned Sep 16 04:37:09.778753 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Sep 16 04:37:09.778806 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Sep 16 04:37:09.778871 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Sep 16 04:37:09.778881 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Sep 16 04:37:09.778888 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Sep 16 04:37:09.778895 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Sep 16 04:37:09.778902 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Sep 16 04:37:09.778910 kernel: iommu: Default domain type: Translated Sep 16 04:37:09.778917 kernel: iommu: DMA domain TLB invalidation policy: strict mode Sep 16 04:37:09.778926 kernel: efivars: Registered efivars operations Sep 16 04:37:09.778933 kernel: vgaarb: loaded Sep 16 04:37:09.778940 kernel: clocksource: Switched to clocksource arch_sys_counter Sep 16 04:37:09.778947 kernel: VFS: Disk quotas dquot_6.6.0 Sep 16 04:37:09.778954 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 16 04:37:09.778961 kernel: pnp: PnP ACPI init Sep 16 04:37:09.779027 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Sep 16 04:37:09.779038 kernel: pnp: PnP ACPI: found 1 devices Sep 16 04:37:09.779047 kernel: NET: Registered PF_INET protocol family Sep 16 04:37:09.779054 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 16 04:37:09.779061 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 16 04:37:09.779068 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 16 04:37:09.779075 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 16 04:37:09.779083 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 16 04:37:09.779090 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 16 04:37:09.779097 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 16 04:37:09.779104 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 16 04:37:09.779113 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 16 04:37:09.779119 kernel: PCI: CLS 0 bytes, default 64 Sep 16 04:37:09.779126 
kernel: kvm [1]: HYP mode not available Sep 16 04:37:09.779133 kernel: Initialise system trusted keyrings Sep 16 04:37:09.779140 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 16 04:37:09.779147 kernel: Key type asymmetric registered Sep 16 04:37:09.779154 kernel: Asymmetric key parser 'x509' registered Sep 16 04:37:09.779161 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Sep 16 04:37:09.779169 kernel: io scheduler mq-deadline registered Sep 16 04:37:09.779177 kernel: io scheduler kyber registered Sep 16 04:37:09.779184 kernel: io scheduler bfq registered Sep 16 04:37:09.779194 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Sep 16 04:37:09.779201 kernel: ACPI: button: Power Button [PWRB] Sep 16 04:37:09.779212 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Sep 16 04:37:09.779271 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007) Sep 16 04:37:09.779280 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 16 04:37:09.779287 kernel: thunder_xcv, ver 1.0 Sep 16 04:37:09.779294 kernel: thunder_bgx, ver 1.0 Sep 16 04:37:09.779303 kernel: nicpf, ver 1.0 Sep 16 04:37:09.779310 kernel: nicvf, ver 1.0 Sep 16 04:37:09.779459 kernel: rtc-efi rtc-efi.0: registered as rtc0 Sep 16 04:37:09.779520 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-16T04:37:09 UTC (1757997429) Sep 16 04:37:09.779529 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 16 04:37:09.779537 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Sep 16 04:37:09.779544 kernel: watchdog: NMI not fully supported Sep 16 04:37:09.779551 kernel: watchdog: Hard watchdog permanently disabled Sep 16 04:37:09.779561 kernel: NET: Registered PF_INET6 protocol family Sep 16 04:37:09.779569 kernel: Segment Routing with IPv6 Sep 16 04:37:09.779576 kernel: In-situ OAM (IOAM) with IPv6 Sep 16 04:37:09.779583 kernel: NET: Registered PF_PACKET protocol family Sep 16 04:37:09.779590 kernel: Key type dns_resolver registered Sep 16 04:37:09.779597 kernel: registered taskstats version 1 Sep 16 04:37:09.779604 kernel: Loading compiled-in X.509 certificates Sep 16 04:37:09.779611 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: 99eb88579c3d58869b2224a85ec8efa5647af805' Sep 16 04:37:09.779618 kernel: Demotion targets for Node 0: null Sep 16 04:37:09.779627 kernel: Key type .fscrypt registered Sep 16 04:37:09.779634 kernel: Key type fscrypt-provisioning registered Sep 16 04:37:09.779641 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 16 04:37:09.779648 kernel: ima: Allocated hash algorithm: sha1 Sep 16 04:37:09.779655 kernel: ima: No architecture policies found Sep 16 04:37:09.779662 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Sep 16 04:37:09.779669 kernel: clk: Disabling unused clocks Sep 16 04:37:09.779676 kernel: PM: genpd: Disabling unused power domains Sep 16 04:37:09.779683 kernel: Warning: unable to open an initial console. Sep 16 04:37:09.779692 kernel: Freeing unused kernel memory: 38976K Sep 16 04:37:09.779699 kernel: Run /init as init process Sep 16 04:37:09.779707 kernel: with arguments: Sep 16 04:37:09.779713 kernel: /init Sep 16 04:37:09.779721 kernel: with environment: Sep 16 04:37:09.779727 kernel: HOME=/ Sep 16 04:37:09.779734 kernel: TERM=linux Sep 16 04:37:09.779741 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 16 04:37:09.779749 systemd[1]: Successfully made /usr/ read-only. 
Sep 16 04:37:09.779760 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 16 04:37:09.779768 systemd[1]: Detected virtualization kvm. Sep 16 04:37:09.779776 systemd[1]: Detected architecture arm64. Sep 16 04:37:09.779783 systemd[1]: Running in initrd. Sep 16 04:37:09.779791 systemd[1]: No hostname configured, using default hostname. Sep 16 04:37:09.779799 systemd[1]: Hostname set to . Sep 16 04:37:09.779806 systemd[1]: Initializing machine ID from VM UUID. Sep 16 04:37:09.779815 systemd[1]: Queued start job for default target initrd.target. Sep 16 04:37:09.779823 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 04:37:09.779830 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 04:37:09.779850 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 16 04:37:09.779858 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 16 04:37:09.779866 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 16 04:37:09.779874 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 16 04:37:09.779885 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 16 04:37:09.779893 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 16 04:37:09.779901 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 04:37:09.779909 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 16 04:37:09.779917 systemd[1]: Reached target paths.target - Path Units. Sep 16 04:37:09.779924 systemd[1]: Reached target slices.target - Slice Units. Sep 16 04:37:09.779932 systemd[1]: Reached target swap.target - Swaps. Sep 16 04:37:09.779940 systemd[1]: Reached target timers.target - Timer Units. Sep 16 04:37:09.779949 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 16 04:37:09.779957 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 16 04:37:09.779965 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 16 04:37:09.779973 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 16 04:37:09.779980 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 16 04:37:09.779989 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 16 04:37:09.779996 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 04:37:09.780004 systemd[1]: Reached target sockets.target - Socket Units. Sep 16 04:37:09.780011 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 16 04:37:09.780020 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 16 04:37:09.780028 systemd[1]: Finished network-cleanup.service - Network Cleanup. 
Sep 16 04:37:09.780036 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 16 04:37:09.780044 systemd[1]: Starting systemd-fsck-usr.service... Sep 16 04:37:09.780051 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 16 04:37:09.780059 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 16 04:37:09.780067 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:37:09.780074 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 04:37:09.780084 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 16 04:37:09.780092 systemd[1]: Finished systemd-fsck-usr.service. Sep 16 04:37:09.780100 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 16 04:37:09.780127 systemd-journald[244]: Collecting audit messages is disabled. Sep 16 04:37:09.780149 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 16 04:37:09.780157 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:37:09.780164 kernel: Bridge firewalling registered Sep 16 04:37:09.780172 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 16 04:37:09.780182 systemd-journald[244]: Journal started Sep 16 04:37:09.780200 systemd-journald[244]: Runtime Journal (/run/log/journal/638bb3d30bd948aea8233c8adde0e944) is 6M, max 48.5M, 42.4M free. Sep 16 04:37:09.760888 systemd-modules-load[247]: Inserted module 'overlay' Sep 16 04:37:09.776949 systemd-modules-load[247]: Inserted module 'br_netfilter' Sep 16 04:37:09.784054 systemd[1]: Started systemd-journald.service - Journal Service. Sep 16 04:37:09.783590 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 16 04:37:09.786662 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 16 04:37:09.788518 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 16 04:37:09.790483 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 16 04:37:09.799199 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 16 04:37:09.807383 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 04:37:09.809720 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 16 04:37:09.810606 systemd-tmpfiles[265]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 16 04:37:09.813577 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 04:37:09.816211 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 16 04:37:09.817459 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 16 04:37:09.820169 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Sep 16 04:37:09.846107 dracut-cmdline[289]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=eff5cc3c399cf6fc52e3071751a09276871b099078da6d1b1a498405d04a9313 Sep 16 04:37:09.859991 systemd-resolved[288]: Positive Trust Anchors: Sep 16 04:37:09.860012 systemd-resolved[288]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 16 04:37:09.860043 systemd-resolved[288]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 16 04:37:09.864913 systemd-resolved[288]: Defaulting to hostname 'linux'. Sep 16 04:37:09.865905 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 16 04:37:09.868293 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 16 04:37:09.919366 kernel: SCSI subsystem initialized Sep 16 04:37:09.923347 kernel: Loading iSCSI transport class v2.0-870. Sep 16 04:37:09.931373 kernel: iscsi: registered transport (tcp) Sep 16 04:37:09.943564 kernel: iscsi: registered transport (qla4xxx) Sep 16 04:37:09.943608 kernel: QLogic iSCSI HBA Driver Sep 16 04:37:09.960438 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 16 04:37:09.975815 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 04:37:09.977858 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 16 04:37:10.023642 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 16 04:37:10.025739 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 16 04:37:10.091353 kernel: raid6: neonx8 gen() 15782 MB/s Sep 16 04:37:10.108342 kernel: raid6: neonx4 gen() 15824 MB/s Sep 16 04:37:10.125340 kernel: raid6: neonx2 gen() 13624 MB/s Sep 16 04:37:10.142340 kernel: raid6: neonx1 gen() 10432 MB/s Sep 16 04:37:10.159339 kernel: raid6: int64x8 gen() 6900 MB/s Sep 16 04:37:10.176340 kernel: raid6: int64x4 gen() 7354 MB/s Sep 16 04:37:10.193349 kernel: raid6: int64x2 gen() 6099 MB/s Sep 16 04:37:10.210346 kernel: raid6: int64x1 gen() 5044 MB/s Sep 16 04:37:10.210368 kernel: raid6: using algorithm neonx4 gen() 15824 MB/s Sep 16 04:37:10.227361 kernel: raid6: .... xor() 12357 MB/s, rmw enabled Sep 16 04:37:10.227390 kernel: raid6: using neon recovery algorithm Sep 16 04:37:10.232549 kernel: xor: measuring software checksum speed Sep 16 04:37:10.232575 kernel: 8regs : 21624 MB/sec Sep 16 04:37:10.233727 kernel: 32regs : 21681 MB/sec Sep 16 04:37:10.233741 kernel: arm64_neon : 28089 MB/sec Sep 16 04:37:10.233750 kernel: xor: using function: arm64_neon (28089 MB/sec) Sep 16 04:37:10.287352 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 16 04:37:10.294401 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. 
Sep 16 04:37:10.297501 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 04:37:10.324055 systemd-udevd[499]: Using default interface naming scheme 'v255'. Sep 16 04:37:10.328414 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 04:37:10.330151 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 16 04:37:10.358676 dracut-pre-trigger[505]: rd.md=0: removing MD RAID activation Sep 16 04:37:10.380754 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 16 04:37:10.382923 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 16 04:37:10.439743 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 04:37:10.442016 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 16 04:37:10.496370 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues Sep 16 04:37:10.496535 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 16 04:37:10.503343 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 04:37:10.503463 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:37:10.511221 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:37:10.513008 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:37:10.516595 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 16 04:37:10.516633 kernel: GPT:9289727 != 19775487 Sep 16 04:37:10.516644 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 16 04:37:10.516653 kernel: GPT:9289727 != 19775487 Sep 16 04:37:10.517338 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 16 04:37:10.517353 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 16 04:37:10.541957 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 16 04:37:10.543261 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:37:10.550431 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 16 04:37:10.551355 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 16 04:37:10.553457 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 16 04:37:10.562625 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 16 04:37:10.574268 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 16 04:37:10.579599 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 16 04:37:10.580548 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 04:37:10.582277 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 16 04:37:10.584747 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 16 04:37:10.586287 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 16 04:37:10.611257 disk-uuid[589]: Primary Header is updated. Sep 16 04:37:10.611257 disk-uuid[589]: Secondary Entries is updated. Sep 16 04:37:10.611257 disk-uuid[589]: Secondary Header is updated. 
Sep 16 04:37:10.614493 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 16 04:37:10.617355 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 16 04:37:10.621353 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 16 04:37:11.623362 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 16 04:37:11.623840 disk-uuid[594]: The operation has completed successfully. Sep 16 04:37:11.650928 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 16 04:37:11.652066 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 16 04:37:11.674883 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 16 04:37:11.704404 sh[608]: Success Sep 16 04:37:11.717810 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 16 04:37:11.717854 kernel: device-mapper: uevent: version 1.0.3 Sep 16 04:37:11.717874 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 16 04:37:11.725344 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Sep 16 04:37:11.748921 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 16 04:37:11.751940 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 16 04:37:11.780080 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 16 04:37:11.786547 kernel: BTRFS: device fsid 782b6948-7aaa-439e-9946-c8fdb4d8f287 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (620) Sep 16 04:37:11.786585 kernel: BTRFS info (device dm-0): first mount of filesystem 782b6948-7aaa-439e-9946-c8fdb4d8f287 Sep 16 04:37:11.786596 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 16 04:37:11.790792 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 16 04:37:11.790839 kernel: BTRFS info (device dm-0): enabling free space tree Sep 16 04:37:11.791854 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 16 04:37:11.793042 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 16 04:37:11.794223 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 16 04:37:11.795003 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 16 04:37:11.797856 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 16 04:37:11.825358 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (651) Sep 16 04:37:11.827672 kernel: BTRFS info (device vda6): first mount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:37:11.827707 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 16 04:37:11.830350 kernel: BTRFS info (device vda6): turning on async discard Sep 16 04:37:11.830388 kernel: BTRFS info (device vda6): enabling free space tree Sep 16 04:37:11.834344 kernel: BTRFS info (device vda6): last unmount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:37:11.836576 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 16 04:37:11.838674 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 16 04:37:11.906355 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Sep 16 04:37:11.908807 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 16 04:37:11.939659 ignition[696]: Ignition 2.22.0 Sep 16 04:37:11.939673 ignition[696]: Stage: fetch-offline Sep 16 04:37:11.939702 ignition[696]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:37:11.939709 ignition[696]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 16 04:37:11.939785 ignition[696]: parsed url from cmdline: "" Sep 16 04:37:11.939788 ignition[696]: no config URL provided Sep 16 04:37:11.939793 ignition[696]: reading system config file "/usr/lib/ignition/user.ign" Sep 16 04:37:11.939800 ignition[696]: no config at "/usr/lib/ignition/user.ign" Sep 16 04:37:11.939819 ignition[696]: op(1): [started] loading QEMU firmware config module Sep 16 04:37:11.939834 ignition[696]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 16 04:37:11.944721 ignition[696]: op(1): [finished] loading QEMU firmware config module Sep 16 04:37:11.952077 systemd-networkd[803]: lo: Link UP Sep 16 04:37:11.952091 systemd-networkd[803]: lo: Gained carrier Sep 16 04:37:11.952794 systemd-networkd[803]: Enumeration completed Sep 16 04:37:11.952888 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 16 04:37:11.953200 systemd-networkd[803]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:37:11.953205 systemd-networkd[803]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:37:11.953994 systemd-networkd[803]: eth0: Link UP Sep 16 04:37:11.954083 systemd-networkd[803]: eth0: Gained carrier Sep 16 04:37:11.954092 systemd-networkd[803]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:37:11.954841 systemd[1]: Reached target network.target - Network. Sep 16 04:37:11.982376 systemd-networkd[803]: eth0: DHCPv4 address 10.0.0.111/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 16 04:37:11.997618 ignition[696]: parsing config with SHA512: aa8825275eb20a1b8101468abbca6cf23f1669ab8de9f98a69b54a08d1a7ee4e4c969836a82c537673e4b2eeee8020f52e042704c48a2605b2d36d12aa42549e Sep 16 04:37:12.003778 unknown[696]: fetched base config from "system" Sep 16 04:37:12.003788 unknown[696]: fetched user config from "qemu" Sep 16 04:37:12.004147 ignition[696]: fetch-offline: fetch-offline passed Sep 16 04:37:12.004198 ignition[696]: Ignition finished successfully Sep 16 04:37:12.005802 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 16 04:37:12.007379 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 16 04:37:12.008233 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 16 04:37:12.036855 ignition[811]: Ignition 2.22.0 Sep 16 04:37:12.036871 ignition[811]: Stage: kargs Sep 16 04:37:12.037001 ignition[811]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:37:12.037009 ignition[811]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 16 04:37:12.037757 ignition[811]: kargs: kargs passed Sep 16 04:37:12.037801 ignition[811]: Ignition finished successfully Sep 16 04:37:12.042386 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 16 04:37:12.045141 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Sep 16 04:37:12.080684 ignition[819]: Ignition 2.22.0 Sep 16 04:37:12.080702 ignition[819]: Stage: disks Sep 16 04:37:12.080837 ignition[819]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:37:12.080847 ignition[819]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 16 04:37:12.081596 ignition[819]: disks: disks passed Sep 16 04:37:12.081640 ignition[819]: Ignition finished successfully Sep 16 04:37:12.085176 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 16 04:37:12.086536 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 16 04:37:12.087853 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 16 04:37:12.089548 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 16 04:37:12.091146 systemd[1]: Reached target sysinit.target - System Initialization. Sep 16 04:37:12.093011 systemd[1]: Reached target basic.target - Basic System. Sep 16 04:37:12.095383 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 16 04:37:12.113872 systemd-resolved[288]: Detected conflict on linux IN A 10.0.0.111 Sep 16 04:37:12.113889 systemd-resolved[288]: Hostname conflict, changing published hostname from 'linux' to 'linux9'. Sep 16 04:37:12.116560 systemd-fsck[829]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 16 04:37:12.118984 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 16 04:37:12.121084 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 16 04:37:12.183354 kernel: EXT4-fs (vda9): mounted filesystem a00d22d9-68b1-4a84-acfc-9fae1fca53dd r/w with ordered data mode. Quota mode: none. Sep 16 04:37:12.183617 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 16 04:37:12.184702 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 16 04:37:12.187340 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 16 04:37:12.189214 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 16 04:37:12.190099 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 16 04:37:12.190151 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 16 04:37:12.190175 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 16 04:37:12.197796 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 16 04:37:12.199656 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 16 04:37:12.203952 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (837) Sep 16 04:37:12.203979 kernel: BTRFS info (device vda6): first mount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:37:12.204807 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 16 04:37:12.207356 kernel: BTRFS info (device vda6): turning on async discard Sep 16 04:37:12.207392 kernel: BTRFS info (device vda6): enabling free space tree Sep 16 04:37:12.208622 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 16 04:37:12.234758 initrd-setup-root[861]: cut: /sysroot/etc/passwd: No such file or directory Sep 16 04:37:12.237898 initrd-setup-root[868]: cut: /sysroot/etc/group: No such file or directory Sep 16 04:37:12.241208 initrd-setup-root[875]: cut: /sysroot/etc/shadow: No such file or directory Sep 16 04:37:12.244667 initrd-setup-root[882]: cut: /sysroot/etc/gshadow: No such file or directory Sep 16 04:37:12.310393 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 16 04:37:12.312117 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 16 04:37:12.314448 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 16 04:37:12.327351 kernel: BTRFS info (device vda6): last unmount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:37:12.336776 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 16 04:37:12.348599 ignition[951]: INFO : Ignition 2.22.0 Sep 16 04:37:12.348599 ignition[951]: INFO : Stage: mount Sep 16 04:37:12.349830 ignition[951]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 04:37:12.349830 ignition[951]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 16 04:37:12.349830 ignition[951]: INFO : mount: mount passed Sep 16 04:37:12.349830 ignition[951]: INFO : Ignition finished successfully Sep 16 04:37:12.352048 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 16 04:37:12.354065 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 16 04:37:12.785721 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 16 04:37:12.787075 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 16 04:37:12.804350 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (963) Sep 16 04:37:12.804390 kernel: BTRFS info (device vda6): first mount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:37:12.805884 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Sep 16 04:37:12.808350 kernel: BTRFS info (device vda6): turning on async discard Sep 16 04:37:12.808372 kernel: BTRFS info (device vda6): enabling free space tree Sep 16 04:37:12.809285 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 16 04:37:12.838614 ignition[980]: INFO : Ignition 2.22.0 Sep 16 04:37:12.838614 ignition[980]: INFO : Stage: files Sep 16 04:37:12.839844 ignition[980]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 04:37:12.839844 ignition[980]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 16 04:37:12.839844 ignition[980]: DEBUG : files: compiled without relabeling support, skipping Sep 16 04:37:12.842302 ignition[980]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 16 04:37:12.842302 ignition[980]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 16 04:37:12.844623 ignition[980]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 16 04:37:12.845607 ignition[980]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 16 04:37:12.845607 ignition[980]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 16 04:37:12.845184 unknown[980]: wrote ssh authorized keys file for user: core Sep 16 04:37:12.848577 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Sep 16 04:37:12.848577 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 Sep 16 04:37:12.885727 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 16 04:37:12.954409 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Sep 16 04:37:12.956274 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 16 04:37:12.956274 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 16 04:37:12.956274 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 16 04:37:12.956274 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 16 04:37:12.956274 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 16 04:37:12.956274 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 16 04:37:12.956274 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 16 04:37:12.956274 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 16 04:37:12.969150 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 16 04:37:12.969150 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 16 04:37:12.969150 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Sep 16 04:37:12.969150 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Sep 16 04:37:12.969150 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Sep 16 04:37:12.969150 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1 Sep 16 04:37:12.995500 systemd-networkd[803]: eth0: Gained IPv6LL Sep 16 04:37:13.402582 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 16 04:37:13.760638 ignition[980]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Sep 16 04:37:13.760638 ignition[980]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 16 04:37:13.764003 ignition[980]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 16 04:37:13.764003 ignition[980]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 16 04:37:13.764003 ignition[980]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 16 04:37:13.764003 ignition[980]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 16 04:37:13.764003 ignition[980]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 16 04:37:13.764003 ignition[980]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 16 04:37:13.764003 ignition[980]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 16 04:37:13.764003 ignition[980]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Sep 16 04:37:13.778386 ignition[980]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 16 04:37:13.781769 ignition[980]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 16 04:37:13.784425 ignition[980]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Sep 16 04:37:13.784425 ignition[980]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Sep 16 04:37:13.784425 ignition[980]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Sep 16 04:37:13.784425 ignition[980]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 16 04:37:13.784425 ignition[980]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 16 04:37:13.784425 ignition[980]: INFO : files: files passed Sep 16 04:37:13.784425 ignition[980]: INFO : Ignition finished successfully Sep 16 04:37:13.785802 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 16 04:37:13.788616 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 16 04:37:13.790271 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Sep 16 04:37:13.805833 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 16 04:37:13.807369 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 16 04:37:13.809796 initrd-setup-root-after-ignition[1009]: grep: /sysroot/oem/oem-release: No such file or directory Sep 16 04:37:13.812064 initrd-setup-root-after-ignition[1011]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 16 04:37:13.812064 initrd-setup-root-after-ignition[1011]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 16 04:37:13.814566 initrd-setup-root-after-ignition[1015]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 16 04:37:13.817835 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 16 04:37:13.819035 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 16 04:37:13.822506 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 16 04:37:13.889022 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 16 04:37:13.889129 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 16 04:37:13.893612 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 16 04:37:13.896657 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 16 04:37:13.899303 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 16 04:37:13.901073 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 16 04:37:13.930038 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 16 04:37:13.934356 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 16 04:37:13.961117 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 16 04:37:13.962154 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 04:37:13.963711 systemd[1]: Stopped target timers.target - Timer Units. Sep 16 04:37:13.965087 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 16 04:37:13.965203 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 16 04:37:13.967058 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 16 04:37:13.968555 systemd[1]: Stopped target basic.target - Basic System. Sep 16 04:37:13.969775 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 16 04:37:13.971087 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 16 04:37:13.972589 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 16 04:37:13.974111 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 16 04:37:13.975702 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 16 04:37:13.977271 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 16 04:37:13.978912 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 16 04:37:13.980427 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 16 04:37:13.981783 systemd[1]: Stopped target swap.target - Swaps. Sep 16 04:37:13.982942 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 16 04:37:13.983058 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. 
Sep 16 04:37:13.984931 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 16 04:37:13.986492 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 04:37:13.988014 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 16 04:37:13.988106 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 04:37:13.989811 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 16 04:37:13.989930 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 16 04:37:13.992278 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 16 04:37:13.992409 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 16 04:37:13.994049 systemd[1]: Stopped target paths.target - Path Units. Sep 16 04:37:13.995199 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 16 04:37:13.998365 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 04:37:13.999444 systemd[1]: Stopped target slices.target - Slice Units. Sep 16 04:37:14.001093 systemd[1]: Stopped target sockets.target - Socket Units. Sep 16 04:37:14.002372 systemd[1]: iscsid.socket: Deactivated successfully. Sep 16 04:37:14.002466 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 16 04:37:14.003755 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 16 04:37:14.003841 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 16 04:37:14.005023 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 16 04:37:14.005141 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 16 04:37:14.006526 systemd[1]: ignition-files.service: Deactivated successfully. Sep 16 04:37:14.006625 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 16 04:37:14.008623 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 16 04:37:14.009989 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 16 04:37:14.010106 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 04:37:14.019665 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 16 04:37:14.020373 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 16 04:37:14.020493 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 04:37:14.022040 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 16 04:37:14.022127 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 16 04:37:14.030741 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 16 04:37:14.032352 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 16 04:37:14.043259 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 16 04:37:14.054143 ignition[1035]: INFO : Ignition 2.22.0 Sep 16 04:37:14.054143 ignition[1035]: INFO : Stage: umount Sep 16 04:37:14.063624 ignition[1035]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 04:37:14.063624 ignition[1035]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 16 04:37:14.063624 ignition[1035]: INFO : umount: umount passed Sep 16 04:37:14.063624 ignition[1035]: INFO : Ignition finished successfully Sep 16 04:37:14.066525 systemd[1]: ignition-mount.service: Deactivated successfully. 
Sep 16 04:37:14.067277 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 16 04:37:14.069043 systemd[1]: Stopped target network.target - Network. Sep 16 04:37:14.070410 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 16 04:37:14.071155 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 16 04:37:14.072744 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 16 04:37:14.073420 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 16 04:37:14.074182 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 16 04:37:14.074232 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 16 04:37:14.075520 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 16 04:37:14.075559 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 16 04:37:14.076958 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 16 04:37:14.078235 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 16 04:37:14.083434 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 16 04:37:14.083564 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 16 04:37:14.086952 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 16 04:37:14.087197 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 16 04:37:14.087235 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 04:37:14.090705 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 16 04:37:14.095272 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 16 04:37:14.095419 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 16 04:37:14.099500 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 16 04:37:14.099685 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 16 04:37:14.101369 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 16 04:37:14.101406 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 16 04:37:14.103709 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 16 04:37:14.104441 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 16 04:37:14.104497 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 16 04:37:14.106119 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 16 04:37:14.106157 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 16 04:37:14.108215 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 16 04:37:14.108256 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 16 04:37:14.110255 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 04:37:14.112534 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 16 04:37:14.112798 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 16 04:37:14.122210 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 16 04:37:14.123716 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 16 04:37:14.123798 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. 
Sep 16 04:37:14.128052 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 16 04:37:14.128868 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 04:37:14.131279 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 16 04:37:14.131380 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 16 04:37:14.132928 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 16 04:37:14.132963 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 04:37:14.134232 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 16 04:37:14.134272 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 16 04:37:14.135883 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 16 04:37:14.135933 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 16 04:37:14.136810 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 16 04:37:14.136865 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 16 04:37:14.140735 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 16 04:37:14.141571 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 16 04:37:14.141645 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 04:37:14.143866 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 16 04:37:14.143906 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 04:37:14.146413 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 04:37:14.146458 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:37:14.149349 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 16 04:37:14.149432 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 16 04:37:14.150710 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 16 04:37:14.150790 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 16 04:37:14.152863 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 16 04:37:14.154512 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 16 04:37:14.168772 systemd[1]: Switching root. Sep 16 04:37:14.204594 systemd-journald[244]: Journal stopped Sep 16 04:37:14.980388 systemd-journald[244]: Received SIGTERM from PID 1 (systemd). Sep 16 04:37:14.980442 kernel: SELinux: policy capability network_peer_controls=1 Sep 16 04:37:14.980457 kernel: SELinux: policy capability open_perms=1 Sep 16 04:37:14.980471 kernel: SELinux: policy capability extended_socket_class=1 Sep 16 04:37:14.980482 kernel: SELinux: policy capability always_check_network=0 Sep 16 04:37:14.980492 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 16 04:37:14.980502 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 16 04:37:14.980511 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 16 04:37:14.980520 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 16 04:37:14.980529 kernel: SELinux: policy capability userspace_initial_context=0 Sep 16 04:37:14.980542 kernel: audit: type=1403 audit(1757997434.385:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 16 04:37:14.980553 systemd[1]: Successfully loaded SELinux policy in 57.577ms. 
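The kernel lines above list the policy capabilities carried by the freshly loaded SELinux policy. Once the policy is loaded they are also exposed through selinuxfs; a minimal sketch that reprints them, assuming selinuxfs is mounted at the conventional /sys/fs/selinux:

```python
from pathlib import Path

# Assumes selinuxfs is mounted at /sys/fs/selinux (the conventional location).
caps_dir = Path("/sys/fs/selinux/policy_capabilities")

if caps_dir.is_dir():
    for cap in sorted(caps_dir.iterdir()):
        # Each file holds "0" or "1", matching the kernel messages above,
        # e.g. network_peer_controls=1, open_perms=1, ...
        print(f"{cap.name}={cap.read_text().strip()}")
else:
    print("selinuxfs not mounted or SELinux disabled")
```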
Sep 16 04:37:14.980573 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.511ms. Sep 16 04:37:14.980585 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 16 04:37:14.980596 systemd[1]: Detected virtualization kvm. Sep 16 04:37:14.980605 systemd[1]: Detected architecture arm64. Sep 16 04:37:14.980615 systemd[1]: Detected first boot. Sep 16 04:37:14.980625 systemd[1]: Initializing machine ID from VM UUID. Sep 16 04:37:14.980635 zram_generator::config[1084]: No configuration found. Sep 16 04:37:14.980646 kernel: NET: Registered PF_VSOCK protocol family Sep 16 04:37:14.980656 systemd[1]: Populated /etc with preset unit settings. Sep 16 04:37:14.980672 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 16 04:37:14.980682 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 16 04:37:14.980692 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 16 04:37:14.980702 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 16 04:37:14.980712 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 16 04:37:14.980722 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 16 04:37:14.980732 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 16 04:37:14.980742 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 16 04:37:14.980751 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 16 04:37:14.980763 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 16 04:37:14.980773 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 16 04:37:14.980783 systemd[1]: Created slice user.slice - User and Session Slice. Sep 16 04:37:14.980793 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 04:37:14.980803 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 04:37:14.980822 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 16 04:37:14.980834 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 16 04:37:14.980844 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 16 04:37:14.980856 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 16 04:37:14.980866 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Sep 16 04:37:14.980876 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 04:37:14.980886 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 16 04:37:14.980896 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 16 04:37:14.980906 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 16 04:37:14.980916 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. 
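"Initializing machine ID from VM UUID" above means systemd derived /etc/machine-id from the guest's UUID on this first boot instead of generating a random one. On an SMBIOS-carrying guest like this QEMU VM, one place that UUID is visible is DMI; a hedged sketch (the exact source systemd consults can differ):

```python
from pathlib import Path

# On guests exposing SMBIOS/DMI, the VM UUID is readable here (usually root-only);
# systemd can seed the machine ID from such a UUID on first boot.
uuid_path = Path("/sys/class/dmi/id/product_uuid")

try:
    print("VM UUID:   ", uuid_path.read_text().strip())
except (FileNotFoundError, PermissionError) as err:
    print("could not read product_uuid:", err)

print("machine-id:", Path("/etc/machine-id").read_text().strip())
```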
Sep 16 04:37:14.980926 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 16 04:37:14.980938 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 04:37:14.980947 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 16 04:37:14.980957 systemd[1]: Reached target slices.target - Slice Units. Sep 16 04:37:14.980968 systemd[1]: Reached target swap.target - Swaps. Sep 16 04:37:14.980978 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 16 04:37:14.980987 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 16 04:37:14.980997 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 16 04:37:14.981008 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 16 04:37:14.981018 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 16 04:37:14.981027 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 04:37:14.981039 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 16 04:37:14.981048 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 16 04:37:14.981058 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 16 04:37:14.981068 systemd[1]: Mounting media.mount - External Media Directory... Sep 16 04:37:14.981078 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 16 04:37:14.981087 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 16 04:37:14.981097 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 16 04:37:14.981107 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 16 04:37:14.981118 systemd[1]: Reached target machines.target - Containers. Sep 16 04:37:14.981128 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 16 04:37:14.981138 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 04:37:14.981148 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 16 04:37:14.981158 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 16 04:37:14.981169 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 04:37:14.981178 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 16 04:37:14.981188 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 04:37:14.981198 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 16 04:37:14.981211 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 04:37:14.981222 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 16 04:37:14.981232 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 16 04:37:14.981241 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 16 04:37:14.981251 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 16 04:37:14.981260 systemd[1]: Stopped systemd-fsck-usr.service. 
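The mounts set up in this stretch (hugetlbfs, mqueue, debugfs, tracefs, plus configfs via the modprobe@configfs unit started here) can be confirmed afterwards from /proc/self/mounts; a minimal check:

```python
# Print the API filesystems mentioned above if they are currently mounted.
WANTED = {"hugetlbfs", "mqueue", "debugfs", "tracefs", "configfs"}

with open("/proc/self/mounts") as mounts:
    for line in mounts:
        device, mountpoint, fstype, *_ = line.split()
        if fstype in WANTED:
            print(f"{fstype:10s} mounted at {mountpoint}")
```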
Sep 16 04:37:14.981271 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 04:37:14.981280 kernel: fuse: init (API version 7.41) Sep 16 04:37:14.981291 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 16 04:37:14.981301 kernel: ACPI: bus type drm_connector registered Sep 16 04:37:14.981311 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 16 04:37:14.981320 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 16 04:37:14.981339 kernel: loop: module loaded Sep 16 04:37:14.981350 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 16 04:37:14.981362 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 16 04:37:14.981372 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 16 04:37:14.981384 systemd[1]: verity-setup.service: Deactivated successfully. Sep 16 04:37:14.981394 systemd[1]: Stopped verity-setup.service. Sep 16 04:37:14.981404 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 16 04:37:14.981413 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 16 04:37:14.981423 systemd[1]: Mounted media.mount - External Media Directory. Sep 16 04:37:14.981434 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 16 04:37:14.981445 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 16 04:37:14.981455 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 16 04:37:14.981485 systemd-journald[1152]: Collecting audit messages is disabled. Sep 16 04:37:14.981508 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 16 04:37:14.981518 systemd-journald[1152]: Journal started Sep 16 04:37:14.981539 systemd-journald[1152]: Runtime Journal (/run/log/journal/638bb3d30bd948aea8233c8adde0e944) is 6M, max 48.5M, 42.4M free. Sep 16 04:37:14.785282 systemd[1]: Queued start job for default target multi-user.target. Sep 16 04:37:14.806287 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 16 04:37:14.806661 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 16 04:37:14.984376 systemd[1]: Started systemd-journald.service - Journal Service. Sep 16 04:37:14.985140 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 04:37:14.986440 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 16 04:37:14.986594 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 16 04:37:14.987683 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 04:37:14.987850 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 04:37:14.989089 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 16 04:37:14.989253 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 16 04:37:14.990373 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 04:37:14.990521 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 04:37:14.991623 systemd[1]: modprobe@fuse.service: Deactivated successfully. 
Sep 16 04:37:14.991772 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 16 04:37:14.992976 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 04:37:14.993130 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 04:37:14.994254 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 16 04:37:14.995471 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 04:37:14.996875 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 16 04:37:14.998121 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 16 04:37:15.010988 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 16 04:37:15.012975 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 16 04:37:15.014769 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 16 04:37:15.015691 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 16 04:37:15.015718 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 16 04:37:15.017298 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 16 04:37:15.024088 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 16 04:37:15.025038 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 04:37:15.026994 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 16 04:37:15.028704 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 16 04:37:15.029615 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 16 04:37:15.030600 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 16 04:37:15.031457 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 16 04:37:15.036101 systemd-journald[1152]: Time spent on flushing to /var/log/journal/638bb3d30bd948aea8233c8adde0e944 is 16.253ms for 882 entries. Sep 16 04:37:15.036101 systemd-journald[1152]: System Journal (/var/log/journal/638bb3d30bd948aea8233c8adde0e944) is 8M, max 195.6M, 187.6M free. Sep 16 04:37:15.056377 systemd-journald[1152]: Received client request to flush runtime journal. Sep 16 04:37:15.056427 kernel: loop0: detected capacity change from 0 to 100632 Sep 16 04:37:15.034468 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 16 04:37:15.036572 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 16 04:37:15.038989 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 16 04:37:15.041368 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 04:37:15.042432 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 16 04:37:15.043425 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 16 04:37:15.049360 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. 
Sep 16 04:37:15.051494 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 16 04:37:15.053542 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 16 04:37:15.068483 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 16 04:37:15.069392 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 16 04:37:15.072353 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 16 04:37:15.081976 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 16 04:37:15.090393 kernel: loop1: detected capacity change from 0 to 207008 Sep 16 04:37:15.101137 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 16 04:37:15.104859 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 16 04:37:15.123346 kernel: loop2: detected capacity change from 0 to 119368 Sep 16 04:37:15.131074 systemd-tmpfiles[1218]: ACLs are not supported, ignoring. Sep 16 04:37:15.131088 systemd-tmpfiles[1218]: ACLs are not supported, ignoring. Sep 16 04:37:15.134437 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 04:37:15.145348 kernel: loop3: detected capacity change from 0 to 100632 Sep 16 04:37:15.151349 kernel: loop4: detected capacity change from 0 to 207008 Sep 16 04:37:15.157346 kernel: loop5: detected capacity change from 0 to 119368 Sep 16 04:37:15.161616 (sd-merge)[1222]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Sep 16 04:37:15.161993 (sd-merge)[1222]: Merged extensions into '/usr'. Sep 16 04:37:15.168471 systemd[1]: Reload requested from client PID 1200 ('systemd-sysext') (unit systemd-sysext.service)... Sep 16 04:37:15.168493 systemd[1]: Reloading... Sep 16 04:37:15.225214 zram_generator::config[1248]: No configuration found. Sep 16 04:37:15.294830 ldconfig[1195]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 16 04:37:15.364719 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 16 04:37:15.365121 systemd[1]: Reloading finished in 196 ms. Sep 16 04:37:15.384695 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 16 04:37:15.387973 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 16 04:37:15.398383 systemd[1]: Starting ensure-sysext.service... Sep 16 04:37:15.399880 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 16 04:37:15.408570 systemd[1]: Reload requested from client PID 1282 ('systemctl') (unit ensure-sysext.service)... Sep 16 04:37:15.408582 systemd[1]: Reloading... Sep 16 04:37:15.415011 systemd-tmpfiles[1283]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 16 04:37:15.415298 systemd-tmpfiles[1283]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 16 04:37:15.415621 systemd-tmpfiles[1283]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 16 04:37:15.415906 systemd-tmpfiles[1283]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 16 04:37:15.416656 systemd-tmpfiles[1283]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. 
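The sd-merge lines above show systemd-sysext picking up the containerd-flatcar, docker-flatcar, and kubernetes extension images and merging them into /usr; the kubernetes one is the .raw image Ignition linked under /etc/extensions earlier. A simplified sketch of just the discovery step, assuming the usual search directories (the real tool also validates each image's extension-release metadata before merging):

```python
from pathlib import Path

# Simplified candidate listing only; systemd-sysext does much more validation.
SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

for directory in map(Path, SEARCH_DIRS):
    if not directory.is_dir():
        continue
    for entry in sorted(directory.iterdir()):
        # Both plain directories and *.raw disk images (or symlinks to them) count.
        if entry.is_dir() or entry.suffix == ".raw":
            print(f"{directory}: {entry.name}")
```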
Sep 16 04:37:15.416974 systemd-tmpfiles[1283]: ACLs are not supported, ignoring. Sep 16 04:37:15.417087 systemd-tmpfiles[1283]: ACLs are not supported, ignoring. Sep 16 04:37:15.419754 systemd-tmpfiles[1283]: Detected autofs mount point /boot during canonicalization of boot. Sep 16 04:37:15.419867 systemd-tmpfiles[1283]: Skipping /boot Sep 16 04:37:15.426412 systemd-tmpfiles[1283]: Detected autofs mount point /boot during canonicalization of boot. Sep 16 04:37:15.426501 systemd-tmpfiles[1283]: Skipping /boot Sep 16 04:37:15.457488 zram_generator::config[1310]: No configuration found. Sep 16 04:37:15.586063 systemd[1]: Reloading finished in 177 ms. Sep 16 04:37:15.604826 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 16 04:37:15.611190 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 04:37:15.616776 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 16 04:37:15.619357 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 16 04:37:15.621439 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 16 04:37:15.625471 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 16 04:37:15.627469 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 04:37:15.630219 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 16 04:37:15.637214 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 04:37:15.644980 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 04:37:15.646949 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 04:37:15.650404 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 04:37:15.651414 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 04:37:15.651535 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 04:37:15.652419 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 04:37:15.654787 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 04:37:15.657533 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 16 04:37:15.659757 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 16 04:37:15.661608 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 04:37:15.661741 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 04:37:15.663274 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 04:37:15.663421 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 04:37:15.668155 augenrules[1374]: No rules Sep 16 04:37:15.669076 systemd[1]: audit-rules.service: Deactivated successfully. Sep 16 04:37:15.669287 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 16 04:37:15.673716 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. 
Sep 16 04:37:15.674858 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 04:37:15.676597 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 04:37:15.677761 systemd-udevd[1354]: Using default interface naming scheme 'v255'. Sep 16 04:37:15.684489 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 04:37:15.685501 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 04:37:15.685687 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 04:37:15.686789 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 16 04:37:15.689496 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 16 04:37:15.690459 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 16 04:37:15.691796 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 16 04:37:15.693480 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 04:37:15.693636 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 04:37:15.695083 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 04:37:15.695217 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 04:37:15.696547 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 04:37:15.698175 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 04:37:15.698799 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 04:37:15.700208 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 16 04:37:15.713408 systemd[1]: Finished ensure-sysext.service. Sep 16 04:37:15.722518 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 16 04:37:15.724137 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 04:37:15.725125 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 04:37:15.741516 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 16 04:37:15.746724 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 04:37:15.752073 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 04:37:15.753297 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 04:37:15.753428 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 04:37:15.762704 augenrules[1420]: /sbin/augenrules: No change Sep 16 04:37:15.771346 augenrules[1449]: No rules Sep 16 04:37:15.778828 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 16 04:37:15.782538 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... 
Sep 16 04:37:15.783441 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 16 04:37:15.784003 systemd[1]: audit-rules.service: Deactivated successfully. Sep 16 04:37:15.784401 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 16 04:37:15.785822 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 04:37:15.786002 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 04:37:15.787503 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 16 04:37:15.787754 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 16 04:37:15.789235 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 04:37:15.789519 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 04:37:15.791045 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 04:37:15.791258 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 04:37:15.800948 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 16 04:37:15.802212 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Sep 16 04:37:15.808216 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 16 04:37:15.808578 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 16 04:37:15.827510 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 16 04:37:15.830298 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 16 04:37:15.853931 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 16 04:37:15.864960 systemd-networkd[1456]: lo: Link UP Sep 16 04:37:15.864967 systemd-networkd[1456]: lo: Gained carrier Sep 16 04:37:15.865750 systemd-networkd[1456]: Enumeration completed Sep 16 04:37:15.865886 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 16 04:37:15.866237 systemd-networkd[1456]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:37:15.866247 systemd-networkd[1456]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:37:15.866760 systemd-networkd[1456]: eth0: Link UP Sep 16 04:37:15.866900 systemd-networkd[1456]: eth0: Gained carrier Sep 16 04:37:15.866919 systemd-networkd[1456]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:37:15.869225 systemd-resolved[1349]: Positive Trust Anchors: Sep 16 04:37:15.869242 systemd-resolved[1349]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 16 04:37:15.869274 systemd-resolved[1349]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 16 04:37:15.869804 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 16 04:37:15.872555 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 16 04:37:15.875086 systemd-resolved[1349]: Defaulting to hostname 'linux'. Sep 16 04:37:15.876629 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 16 04:37:15.877892 systemd[1]: Reached target network.target - Network. Sep 16 04:37:15.878904 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 16 04:37:15.881418 systemd-networkd[1456]: eth0: DHCPv4 address 10.0.0.111/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 16 04:37:15.883297 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 16 04:37:15.884815 systemd[1]: Reached target sysinit.target - System Initialization. Sep 16 04:37:15.884938 systemd-timesyncd[1457]: Contacted time server 10.0.0.1:123 (10.0.0.1). Sep 16 04:37:15.884977 systemd-timesyncd[1457]: Initial clock synchronization to Tue 2025-09-16 04:37:15.703314 UTC. Sep 16 04:37:15.886280 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 16 04:37:15.889499 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 16 04:37:15.890424 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 16 04:37:15.891362 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 16 04:37:15.891393 systemd[1]: Reached target paths.target - Path Units. Sep 16 04:37:15.892122 systemd[1]: Reached target time-set.target - System Time Set. Sep 16 04:37:15.893079 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 16 04:37:15.894012 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 16 04:37:15.895052 systemd[1]: Reached target timers.target - Timer Units. Sep 16 04:37:15.896455 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 16 04:37:15.899583 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 16 04:37:15.902110 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 16 04:37:15.904246 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 16 04:37:15.905359 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 16 04:37:15.911982 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 16 04:37:15.913105 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. 
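Here eth0 matched the stock catch-all /usr/lib/systemd/network/zz-default.network and picked up 10.0.0.111/16 over DHCP. If a machine needed a static address instead, the usual route is a .network file under /etc/systemd/network, which takes precedence over the shipped defaults; a hypothetical example (names and addresses are illustrative only):

```python
from pathlib import Path
import textwrap

# Hypothetical static-address configuration for eth0; files in
# /etc/systemd/network override /usr/lib/systemd/network/zz-default.network.
unit = textwrap.dedent("""\
    [Match]
    Name=eth0

    [Network]
    Address=10.0.0.50/16
    Gateway=10.0.0.1
    DNS=10.0.0.1
    """)

target = Path("/etc/systemd/network/10-static-eth0.network")
target.parent.mkdir(parents=True, exist_ok=True)
target.write_text(unit)
print(f"wrote {target}; apply with: networkctl reload")
```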
Sep 16 04:37:15.916033 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 16 04:37:15.917706 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 16 04:37:15.919987 systemd[1]: Reached target sockets.target - Socket Units. Sep 16 04:37:15.921761 systemd[1]: Reached target basic.target - Basic System. Sep 16 04:37:15.924493 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 16 04:37:15.924523 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 16 04:37:15.927459 systemd[1]: Starting containerd.service - containerd container runtime... Sep 16 04:37:15.929263 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 16 04:37:15.932530 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 16 04:37:15.934249 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 16 04:37:15.940986 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 16 04:37:15.942785 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 16 04:37:15.943852 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 16 04:37:15.946568 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 16 04:37:15.950283 jq[1497]: false Sep 16 04:37:15.951926 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 16 04:37:15.955044 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 16 04:37:15.957414 extend-filesystems[1498]: Found /dev/vda6 Sep 16 04:37:15.958507 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 16 04:37:15.960084 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 16 04:37:15.960475 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 16 04:37:15.963189 extend-filesystems[1498]: Found /dev/vda9 Sep 16 04:37:15.962553 systemd[1]: Starting update-engine.service - Update Engine... Sep 16 04:37:15.965991 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 16 04:37:15.974406 extend-filesystems[1498]: Checking size of /dev/vda9 Sep 16 04:37:15.969893 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 16 04:37:15.977839 jq[1516]: true Sep 16 04:37:15.971657 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 16 04:37:15.973369 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 16 04:37:15.973622 systemd[1]: motdgen.service: Deactivated successfully. Sep 16 04:37:15.973775 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 16 04:37:15.977606 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 16 04:37:15.977762 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Sep 16 04:37:15.978747 extend-filesystems[1498]: Resized partition /dev/vda9 Sep 16 04:37:15.981282 extend-filesystems[1526]: resize2fs 1.47.3 (8-Jul-2025) Sep 16 04:37:15.985361 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 16 04:37:15.998672 tar[1525]: linux-arm64/LICENSE Sep 16 04:37:15.999796 tar[1525]: linux-arm64/helm Sep 16 04:37:16.007519 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:37:16.014165 update_engine[1511]: I20250916 04:37:16.013406 1511 main.cc:92] Flatcar Update Engine starting Sep 16 04:37:16.014451 jq[1527]: true Sep 16 04:37:16.016343 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 16 04:37:16.016643 (ntainerd)[1528]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 16 04:37:16.036051 extend-filesystems[1526]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 16 04:37:16.036051 extend-filesystems[1526]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 16 04:37:16.036051 extend-filesystems[1526]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 16 04:37:16.033671 systemd-logind[1510]: Watching system buttons on /dev/input/event0 (Power Button) Sep 16 04:37:16.036063 dbus-daemon[1495]: [system] SELinux support is enabled Sep 16 04:37:16.047321 update_engine[1511]: I20250916 04:37:16.046622 1511 update_check_scheduler.cc:74] Next update check in 2m51s Sep 16 04:37:16.048068 extend-filesystems[1498]: Resized filesystem in /dev/vda9 Sep 16 04:37:16.033846 systemd-logind[1510]: New seat seat0. Sep 16 04:37:16.035221 systemd[1]: Started systemd-logind.service - User Login Management. Sep 16 04:37:16.037496 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 16 04:37:16.040094 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 16 04:37:16.040314 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 16 04:37:16.069942 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:37:16.076097 bash[1560]: Updated "/home/core/.ssh/authorized_keys" Sep 16 04:37:16.079228 dbus-daemon[1495]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 16 04:37:16.082362 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 16 04:37:16.085672 systemd[1]: Started update-engine.service - Update Engine. Sep 16 04:37:16.088394 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 16 04:37:16.088565 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 16 04:37:16.088684 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 16 04:37:16.091540 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 16 04:37:16.091646 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 16 04:37:16.097411 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
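extend-filesystems grew the root filesystem on /dev/vda9 in place, from 553472 to 1864699 4k blocks, while it was mounted ("on-line resizing required"). The core of that step amounts to an on-line resize2fs of the mounted device, given that the partition already spans the larger size; a hedged sketch of the equivalent manual invocation:

```python
import subprocess

# Equivalent manual step (requires root); ext4 supports growing while mounted.
# The underlying partition must already have been enlarged, as it is here.
DEVICE = "/dev/vda9"

subprocess.run(["resize2fs", DEVICE], check=True)

# Confirm the new block count afterwards.
subprocess.run(["dumpe2fs", "-h", DEVICE], check=True)
```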
Sep 16 04:37:16.149234 locksmithd[1564]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 16 04:37:16.205203 containerd[1528]: time="2025-09-16T04:37:16Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 16 04:37:16.206403 containerd[1528]: time="2025-09-16T04:37:16.205829375Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 16 04:37:16.220794 containerd[1528]: time="2025-09-16T04:37:16.220751658Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.459µs" Sep 16 04:37:16.220794 containerd[1528]: time="2025-09-16T04:37:16.220781134Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 16 04:37:16.220794 containerd[1528]: time="2025-09-16T04:37:16.220796615Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 16 04:37:16.221006 containerd[1528]: time="2025-09-16T04:37:16.220981603Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 16 04:37:16.221032 containerd[1528]: time="2025-09-16T04:37:16.221007443Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 16 04:37:16.221146 containerd[1528]: time="2025-09-16T04:37:16.221132423Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 16 04:37:16.221255 containerd[1528]: time="2025-09-16T04:37:16.221189968Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 16 04:37:16.221255 containerd[1528]: time="2025-09-16T04:37:16.221252204Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 16 04:37:16.221593 containerd[1528]: time="2025-09-16T04:37:16.221563032Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 16 04:37:16.221593 containerd[1528]: time="2025-09-16T04:37:16.221590123Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 16 04:37:16.221635 containerd[1528]: time="2025-09-16T04:37:16.221602477Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 16 04:37:16.221635 containerd[1528]: time="2025-09-16T04:37:16.221609944Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 16 04:37:16.223336 containerd[1528]: time="2025-09-16T04:37:16.221692625Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 16 04:37:16.223336 containerd[1528]: time="2025-09-16T04:37:16.221966667Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 16 04:37:16.223336 containerd[1528]: time="2025-09-16T04:37:16.221995126Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 16 04:37:16.223336 containerd[1528]: time="2025-09-16T04:37:16.222004548Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 16 04:37:16.223336 containerd[1528]: time="2025-09-16T04:37:16.222037620Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 16 04:37:16.223336 containerd[1528]: time="2025-09-16T04:37:16.223153255Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 16 04:37:16.223336 containerd[1528]: time="2025-09-16T04:37:16.223244458Z" level=info msg="metadata content store policy set" policy=shared Sep 16 04:37:16.226508 containerd[1528]: time="2025-09-16T04:37:16.226475138Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 16 04:37:16.226570 containerd[1528]: time="2025-09-16T04:37:16.226520486Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 16 04:37:16.226570 containerd[1528]: time="2025-09-16T04:37:16.226535185Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 16 04:37:16.226570 containerd[1528]: time="2025-09-16T04:37:16.226546913Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 16 04:37:16.226570 containerd[1528]: time="2025-09-16T04:37:16.226565286Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 16 04:37:16.226638 containerd[1528]: time="2025-09-16T04:37:16.226577249Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 16 04:37:16.226638 containerd[1528]: time="2025-09-16T04:37:16.226603637Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 16 04:37:16.226638 containerd[1528]: time="2025-09-16T04:37:16.226616615Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 16 04:37:16.226638 containerd[1528]: time="2025-09-16T04:37:16.226627600Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 16 04:37:16.226638 containerd[1528]: time="2025-09-16T04:37:16.226637335Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 16 04:37:16.226718 containerd[1528]: time="2025-09-16T04:37:16.226645818Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 16 04:37:16.226718 containerd[1528]: time="2025-09-16T04:37:16.226656451Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 16 04:37:16.226786 containerd[1528]: time="2025-09-16T04:37:16.226764113Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 16 04:37:16.226810 containerd[1528]: time="2025-09-16T04:37:16.226791674Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 16 04:37:16.226810 containerd[1528]: time="2025-09-16T04:37:16.226806529Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 16 
04:37:16.226846 containerd[1528]: time="2025-09-16T04:37:16.226817358Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 16 04:37:16.226846 containerd[1528]: time="2025-09-16T04:37:16.226827131Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 16 04:37:16.226877 containerd[1528]: time="2025-09-16T04:37:16.226849414Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 16 04:37:16.226877 containerd[1528]: time="2025-09-16T04:37:16.226860164Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 16 04:37:16.226877 containerd[1528]: time="2025-09-16T04:37:16.226870368Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 16 04:37:16.226935 containerd[1528]: time="2025-09-16T04:37:16.226889093Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 16 04:37:16.226935 containerd[1528]: time="2025-09-16T04:37:16.226899336Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 16 04:37:16.226935 containerd[1528]: time="2025-09-16T04:37:16.226908913Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 16 04:37:16.227100 containerd[1528]: time="2025-09-16T04:37:16.227083346Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 16 04:37:16.227132 containerd[1528]: time="2025-09-16T04:37:16.227101954Z" level=info msg="Start snapshots syncer" Sep 16 04:37:16.227132 containerd[1528]: time="2025-09-16T04:37:16.227125527Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 16 04:37:16.227388 containerd[1528]: time="2025-09-16T04:37:16.227351719Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 16 04:37:16.227482 containerd[1528]: time="2025-09-16T04:37:16.227404690Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 16 04:37:16.227482 containerd[1528]: time="2025-09-16T04:37:16.227473728Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 16 04:37:16.227648 containerd[1528]: time="2025-09-16T04:37:16.227625448Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 16 04:37:16.227675 containerd[1528]: time="2025-09-16T04:37:16.227659732Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 16 04:37:16.227675 containerd[1528]: time="2025-09-16T04:37:16.227671108Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 16 04:37:16.227707 containerd[1528]: time="2025-09-16T04:37:16.227681624Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 16 04:37:16.227707 containerd[1528]: time="2025-09-16T04:37:16.227692297Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 16 04:37:16.227707 containerd[1528]: time="2025-09-16T04:37:16.227701366Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 16 04:37:16.227754 containerd[1528]: time="2025-09-16T04:37:16.227711296Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 16 04:37:16.227754 containerd[1528]: time="2025-09-16T04:37:16.227733305Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 16 04:37:16.227754 containerd[1528]: 
time="2025-09-16T04:37:16.227751796Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 16 04:37:16.227806 containerd[1528]: time="2025-09-16T04:37:16.227763250Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 16 04:37:16.227806 containerd[1528]: time="2025-09-16T04:37:16.227796792Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 16 04:37:16.227838 containerd[1528]: time="2025-09-16T04:37:16.227808872Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 16 04:37:16.227838 containerd[1528]: time="2025-09-16T04:37:16.227816886Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 16 04:37:16.227838 containerd[1528]: time="2025-09-16T04:37:16.227833227Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 16 04:37:16.227891 containerd[1528]: time="2025-09-16T04:37:16.227841202Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 16 04:37:16.227891 containerd[1528]: time="2025-09-16T04:37:16.227849372Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 16 04:37:16.227891 containerd[1528]: time="2025-09-16T04:37:16.227858676Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 16 04:37:16.227940 containerd[1528]: time="2025-09-16T04:37:16.227929239Z" level=info msg="runtime interface created" Sep 16 04:37:16.227940 containerd[1528]: time="2025-09-16T04:37:16.227934282Z" level=info msg="created NRI interface" Sep 16 04:37:16.227972 containerd[1528]: time="2025-09-16T04:37:16.227946244Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 16 04:37:16.227972 containerd[1528]: time="2025-09-16T04:37:16.227956604Z" level=info msg="Connect containerd service" Sep 16 04:37:16.228006 containerd[1528]: time="2025-09-16T04:37:16.227990810Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 16 04:37:16.228658 containerd[1528]: time="2025-09-16T04:37:16.228631583Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 16 04:37:16.289655 containerd[1528]: time="2025-09-16T04:37:16.289577162Z" level=info msg="Start subscribing containerd event" Sep 16 04:37:16.289655 containerd[1528]: time="2025-09-16T04:37:16.289662893Z" level=info msg="Start recovering state" Sep 16 04:37:16.289758 containerd[1528]: time="2025-09-16T04:37:16.289739397Z" level=info msg="Start event monitor" Sep 16 04:37:16.289758 containerd[1528]: time="2025-09-16T04:37:16.289753119Z" level=info msg="Start cni network conf syncer for default" Sep 16 04:37:16.289809 containerd[1528]: time="2025-09-16T04:37:16.289760469Z" level=info msg="Start streaming server" Sep 16 04:37:16.289848 containerd[1528]: time="2025-09-16T04:37:16.289768561Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 16 04:37:16.289872 containerd[1528]: 
time="2025-09-16T04:37:16.289837052Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 16 04:37:16.289902 containerd[1528]: time="2025-09-16T04:37:16.289844909Z" level=info msg="runtime interface starting up..." Sep 16 04:37:16.289930 containerd[1528]: time="2025-09-16T04:37:16.289900890Z" level=info msg="starting plugins..." Sep 16 04:37:16.289930 containerd[1528]: time="2025-09-16T04:37:16.289885644Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 16 04:37:16.289962 containerd[1528]: time="2025-09-16T04:37:16.289915746Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 16 04:37:16.291264 containerd[1528]: time="2025-09-16T04:37:16.290070045Z" level=info msg="containerd successfully booted in 0.085186s" Sep 16 04:37:16.290163 systemd[1]: Started containerd.service - containerd container runtime. Sep 16 04:37:16.320043 tar[1525]: linux-arm64/README.md Sep 16 04:37:16.334370 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 16 04:37:16.365232 sshd_keygen[1522]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 16 04:37:16.383185 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 16 04:37:16.386203 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 16 04:37:16.407247 systemd[1]: issuegen.service: Deactivated successfully. Sep 16 04:37:16.409345 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 16 04:37:16.411399 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 16 04:37:16.431164 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 16 04:37:16.433355 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 16 04:37:16.435020 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Sep 16 04:37:16.436104 systemd[1]: Reached target getty.target - Login Prompts. Sep 16 04:37:17.475484 systemd-networkd[1456]: eth0: Gained IPv6LL Sep 16 04:37:17.479405 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 16 04:37:17.480838 systemd[1]: Reached target network-online.target - Network is Online. Sep 16 04:37:17.482977 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 16 04:37:17.484909 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:37:17.486750 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 16 04:37:17.507387 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 16 04:37:17.508484 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 16 04:37:17.509851 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 16 04:37:17.511498 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 16 04:37:18.011229 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:37:18.012567 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 16 04:37:18.014753 (kubelet)[1633]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:37:18.018450 systemd[1]: Startup finished in 2.024s (kernel) + 4.778s (initrd) + 3.691s (userspace) = 10.494s. 
Sep 16 04:37:18.339432 kubelet[1633]: E0916 04:37:18.339306 1633 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:37:18.341482 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:37:18.341613 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:37:18.342447 systemd[1]: kubelet.service: Consumed 737ms CPU time, 257M memory peak. Sep 16 04:37:22.630606 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 16 04:37:22.631834 systemd[1]: Started sshd@0-10.0.0.111:22-10.0.0.1:33302.service - OpenSSH per-connection server daemon (10.0.0.1:33302). Sep 16 04:37:22.686536 sshd[1646]: Accepted publickey for core from 10.0.0.1 port 33302 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:37:22.688141 sshd-session[1646]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:22.693878 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 16 04:37:22.695028 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 16 04:37:22.700655 systemd-logind[1510]: New session 1 of user core. Sep 16 04:37:22.722463 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 16 04:37:22.724911 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 16 04:37:22.738183 (systemd)[1651]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 16 04:37:22.740578 systemd-logind[1510]: New session c1 of user core. Sep 16 04:37:22.846726 systemd[1651]: Queued start job for default target default.target. Sep 16 04:37:22.856295 systemd[1651]: Created slice app.slice - User Application Slice. Sep 16 04:37:22.856344 systemd[1651]: Reached target paths.target - Paths. Sep 16 04:37:22.856383 systemd[1651]: Reached target timers.target - Timers. Sep 16 04:37:22.857936 systemd[1651]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 16 04:37:22.874221 systemd[1651]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 16 04:37:22.874350 systemd[1651]: Reached target sockets.target - Sockets. Sep 16 04:37:22.874391 systemd[1651]: Reached target basic.target - Basic System. Sep 16 04:37:22.874417 systemd[1651]: Reached target default.target - Main User Target. Sep 16 04:37:22.874444 systemd[1651]: Startup finished in 128ms. Sep 16 04:37:22.874487 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 16 04:37:22.877263 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 16 04:37:22.952440 systemd[1]: Started sshd@1-10.0.0.111:22-10.0.0.1:33436.service - OpenSSH per-connection server daemon (10.0.0.1:33436). Sep 16 04:37:23.013162 sshd[1662]: Accepted publickey for core from 10.0.0.1 port 33436 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:37:23.015242 sshd-session[1662]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:23.020591 systemd-logind[1510]: New session 2 of user core. Sep 16 04:37:23.031794 systemd[1]: Started session-2.scope - Session 2 of User core. 
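Note: this kubelet exit is expected at this stage. /var/lib/kubelet/config.yaml is normally written by `kubeadm init` or `kubeadm join`, so until one of those runs the unit keeps failing and systemd keeps restarting it (see the "Scheduled restart job" entries later). A minimal sketch for confirming the cause, assuming the standard kubeadm file layout:

    # The file the kubelet says it cannot read:
    ls -l /var/lib/kubelet/config.yaml    # absent until kubeadm init/join runs

    # Full failure context for the unit:
    systemctl status kubelet.service
    journalctl -u kubelet.service -b --no-pager | tail -n 20

    # kubeadm writes the KubeletConfiguration here during init/join, e.g.:
    #   kubeadm init --pod-network-cidr=10.244.0.0/16    # illustrative flags only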
Sep 16 04:37:23.084403 sshd[1665]: Connection closed by 10.0.0.1 port 33436 Sep 16 04:37:23.084731 sshd-session[1662]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:23.096271 systemd[1]: sshd@1-10.0.0.111:22-10.0.0.1:33436.service: Deactivated successfully. Sep 16 04:37:23.098732 systemd[1]: session-2.scope: Deactivated successfully. Sep 16 04:37:23.102027 systemd-logind[1510]: Session 2 logged out. Waiting for processes to exit. Sep 16 04:37:23.105973 systemd[1]: Started sshd@2-10.0.0.111:22-10.0.0.1:33448.service - OpenSSH per-connection server daemon (10.0.0.1:33448). Sep 16 04:37:23.106774 systemd-logind[1510]: Removed session 2. Sep 16 04:37:23.160829 sshd[1671]: Accepted publickey for core from 10.0.0.1 port 33448 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:37:23.161916 sshd-session[1671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:23.166414 systemd-logind[1510]: New session 3 of user core. Sep 16 04:37:23.173482 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 16 04:37:23.222719 sshd[1674]: Connection closed by 10.0.0.1 port 33448 Sep 16 04:37:23.223160 sshd-session[1671]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:23.232261 systemd[1]: sshd@2-10.0.0.111:22-10.0.0.1:33448.service: Deactivated successfully. Sep 16 04:37:23.235522 systemd[1]: session-3.scope: Deactivated successfully. Sep 16 04:37:23.236131 systemd-logind[1510]: Session 3 logged out. Waiting for processes to exit. Sep 16 04:37:23.238212 systemd[1]: Started sshd@3-10.0.0.111:22-10.0.0.1:33456.service - OpenSSH per-connection server daemon (10.0.0.1:33456). Sep 16 04:37:23.238669 systemd-logind[1510]: Removed session 3. Sep 16 04:37:23.290862 sshd[1680]: Accepted publickey for core from 10.0.0.1 port 33456 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:37:23.291962 sshd-session[1680]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:23.295787 systemd-logind[1510]: New session 4 of user core. Sep 16 04:37:23.302468 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 16 04:37:23.352539 sshd[1683]: Connection closed by 10.0.0.1 port 33456 Sep 16 04:37:23.352776 sshd-session[1680]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:23.365074 systemd[1]: sshd@3-10.0.0.111:22-10.0.0.1:33456.service: Deactivated successfully. Sep 16 04:37:23.367459 systemd[1]: session-4.scope: Deactivated successfully. Sep 16 04:37:23.368397 systemd-logind[1510]: Session 4 logged out. Waiting for processes to exit. Sep 16 04:37:23.370749 systemd[1]: Started sshd@4-10.0.0.111:22-10.0.0.1:33472.service - OpenSSH per-connection server daemon (10.0.0.1:33472). Sep 16 04:37:23.371412 systemd-logind[1510]: Removed session 4. Sep 16 04:37:23.410889 sshd[1689]: Accepted publickey for core from 10.0.0.1 port 33472 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:37:23.411972 sshd-session[1689]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:23.416308 systemd-logind[1510]: New session 5 of user core. Sep 16 04:37:23.426481 systemd[1]: Started session-5.scope - Session 5 of User core. 
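Note: each incoming connection gets its own per-connection `sshd@N-...` unit under system-sshd.slice plus a logind session scope, which is why every login above produces a matching service/session pair that is torn down when the client disconnects. A small sketch for enumerating them, assuming default systemd-logind behaviour; the session ID is just an example:

    # Per-connection sshd units currently active:
    systemctl list-units 'sshd@*.service' --no-legend

    # Matching logind sessions for user "core":
    loginctl list-sessions
    loginctl show-session 5 -p State -p TTY    # "5" is an illustrative session ID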
Sep 16 04:37:23.482299 sudo[1693]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 16 04:37:23.482864 sudo[1693]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 04:37:23.496153 sudo[1693]: pam_unix(sudo:session): session closed for user root Sep 16 04:37:23.498066 sshd[1692]: Connection closed by 10.0.0.1 port 33472 Sep 16 04:37:23.497887 sshd-session[1689]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:23.514119 systemd[1]: sshd@4-10.0.0.111:22-10.0.0.1:33472.service: Deactivated successfully. Sep 16 04:37:23.516418 systemd[1]: session-5.scope: Deactivated successfully. Sep 16 04:37:23.517226 systemd-logind[1510]: Session 5 logged out. Waiting for processes to exit. Sep 16 04:37:23.519643 systemd[1]: Started sshd@5-10.0.0.111:22-10.0.0.1:33476.service - OpenSSH per-connection server daemon (10.0.0.1:33476). Sep 16 04:37:23.520517 systemd-logind[1510]: Removed session 5. Sep 16 04:37:23.573501 sshd[1699]: Accepted publickey for core from 10.0.0.1 port 33476 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:37:23.574636 sshd-session[1699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:23.578386 systemd-logind[1510]: New session 6 of user core. Sep 16 04:37:23.587491 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 16 04:37:23.637510 sudo[1704]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 16 04:37:23.638075 sudo[1704]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 04:37:23.720657 sudo[1704]: pam_unix(sudo:session): session closed for user root Sep 16 04:37:23.725799 sudo[1703]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 16 04:37:23.726087 sudo[1703]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 04:37:23.735875 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 16 04:37:23.768018 augenrules[1726]: No rules Sep 16 04:37:23.769458 systemd[1]: audit-rules.service: Deactivated successfully. Sep 16 04:37:23.769705 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 16 04:37:23.771558 sudo[1703]: pam_unix(sudo:session): session closed for user root Sep 16 04:37:23.773240 sshd[1702]: Connection closed by 10.0.0.1 port 33476 Sep 16 04:37:23.773098 sshd-session[1699]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:23.784421 systemd[1]: sshd@5-10.0.0.111:22-10.0.0.1:33476.service: Deactivated successfully. Sep 16 04:37:23.786794 systemd[1]: session-6.scope: Deactivated successfully. Sep 16 04:37:23.787585 systemd-logind[1510]: Session 6 logged out. Waiting for processes to exit. Sep 16 04:37:23.790024 systemd[1]: Started sshd@6-10.0.0.111:22-10.0.0.1:33484.service - OpenSSH per-connection server daemon (10.0.0.1:33484). Sep 16 04:37:23.790611 systemd-logind[1510]: Removed session 6. Sep 16 04:37:23.851887 sshd[1735]: Accepted publickey for core from 10.0.0.1 port 33484 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:37:23.853113 sshd-session[1735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:23.857135 systemd-logind[1510]: New session 7 of user core. Sep 16 04:37:23.872533 systemd[1]: Started session-7.scope - Session 7 of User core. 
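Note: the sudo calls above delete the default audit rule fragments and restart audit-rules.service, after which augenrules reports "No rules", i.e. an empty ruleset was loaded. A hedged sketch of how to inspect and rebuild the ruleset with the standard auditd tooling:

    # Rules currently loaded in the kernel (empty after the restart above):
    auditctl -l

    # Fragments that augenrules would merge from /etc/audit/rules.d/:
    ls /etc/audit/rules.d/
    augenrules --check    # report whether the merged rules are up to date
    augenrules --load     # regenerate the merged audit.rules and load it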
Sep 16 04:37:23.921904 sudo[1739]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 16 04:37:23.922178 sudo[1739]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 04:37:24.194018 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 16 04:37:24.223821 (dockerd)[1759]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 16 04:37:24.421507 dockerd[1759]: time="2025-09-16T04:37:24.421438949Z" level=info msg="Starting up" Sep 16 04:37:24.422651 dockerd[1759]: time="2025-09-16T04:37:24.422315585Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 16 04:37:24.435345 dockerd[1759]: time="2025-09-16T04:37:24.435272394Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 16 04:37:24.471762 dockerd[1759]: time="2025-09-16T04:37:24.471648690Z" level=info msg="Loading containers: start." Sep 16 04:37:24.480353 kernel: Initializing XFRM netlink socket Sep 16 04:37:24.706820 systemd-networkd[1456]: docker0: Link UP Sep 16 04:37:24.710250 dockerd[1759]: time="2025-09-16T04:37:24.710134728Z" level=info msg="Loading containers: done." Sep 16 04:37:24.723634 dockerd[1759]: time="2025-09-16T04:37:24.723525966Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 16 04:37:24.723634 dockerd[1759]: time="2025-09-16T04:37:24.723615701Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 16 04:37:24.723798 dockerd[1759]: time="2025-09-16T04:37:24.723708968Z" level=info msg="Initializing buildkit" Sep 16 04:37:24.746731 dockerd[1759]: time="2025-09-16T04:37:24.746672872Z" level=info msg="Completed buildkit initialization" Sep 16 04:37:24.751449 dockerd[1759]: time="2025-09-16T04:37:24.751413166Z" level=info msg="Daemon has completed initialization" Sep 16 04:37:24.751594 dockerd[1759]: time="2025-09-16T04:37:24.751470635Z" level=info msg="API listen on /run/docker.sock" Sep 16 04:37:24.751711 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 16 04:37:25.420308 containerd[1528]: time="2025-09-16T04:37:25.420264591Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\"" Sep 16 04:37:25.901630 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount842756686.mount: Deactivated successfully. 
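Note: dockerd comes up on /run/docker.sock with the overlay2 storage driver; the "Not using native diff" warning only means the kernel's CONFIG_OVERLAY_FS_REDIRECT_DIR setting forces the slower naive diff path for image builds, not that anything is broken. A quick sketch for confirming the daemon state reported in the log:

    # Storage driver and container count as dockerd sees them:
    docker info --format 'driver={{.Driver}} containers={{.Containers}}'
    docker info 2>&1 | grep -i warning

    # The bridge the log shows systemd-networkd bringing up:
    ip -br addr show docker0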
Sep 16 04:37:27.188207 containerd[1528]: time="2025-09-16T04:37:27.188147810Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:27.188679 containerd[1528]: time="2025-09-16T04:37:27.188648144Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=26363687" Sep 16 04:37:27.189677 containerd[1528]: time="2025-09-16T04:37:27.189651878Z" level=info msg="ImageCreate event name:\"sha256:02ea53851f07db91ed471dab1ab11541f5c294802371cd8f0cfd423cd5c71002\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:27.192889 containerd[1528]: time="2025-09-16T04:37:27.192851886Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:27.196757 containerd[1528]: time="2025-09-16T04:37:27.194938139Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:02ea53851f07db91ed471dab1ab11541f5c294802371cd8f0cfd423cd5c71002\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"26360284\" in 1.774407007s" Sep 16 04:37:27.196801 containerd[1528]: time="2025-09-16T04:37:27.196769449Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:02ea53851f07db91ed471dab1ab11541f5c294802371cd8f0cfd423cd5c71002\"" Sep 16 04:37:27.197924 containerd[1528]: time="2025-09-16T04:37:27.197714013Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\"" Sep 16 04:37:28.270880 containerd[1528]: time="2025-09-16T04:37:28.270836058Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:28.271839 containerd[1528]: time="2025-09-16T04:37:28.271794978Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=22531202" Sep 16 04:37:28.272836 containerd[1528]: time="2025-09-16T04:37:28.272374718Z" level=info msg="ImageCreate event name:\"sha256:f0bcbad5082c944520b370596a2384affda710b9d7daf84e8a48352699af8e4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:28.274913 containerd[1528]: time="2025-09-16T04:37:28.274880979Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:28.275915 containerd[1528]: time="2025-09-16T04:37:28.275882503Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:f0bcbad5082c944520b370596a2384affda710b9d7daf84e8a48352699af8e4b\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"24099975\" in 1.078134544s" Sep 16 04:37:28.275915 containerd[1528]: time="2025-09-16T04:37:28.275913441Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:f0bcbad5082c944520b370596a2384affda710b9d7daf84e8a48352699af8e4b\"" Sep 16 04:37:28.276440 
containerd[1528]: time="2025-09-16T04:37:28.276417369Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\"" Sep 16 04:37:28.592061 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 16 04:37:28.593726 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:37:28.717371 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:37:28.720609 (kubelet)[2044]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:37:28.754567 kubelet[2044]: E0916 04:37:28.754508 2044 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:37:28.757301 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:37:28.757453 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:37:28.758458 systemd[1]: kubelet.service: Consumed 136ms CPU time, 107.8M memory peak. Sep 16 04:37:29.503086 containerd[1528]: time="2025-09-16T04:37:29.503030020Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:29.503954 containerd[1528]: time="2025-09-16T04:37:29.503562563Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=17484326" Sep 16 04:37:29.504900 containerd[1528]: time="2025-09-16T04:37:29.504860834Z" level=info msg="ImageCreate event name:\"sha256:1d625baf81b59592006d97a6741bc947698ed222b612ac10efa57b7aa96d2a27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:29.507351 containerd[1528]: time="2025-09-16T04:37:29.507304827Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:29.509037 containerd[1528]: time="2025-09-16T04:37:29.509013730Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:1d625baf81b59592006d97a6741bc947698ed222b612ac10efa57b7aa96d2a27\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"19053117\" in 1.232273915s" Sep 16 04:37:29.509096 containerd[1528]: time="2025-09-16T04:37:29.509043929Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference \"sha256:1d625baf81b59592006d97a6741bc947698ed222b612ac10efa57b7aa96d2a27\"" Sep 16 04:37:29.509481 containerd[1528]: time="2025-09-16T04:37:29.509420458Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\"" Sep 16 04:37:30.429687 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2731186855.mount: Deactivated successfully. 
Sep 16 04:37:30.650885 containerd[1528]: time="2025-09-16T04:37:30.650825835Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:30.651337 containerd[1528]: time="2025-09-16T04:37:30.651297181Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=27417819" Sep 16 04:37:30.652066 containerd[1528]: time="2025-09-16T04:37:30.652040373Z" level=info msg="ImageCreate event name:\"sha256:72b57ec14d31e8422925ef4c3eff44822cdc04a11fd30d13824f1897d83a16d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:30.653600 containerd[1528]: time="2025-09-16T04:37:30.653577539Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:30.654373 containerd[1528]: time="2025-09-16T04:37:30.654064191Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:72b57ec14d31e8422925ef4c3eff44822cdc04a11fd30d13824f1897d83a16d4\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"27416836\" in 1.144484608s" Sep 16 04:37:30.654373 containerd[1528]: time="2025-09-16T04:37:30.654097913Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:72b57ec14d31e8422925ef4c3eff44822cdc04a11fd30d13824f1897d83a16d4\"" Sep 16 04:37:30.654602 containerd[1528]: time="2025-09-16T04:37:30.654572926Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 16 04:37:31.124638 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1015554106.mount: Deactivated successfully. 
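Note: the images being pulled (kube-apiserver, kube-controller-manager, kube-scheduler, kube-proxy for v1.32.9, plus coredns, pause and etcd below) match the control-plane set kubeadm preloads, so this pull sequence is most likely `kubeadm config images pull` or the early phase of `kubeadm init` driving the containerd CRI. A hedged sketch; the version pin is taken from the log:

    # The list kubeadm would prefetch for this release:
    kubeadm config images list --kubernetes-version v1.32.9

    # Pull them explicitly through the CRI socket when doing it by hand:
    kubeadm config images pull --kubernetes-version v1.32.9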
Sep 16 04:37:31.947456 containerd[1528]: time="2025-09-16T04:37:31.947404012Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:31.947880 containerd[1528]: time="2025-09-16T04:37:31.947838718Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951624" Sep 16 04:37:31.949105 containerd[1528]: time="2025-09-16T04:37:31.949070976Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:31.951583 containerd[1528]: time="2025-09-16T04:37:31.951543188Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:31.952849 containerd[1528]: time="2025-09-16T04:37:31.952657767Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.297977055s" Sep 16 04:37:31.952849 containerd[1528]: time="2025-09-16T04:37:31.952708451Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Sep 16 04:37:31.953300 containerd[1528]: time="2025-09-16T04:37:31.953273557Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 16 04:37:32.368405 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2231656226.mount: Deactivated successfully. 
Sep 16 04:37:32.373012 containerd[1528]: time="2025-09-16T04:37:32.372945492Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 04:37:32.374082 containerd[1528]: time="2025-09-16T04:37:32.374041469Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705" Sep 16 04:37:32.375088 containerd[1528]: time="2025-09-16T04:37:32.375059018Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 04:37:32.377263 containerd[1528]: time="2025-09-16T04:37:32.377192730Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 04:37:32.377882 containerd[1528]: time="2025-09-16T04:37:32.377839154Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 424.510642ms" Sep 16 04:37:32.377882 containerd[1528]: time="2025-09-16T04:37:32.377877332Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Sep 16 04:37:32.378405 containerd[1528]: time="2025-09-16T04:37:32.378373360Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Sep 16 04:37:32.861617 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount685583555.mount: Deactivated successfully. 
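Note: each completed pull is visible to the CRI afterwards; the pause image additionally carries the "pinned" label shown in the log, so the image garbage collector will not evict the sandbox image. A short sketch for checking what landed in the containerd image store:

    # Images as the CRI plugin reports them (repo tags and digests from the log):
    crictl --runtime-endpoint unix:///run/containerd/containerd.sock images

    # Same view from containerd's k8s.io namespace directly:
    ctr --namespace k8s.io images ls | grep -E 'kube-|coredns|pause|etcd'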
Sep 16 04:37:34.763722 containerd[1528]: time="2025-09-16T04:37:34.763660555Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:34.764675 containerd[1528]: time="2025-09-16T04:37:34.764400594Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67943167" Sep 16 04:37:34.765507 containerd[1528]: time="2025-09-16T04:37:34.765473111Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:34.768462 containerd[1528]: time="2025-09-16T04:37:34.768416466Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:34.769806 containerd[1528]: time="2025-09-16T04:37:34.769747132Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.391213835s" Sep 16 04:37:34.770339 containerd[1528]: time="2025-09-16T04:37:34.770171261Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Sep 16 04:37:39.007888 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 16 04:37:39.009490 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:37:39.166428 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:37:39.178694 (kubelet)[2204]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:37:39.216551 kubelet[2204]: E0916 04:37:39.216483 2204 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:37:39.219124 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:37:39.219437 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:37:39.219861 systemd[1]: kubelet.service: Consumed 147ms CPU time, 107.1M memory peak. Sep 16 04:37:39.949929 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:37:39.950077 systemd[1]: kubelet.service: Consumed 147ms CPU time, 107.1M memory peak. Sep 16 04:37:39.952213 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:37:39.977402 systemd[1]: Reload requested from client PID 2219 ('systemctl') (unit session-7.scope)... Sep 16 04:37:39.977421 systemd[1]: Reloading... Sep 16 04:37:40.050371 zram_generator::config[2264]: No configuration found. Sep 16 04:37:40.229189 systemd[1]: Reloading finished in 251 ms. Sep 16 04:37:40.289910 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 16 04:37:40.290009 systemd[1]: kubelet.service: Failed with result 'signal'. 
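Note: the `systemctl` reload requested from session-7 (presumably the install script run under sudo earlier) re-reads unit files, and the kubelet is then stopped and started again; from this point the unit no longer lists KUBELET_KUBEADM_ARGS as unset, which suggests new drop-ins or environment files were installed just before the reload. A hedged sketch of the equivalent manual steps:

    # After dropping new unit files or drop-ins under /etc/systemd/system/:
    systemctl daemon-reload

    # Restart the kubelet so it picks up the new environment and flags:
    systemctl restart kubelet.service
    systemctl show kubelet.service -p Environment -p FragmentPath -p DropInPaths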
Sep 16 04:37:40.290313 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:37:40.290380 systemd[1]: kubelet.service: Consumed 105ms CPU time, 95M memory peak. Sep 16 04:37:40.292139 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:37:40.420466 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:37:40.425158 (kubelet)[2306]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 16 04:37:40.463000 kubelet[2306]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 04:37:40.463000 kubelet[2306]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 16 04:37:40.463000 kubelet[2306]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 04:37:40.463395 kubelet[2306]: I0916 04:37:40.463064 2306 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 16 04:37:41.388815 kubelet[2306]: I0916 04:37:41.388760 2306 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 16 04:37:41.388815 kubelet[2306]: I0916 04:37:41.388802 2306 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 16 04:37:41.390252 kubelet[2306]: I0916 04:37:41.389454 2306 server.go:954] "Client rotation is on, will bootstrap in background" Sep 16 04:37:41.413960 kubelet[2306]: E0916 04:37:41.413906 2306 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.111:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.111:6443: connect: connection refused" logger="UnhandledError" Sep 16 04:37:41.415190 kubelet[2306]: I0916 04:37:41.415168 2306 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 16 04:37:41.421515 kubelet[2306]: I0916 04:37:41.421481 2306 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 16 04:37:41.424643 kubelet[2306]: I0916 04:37:41.424622 2306 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 16 04:37:41.425943 kubelet[2306]: I0916 04:37:41.425884 2306 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 16 04:37:41.426179 kubelet[2306]: I0916 04:37:41.425947 2306 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 16 04:37:41.426281 kubelet[2306]: I0916 04:37:41.426255 2306 topology_manager.go:138] "Creating topology manager with none policy" Sep 16 04:37:41.426281 kubelet[2306]: I0916 04:37:41.426264 2306 container_manager_linux.go:304] "Creating device plugin manager" Sep 16 04:37:41.426529 kubelet[2306]: I0916 04:37:41.426509 2306 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:37:41.428954 kubelet[2306]: I0916 04:37:41.428911 2306 kubelet.go:446] "Attempting to sync node with API server" Sep 16 04:37:41.429051 kubelet[2306]: I0916 04:37:41.429037 2306 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 16 04:37:41.429106 kubelet[2306]: I0916 04:37:41.429074 2306 kubelet.go:352] "Adding apiserver pod source" Sep 16 04:37:41.429106 kubelet[2306]: I0916 04:37:41.429086 2306 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 16 04:37:41.433731 kubelet[2306]: I0916 04:37:41.433440 2306 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 16 04:37:41.434103 kubelet[2306]: I0916 04:37:41.434081 2306 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 16 04:37:41.434230 kubelet[2306]: W0916 04:37:41.434215 2306 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Sep 16 04:37:41.434438 kubelet[2306]: W0916 04:37:41.434385 2306 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.111:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.111:6443: connect: connection refused Sep 16 04:37:41.434476 kubelet[2306]: E0916 04:37:41.434456 2306 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.111:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.111:6443: connect: connection refused" logger="UnhandledError" Sep 16 04:37:41.434706 kubelet[2306]: W0916 04:37:41.434651 2306 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.111:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.111:6443: connect: connection refused Sep 16 04:37:41.434782 kubelet[2306]: E0916 04:37:41.434712 2306 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.111:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.111:6443: connect: connection refused" logger="UnhandledError" Sep 16 04:37:41.435160 kubelet[2306]: I0916 04:37:41.435137 2306 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 16 04:37:41.435190 kubelet[2306]: I0916 04:37:41.435176 2306 server.go:1287] "Started kubelet" Sep 16 04:37:41.439484 kubelet[2306]: I0916 04:37:41.439239 2306 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 16 04:37:41.439908 kubelet[2306]: I0916 04:37:41.439881 2306 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 16 04:37:41.440081 kubelet[2306]: I0916 04:37:41.439976 2306 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 16 04:37:41.442173 kubelet[2306]: E0916 04:37:41.441863 2306 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.111:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.111:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1865a9607ce1845a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-16 04:37:41.435155546 +0000 UTC m=+1.006501580,LastTimestamp:2025-09-16 04:37:41.435155546 +0000 UTC m=+1.006501580,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 16 04:37:41.442293 kubelet[2306]: I0916 04:37:41.442186 2306 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 16 04:37:41.443529 kubelet[2306]: I0916 04:37:41.443501 2306 server.go:479] "Adding debug handlers to kubelet server" Sep 16 04:37:41.444942 kubelet[2306]: I0916 04:37:41.444900 2306 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 16 04:37:41.445265 kubelet[2306]: I0916 04:37:41.445245 2306 
volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 16 04:37:41.446048 kubelet[2306]: I0916 04:37:41.446019 2306 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 16 04:37:41.446106 kubelet[2306]: I0916 04:37:41.446094 2306 reconciler.go:26] "Reconciler: start to sync state" Sep 16 04:37:41.446520 kubelet[2306]: E0916 04:37:41.446484 2306 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 16 04:37:41.448422 kubelet[2306]: W0916 04:37:41.448367 2306 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.111:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.111:6443: connect: connection refused Sep 16 04:37:41.448513 kubelet[2306]: E0916 04:37:41.448435 2306 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.111:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.111:6443: connect: connection refused" logger="UnhandledError" Sep 16 04:37:41.448598 kubelet[2306]: E0916 04:37:41.448237 2306 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.111:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.111:6443: connect: connection refused" interval="200ms" Sep 16 04:37:41.449165 kubelet[2306]: I0916 04:37:41.449014 2306 factory.go:221] Registration of the systemd container factory successfully Sep 16 04:37:41.449165 kubelet[2306]: I0916 04:37:41.449130 2306 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 16 04:37:41.450409 kubelet[2306]: E0916 04:37:41.450379 2306 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 16 04:37:41.450796 kubelet[2306]: I0916 04:37:41.450770 2306 factory.go:221] Registration of the containerd container factory successfully Sep 16 04:37:41.461529 kubelet[2306]: I0916 04:37:41.461372 2306 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 16 04:37:41.463041 kubelet[2306]: I0916 04:37:41.463008 2306 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 16 04:37:41.463742 kubelet[2306]: I0916 04:37:41.463383 2306 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 16 04:37:41.463742 kubelet[2306]: I0916 04:37:41.463421 2306 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
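Note: at this point the kubelet's own servers are up (the node API on 0.0.0.0:10250 and the podresources socket under /var/lib/kubelet/pod-resources/), even though every call to the apiserver still fails; the crio cAdvisor factory failing while the containerd one registers is also expected, since only containerd's socket exists on this host. A quick sketch for poking the local endpoints; the liveness port 10248 is the kubelet default rather than something shown in the log:

    # Local healthz endpoint (default --healthz-port, assumed 10248 here):
    curl -s http://127.0.0.1:10248/healthz ; echo

    # The authenticated node API the log says is serving on 10250:
    curl -sk https://127.0.0.1:10250/healthz    # expect 401/403 without credentials

    # The podresources gRPC socket from the "Starting to serve" message:
    ls -l /var/lib/kubelet/pod-resources/kubelet.sock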
Sep 16 04:37:41.463742 kubelet[2306]: I0916 04:37:41.463432 2306 kubelet.go:2382] "Starting kubelet main sync loop" Sep 16 04:37:41.463742 kubelet[2306]: E0916 04:37:41.463477 2306 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 16 04:37:41.467204 kubelet[2306]: W0916 04:37:41.467011 2306 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.111:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.111:6443: connect: connection refused Sep 16 04:37:41.467204 kubelet[2306]: E0916 04:37:41.467062 2306 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.111:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.111:6443: connect: connection refused" logger="UnhandledError" Sep 16 04:37:41.467690 kubelet[2306]: I0916 04:37:41.467670 2306 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 16 04:37:41.467690 kubelet[2306]: I0916 04:37:41.467689 2306 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 16 04:37:41.467760 kubelet[2306]: I0916 04:37:41.467712 2306 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:37:41.546676 kubelet[2306]: E0916 04:37:41.546616 2306 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 16 04:37:41.564611 kubelet[2306]: E0916 04:37:41.564566 2306 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 16 04:37:41.564757 kubelet[2306]: I0916 04:37:41.564731 2306 policy_none.go:49] "None policy: Start" Sep 16 04:37:41.564757 kubelet[2306]: I0916 04:37:41.564754 2306 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 16 04:37:41.564810 kubelet[2306]: I0916 04:37:41.564768 2306 state_mem.go:35] "Initializing new in-memory state store" Sep 16 04:37:41.571464 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 16 04:37:41.585523 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 16 04:37:41.606205 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 16 04:37:41.607959 kubelet[2306]: I0916 04:37:41.607921 2306 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 16 04:37:41.608173 kubelet[2306]: I0916 04:37:41.608155 2306 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 16 04:37:41.608211 kubelet[2306]: I0916 04:37:41.608174 2306 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 16 04:37:41.608897 kubelet[2306]: I0916 04:37:41.608508 2306 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 16 04:37:41.610267 kubelet[2306]: E0916 04:37:41.610239 2306 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 16 04:37:41.610399 kubelet[2306]: E0916 04:37:41.610287 2306 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 16 04:37:41.650182 kubelet[2306]: E0916 04:37:41.650041 2306 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.111:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.111:6443: connect: connection refused" interval="400ms" Sep 16 04:37:41.710321 kubelet[2306]: I0916 04:37:41.710278 2306 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 16 04:37:41.710863 kubelet[2306]: E0916 04:37:41.710806 2306 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.111:6443/api/v1/nodes\": dial tcp 10.0.0.111:6443: connect: connection refused" node="localhost" Sep 16 04:37:41.774312 systemd[1]: Created slice kubepods-burstable-pod5632c1dd8e9e103c20d3d7801f8315c6.slice - libcontainer container kubepods-burstable-pod5632c1dd8e9e103c20d3d7801f8315c6.slice. Sep 16 04:37:41.806083 kubelet[2306]: E0916 04:37:41.806031 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 16 04:37:41.809845 systemd[1]: Created slice kubepods-burstable-pod1403266a9792debaa127cd8df7a81c3c.slice - libcontainer container kubepods-burstable-pod1403266a9792debaa127cd8df7a81c3c.slice. Sep 16 04:37:41.812049 kubelet[2306]: E0916 04:37:41.811998 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 16 04:37:41.834361 systemd[1]: Created slice kubepods-burstable-pod72a30db4fc25e4da65a3b99eba43be94.slice - libcontainer container kubepods-burstable-pod72a30db4fc25e4da65a3b99eba43be94.slice. 
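
The "Attempting to register node" / "Unable to register node with API server" pair above is the kubelet POSTing its v1.Node object and hitting the same connection-refused endpoint; it simply retries later. A hedged equivalent with plain client-go is sketched below. It is not the kubelet's actual registration code path, which also fills in addresses, capacity, labels and taints.

    package main

    import (
        "context"
        "log"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubelet.conf") // assumed path
        if err != nil {
            log.Fatal(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)

        // Node name taken from the log; everything else a real kubelet would
        // populate (addresses, capacity, node-role labels) is omitted here.
        node := &corev1.Node{ObjectMeta: metav1.ObjectMeta{Name: "localhost"}}
        if _, err := cs.CoreV1().Nodes().Create(context.Background(), node, metav1.CreateOptions{}); err != nil {
            // With 10.0.0.111:6443 down this reproduces the kubelet_node_status.go:107
            // "connection refused" error seen above.
            log.Printf("register node: %v", err)
        }
    }
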
Sep 16 04:37:41.836712 kubelet[2306]: E0916 04:37:41.836647 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 16 04:37:41.847967 kubelet[2306]: I0916 04:37:41.847924 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:41.847967 kubelet[2306]: I0916 04:37:41.847967 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:41.848074 kubelet[2306]: I0916 04:37:41.847986 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:41.848074 kubelet[2306]: I0916 04:37:41.848010 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/72a30db4fc25e4da65a3b99eba43be94-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"72a30db4fc25e4da65a3b99eba43be94\") " pod="kube-system/kube-scheduler-localhost" Sep 16 04:37:41.848074 kubelet[2306]: I0916 04:37:41.848025 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5632c1dd8e9e103c20d3d7801f8315c6-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"5632c1dd8e9e103c20d3d7801f8315c6\") " pod="kube-system/kube-apiserver-localhost" Sep 16 04:37:41.848074 kubelet[2306]: I0916 04:37:41.848041 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:41.848074 kubelet[2306]: I0916 04:37:41.848057 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:41.848193 kubelet[2306]: I0916 04:37:41.848080 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5632c1dd8e9e103c20d3d7801f8315c6-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"5632c1dd8e9e103c20d3d7801f8315c6\") " pod="kube-system/kube-apiserver-localhost" Sep 16 04:37:41.848193 kubelet[2306]: I0916 04:37:41.848097 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5632c1dd8e9e103c20d3d7801f8315c6-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"5632c1dd8e9e103c20d3d7801f8315c6\") " pod="kube-system/kube-apiserver-localhost" Sep 16 04:37:41.912634 kubelet[2306]: I0916 04:37:41.912121 2306 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 16 04:37:41.912634 kubelet[2306]: E0916 04:37:41.912509 2306 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.111:6443/api/v1/nodes\": dial tcp 10.0.0.111:6443: connect: connection refused" node="localhost" Sep 16 04:37:42.050955 kubelet[2306]: E0916 04:37:42.050911 2306 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.111:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.111:6443: connect: connection refused" interval="800ms" Sep 16 04:37:42.107785 containerd[1528]: time="2025-09-16T04:37:42.107722820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:5632c1dd8e9e103c20d3d7801f8315c6,Namespace:kube-system,Attempt:0,}" Sep 16 04:37:42.113555 containerd[1528]: time="2025-09-16T04:37:42.113521772Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:1403266a9792debaa127cd8df7a81c3c,Namespace:kube-system,Attempt:0,}" Sep 16 04:37:42.127624 containerd[1528]: time="2025-09-16T04:37:42.127578305Z" level=info msg="connecting to shim 36ebbff9c022f5871904b0dd896a6adbb48b15aa19c9a42b663c32fbb9fb8ea8" address="unix:///run/containerd/s/174013c75b6820403661de0e32c07259f5f92600514c56e8b1e668d3b3eaef75" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:37:42.138763 containerd[1528]: time="2025-09-16T04:37:42.138505243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:72a30db4fc25e4da65a3b99eba43be94,Namespace:kube-system,Attempt:0,}" Sep 16 04:37:42.141030 containerd[1528]: time="2025-09-16T04:37:42.140891641Z" level=info msg="connecting to shim cc56b58baca10465810aa5520629215c1da12963a22c7e0e6cf114b99e864a64" address="unix:///run/containerd/s/66dcc2a3c258b2e7484c67665dd0f6217b619fc1b1140b7b1f2d6121e2cad5e8" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:37:42.158536 systemd[1]: Started cri-containerd-36ebbff9c022f5871904b0dd896a6adbb48b15aa19c9a42b663c32fbb9fb8ea8.scope - libcontainer container 36ebbff9c022f5871904b0dd896a6adbb48b15aa19c9a42b663c32fbb9fb8ea8. Sep 16 04:37:42.167726 systemd[1]: Started cri-containerd-cc56b58baca10465810aa5520629215c1da12963a22c7e0e6cf114b99e864a64.scope - libcontainer container cc56b58baca10465810aa5520629215c1da12963a22c7e0e6cf114b99e864a64. Sep 16 04:37:42.169409 containerd[1528]: time="2025-09-16T04:37:42.169361894Z" level=info msg="connecting to shim 03195ce73dc3cbb100820c65cecb5aa0df7760651bd2d84c56a580537c30bc76" address="unix:///run/containerd/s/71394ea704cb2379dac3f53980f7ee7f50c757f15131a2b8b1187519142fc362" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:37:42.198536 systemd[1]: Started cri-containerd-03195ce73dc3cbb100820c65cecb5aa0df7760651bd2d84c56a580537c30bc76.scope - libcontainer container 03195ce73dc3cbb100820c65cecb5aa0df7760651bd2d84c56a580537c30bc76. 
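
The recurring controller.go:145 "Failed to ensure lease exists, will retry" lines refer to the per-node Lease in the kube-node-lease namespace that the kubelet creates and renews as its heartbeat. A rough get-or-create sketch of that object with client-go follows; the 40-second duration is the usual kubelet default and is an assumption, not read from this node's configuration.

    package leasesketch

    import (
        "context"
        "time"

        coordinationv1 "k8s.io/api/coordination/v1"
        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
    )

    // ensureNodeLease mirrors what the log calls "ensure lease exists": look the
    // Lease up in kube-node-lease and create it if it is missing.
    func ensureNodeLease(ctx context.Context, cs kubernetes.Interface, nodeName string) error {
        leases := cs.CoordinationV1().Leases("kube-node-lease")
        if _, err := leases.Get(ctx, nodeName, metav1.GetOptions{}); !apierrors.IsNotFound(err) {
            return err // nil if it already exists, or a real error such as "connection refused"
        }

        holder := nodeName
        duration := int32(40) // assumed default lease duration
        now := metav1.NewMicroTime(time.Now())
        lease := &coordinationv1.Lease{
            ObjectMeta: metav1.ObjectMeta{Name: nodeName, Namespace: "kube-node-lease"},
            Spec: coordinationv1.LeaseSpec{
                HolderIdentity:       &holder,
                LeaseDurationSeconds: &duration,
                RenewTime:            &now,
            },
        }
        _, err := leases.Create(ctx, lease, metav1.CreateOptions{})
        return err
    }
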
Sep 16 04:37:42.207117 containerd[1528]: time="2025-09-16T04:37:42.207065279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:5632c1dd8e9e103c20d3d7801f8315c6,Namespace:kube-system,Attempt:0,} returns sandbox id \"36ebbff9c022f5871904b0dd896a6adbb48b15aa19c9a42b663c32fbb9fb8ea8\"" Sep 16 04:37:42.210916 containerd[1528]: time="2025-09-16T04:37:42.210861283Z" level=info msg="CreateContainer within sandbox \"36ebbff9c022f5871904b0dd896a6adbb48b15aa19c9a42b663c32fbb9fb8ea8\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 16 04:37:42.216583 containerd[1528]: time="2025-09-16T04:37:42.216507624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:1403266a9792debaa127cd8df7a81c3c,Namespace:kube-system,Attempt:0,} returns sandbox id \"cc56b58baca10465810aa5520629215c1da12963a22c7e0e6cf114b99e864a64\"" Sep 16 04:37:42.219868 containerd[1528]: time="2025-09-16T04:37:42.219517342Z" level=info msg="CreateContainer within sandbox \"cc56b58baca10465810aa5520629215c1da12963a22c7e0e6cf114b99e864a64\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 16 04:37:42.221624 containerd[1528]: time="2025-09-16T04:37:42.221537758Z" level=info msg="Container 59555db6f0f73b8acad5a3b3ab07fb0810269d5d43cab55f646a0cd54cfe0e2c: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:37:42.229301 containerd[1528]: time="2025-09-16T04:37:42.229252520Z" level=info msg="CreateContainer within sandbox \"36ebbff9c022f5871904b0dd896a6adbb48b15aa19c9a42b663c32fbb9fb8ea8\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"59555db6f0f73b8acad5a3b3ab07fb0810269d5d43cab55f646a0cd54cfe0e2c\"" Sep 16 04:37:42.230562 containerd[1528]: time="2025-09-16T04:37:42.230269643Z" level=info msg="StartContainer for \"59555db6f0f73b8acad5a3b3ab07fb0810269d5d43cab55f646a0cd54cfe0e2c\"" Sep 16 04:37:42.230647 containerd[1528]: time="2025-09-16T04:37:42.230626632Z" level=info msg="Container 869782958a898845f986671100a006174c96ccbbf1247e8e94a17578fbbe66c0: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:37:42.234441 containerd[1528]: time="2025-09-16T04:37:42.234398733Z" level=info msg="connecting to shim 59555db6f0f73b8acad5a3b3ab07fb0810269d5d43cab55f646a0cd54cfe0e2c" address="unix:///run/containerd/s/174013c75b6820403661de0e32c07259f5f92600514c56e8b1e668d3b3eaef75" protocol=ttrpc version=3 Sep 16 04:37:42.243551 containerd[1528]: time="2025-09-16T04:37:42.243502756Z" level=info msg="CreateContainer within sandbox \"cc56b58baca10465810aa5520629215c1da12963a22c7e0e6cf114b99e864a64\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"869782958a898845f986671100a006174c96ccbbf1247e8e94a17578fbbe66c0\"" Sep 16 04:37:42.244117 containerd[1528]: time="2025-09-16T04:37:42.244085106Z" level=info msg="StartContainer for \"869782958a898845f986671100a006174c96ccbbf1247e8e94a17578fbbe66c0\"" Sep 16 04:37:42.244847 containerd[1528]: time="2025-09-16T04:37:42.244791168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:72a30db4fc25e4da65a3b99eba43be94,Namespace:kube-system,Attempt:0,} returns sandbox id \"03195ce73dc3cbb100820c65cecb5aa0df7760651bd2d84c56a580537c30bc76\"" Sep 16 04:37:42.245693 containerd[1528]: time="2025-09-16T04:37:42.245662754Z" level=info msg="connecting to shim 869782958a898845f986671100a006174c96ccbbf1247e8e94a17578fbbe66c0" 
address="unix:///run/containerd/s/66dcc2a3c258b2e7484c67665dd0f6217b619fc1b1140b7b1f2d6121e2cad5e8" protocol=ttrpc version=3 Sep 16 04:37:42.249362 containerd[1528]: time="2025-09-16T04:37:42.248199046Z" level=info msg="CreateContainer within sandbox \"03195ce73dc3cbb100820c65cecb5aa0df7760651bd2d84c56a580537c30bc76\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 16 04:37:42.258389 containerd[1528]: time="2025-09-16T04:37:42.258347853Z" level=info msg="Container 85d1068e477b5103799469c3ccb2c0576cdb298d9439df3c4afec4ed25ff5400: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:37:42.261534 systemd[1]: Started cri-containerd-59555db6f0f73b8acad5a3b3ab07fb0810269d5d43cab55f646a0cd54cfe0e2c.scope - libcontainer container 59555db6f0f73b8acad5a3b3ab07fb0810269d5d43cab55f646a0cd54cfe0e2c. Sep 16 04:37:42.264684 systemd[1]: Started cri-containerd-869782958a898845f986671100a006174c96ccbbf1247e8e94a17578fbbe66c0.scope - libcontainer container 869782958a898845f986671100a006174c96ccbbf1247e8e94a17578fbbe66c0. Sep 16 04:37:42.266420 containerd[1528]: time="2025-09-16T04:37:42.266335383Z" level=info msg="CreateContainer within sandbox \"03195ce73dc3cbb100820c65cecb5aa0df7760651bd2d84c56a580537c30bc76\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"85d1068e477b5103799469c3ccb2c0576cdb298d9439df3c4afec4ed25ff5400\"" Sep 16 04:37:42.267338 containerd[1528]: time="2025-09-16T04:37:42.266858414Z" level=info msg="StartContainer for \"85d1068e477b5103799469c3ccb2c0576cdb298d9439df3c4afec4ed25ff5400\"" Sep 16 04:37:42.267918 containerd[1528]: time="2025-09-16T04:37:42.267880374Z" level=info msg="connecting to shim 85d1068e477b5103799469c3ccb2c0576cdb298d9439df3c4afec4ed25ff5400" address="unix:///run/containerd/s/71394ea704cb2379dac3f53980f7ee7f50c757f15131a2b8b1187519142fc362" protocol=ttrpc version=3 Sep 16 04:37:42.291619 systemd[1]: Started cri-containerd-85d1068e477b5103799469c3ccb2c0576cdb298d9439df3c4afec4ed25ff5400.scope - libcontainer container 85d1068e477b5103799469c3ccb2c0576cdb298d9439df3c4afec4ed25ff5400. 
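
The containerd "connecting to shim ..." / CreateContainer / StartContainer entries above are driven through CRI, but the same create-then-start flow can be sketched directly against containerd's Go client. This is the stock getting-started pattern using the pre-2.0 import paths, shown only as a sketch (the log shows containerd v2.0.5, whose module path differs); the pause image reference is an example, not taken from this host.

    package main

    import (
        "context"
        "log"

        "github.com/containerd/containerd"
        "github.com/containerd/containerd/cio"
        "github.com/containerd/containerd/namespaces"
        "github.com/containerd/containerd/oci"
    )

    func main() {
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        // Kubernetes-managed resources live in the "k8s.io" namespace, matching
        // the namespace=k8s.io fields in the log lines above.
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

        image, err := client.Pull(ctx, "registry.k8s.io/pause:3.10", containerd.WithPullUnpack) // example image
        if err != nil {
            log.Fatal(err)
        }

        container, err := client.NewContainer(ctx, "demo",
            containerd.WithImage(image),
            containerd.WithNewSnapshot("demo-snapshot", image),
            containerd.WithNewSpec(oci.WithImageConfig(image)),
        )
        if err != nil {
            log.Fatal(err)
        }
        defer container.Delete(ctx, containerd.WithSnapshotCleanup)

        // NewTask is what ends up talking to the per-sandbox shim over the
        // unix:///run/containerd/s/... ttrpc addresses shown in the log.
        task, err := container.NewTask(ctx, cio.NewCreator(cio.WithStdio))
        if err != nil {
            log.Fatal(err)
        }
        if err := task.Start(ctx); err != nil {
            log.Fatal(err)
        }
    }
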
Sep 16 04:37:42.314666 kubelet[2306]: I0916 04:37:42.314364 2306 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 16 04:37:42.314998 kubelet[2306]: E0916 04:37:42.314961 2306 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.111:6443/api/v1/nodes\": dial tcp 10.0.0.111:6443: connect: connection refused" node="localhost" Sep 16 04:37:42.316253 containerd[1528]: time="2025-09-16T04:37:42.316191522Z" level=info msg="StartContainer for \"59555db6f0f73b8acad5a3b3ab07fb0810269d5d43cab55f646a0cd54cfe0e2c\" returns successfully" Sep 16 04:37:42.326770 containerd[1528]: time="2025-09-16T04:37:42.326711067Z" level=info msg="StartContainer for \"869782958a898845f986671100a006174c96ccbbf1247e8e94a17578fbbe66c0\" returns successfully" Sep 16 04:37:42.330762 kubelet[2306]: W0916 04:37:42.330697 2306 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.111:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.111:6443: connect: connection refused Sep 16 04:37:42.330929 kubelet[2306]: E0916 04:37:42.330774 2306 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.111:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.111:6443: connect: connection refused" logger="UnhandledError" Sep 16 04:37:42.347750 containerd[1528]: time="2025-09-16T04:37:42.347705430Z" level=info msg="StartContainer for \"85d1068e477b5103799469c3ccb2c0576cdb298d9439df3c4afec4ed25ff5400\" returns successfully" Sep 16 04:37:42.474490 kubelet[2306]: E0916 04:37:42.474384 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 16 04:37:42.478183 kubelet[2306]: E0916 04:37:42.478153 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 16 04:37:42.480335 kubelet[2306]: E0916 04:37:42.480312 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 16 04:37:43.116935 kubelet[2306]: I0916 04:37:43.116891 2306 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 16 04:37:43.482944 kubelet[2306]: E0916 04:37:43.482823 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 16 04:37:43.483352 kubelet[2306]: E0916 04:37:43.483309 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 16 04:37:44.038539 kubelet[2306]: E0916 04:37:44.038489 2306 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 16 04:37:44.121012 kubelet[2306]: I0916 04:37:44.120949 2306 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 16 04:37:44.147993 kubelet[2306]: I0916 04:37:44.146817 2306 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 16 04:37:44.208089 kubelet[2306]: E0916 04:37:44.207551 2306 kubelet.go:3196] "Failed creating a mirror pod" err="pods 
\"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 16 04:37:44.208089 kubelet[2306]: I0916 04:37:44.207589 2306 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:44.210539 kubelet[2306]: E0916 04:37:44.210373 2306 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:44.211386 kubelet[2306]: I0916 04:37:44.210598 2306 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 16 04:37:44.213484 kubelet[2306]: E0916 04:37:44.213447 2306 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 16 04:37:44.434648 kubelet[2306]: I0916 04:37:44.434407 2306 apiserver.go:52] "Watching apiserver" Sep 16 04:37:44.446393 kubelet[2306]: I0916 04:37:44.446343 2306 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 16 04:37:46.360449 systemd[1]: Reload requested from client PID 2581 ('systemctl') (unit session-7.scope)... Sep 16 04:37:46.360468 systemd[1]: Reloading... Sep 16 04:37:46.431745 zram_generator::config[2628]: No configuration found. Sep 16 04:37:46.602620 systemd[1]: Reloading finished in 241 ms. Sep 16 04:37:46.624143 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:37:46.640424 systemd[1]: kubelet.service: Deactivated successfully. Sep 16 04:37:46.640691 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:37:46.640753 systemd[1]: kubelet.service: Consumed 1.423s CPU time, 128.6M memory peak. Sep 16 04:37:46.642698 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:37:46.801645 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:37:46.810453 (kubelet)[2666]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 16 04:37:46.857354 kubelet[2666]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 04:37:46.857354 kubelet[2666]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 16 04:37:46.857354 kubelet[2666]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
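
The "Failed creating a mirror pod ... no PriorityClass with name system-node-critical was found" errors above are transient: the built-in system-node-critical and system-cluster-critical classes are seeded by the API server itself shortly after it comes up, so no manual action is implied. Purely for illustration, the object the kubelet is missing looks roughly like this when expressed with client-go; 2000001000 is the well-known built-in value, and the kubeconfig path is an assumption.

    package main

    import (
        "context"
        "log"

        schedulingv1 "k8s.io/api/scheduling/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf") // assumed path
        if err != nil {
            log.Fatal(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)

        pc := &schedulingv1.PriorityClass{
            ObjectMeta:  metav1.ObjectMeta{Name: "system-node-critical"},
            Value:       2000001000,
            Description: "Used for system critical pods that must not be moved from their current node.",
        }
        // Creating it by hand would normally conflict once the API server has
        // seeded the built-in class; a real tool would treat AlreadyExists as success.
        if _, err := cs.SchedulingV1().PriorityClasses().Create(context.Background(), pc, metav1.CreateOptions{}); err != nil {
            log.Printf("create priorityclass: %v", err)
        }
    }
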
Sep 16 04:37:46.857717 kubelet[2666]: I0916 04:37:46.857426 2666 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 16 04:37:46.864491 kubelet[2666]: I0916 04:37:46.863819 2666 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 16 04:37:46.864491 kubelet[2666]: I0916 04:37:46.863858 2666 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 16 04:37:46.864491 kubelet[2666]: I0916 04:37:46.864282 2666 server.go:954] "Client rotation is on, will bootstrap in background" Sep 16 04:37:46.866253 kubelet[2666]: I0916 04:37:46.866221 2666 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 16 04:37:46.868812 kubelet[2666]: I0916 04:37:46.868764 2666 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 16 04:37:46.872697 kubelet[2666]: I0916 04:37:46.872665 2666 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 16 04:37:46.875902 kubelet[2666]: I0916 04:37:46.875686 2666 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 16 04:37:46.876003 kubelet[2666]: I0916 04:37:46.875958 2666 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 16 04:37:46.876169 kubelet[2666]: I0916 04:37:46.875987 2666 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 16 04:37:46.876247 kubelet[2666]: I0916 04:37:46.876178 2666 topology_manager.go:138] "Creating topology manager with none policy" Sep 16 04:37:46.876247 kubelet[2666]: I0916 04:37:46.876188 2666 container_manager_linux.go:304] "Creating device plugin manager" Sep 16 04:37:46.876247 kubelet[2666]: I0916 04:37:46.876237 2666 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:37:46.876718 kubelet[2666]: I0916 
04:37:46.876682 2666 kubelet.go:446] "Attempting to sync node with API server" Sep 16 04:37:46.876718 kubelet[2666]: I0916 04:37:46.876707 2666 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 16 04:37:46.876802 kubelet[2666]: I0916 04:37:46.876732 2666 kubelet.go:352] "Adding apiserver pod source" Sep 16 04:37:46.876802 kubelet[2666]: I0916 04:37:46.876749 2666 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 16 04:37:46.877498 kubelet[2666]: I0916 04:37:46.877444 2666 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 16 04:37:46.878001 kubelet[2666]: I0916 04:37:46.877916 2666 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 16 04:37:46.879669 kubelet[2666]: I0916 04:37:46.879547 2666 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 16 04:37:46.879715 kubelet[2666]: I0916 04:37:46.879696 2666 server.go:1287] "Started kubelet" Sep 16 04:37:46.881945 kubelet[2666]: I0916 04:37:46.881894 2666 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 16 04:37:46.883112 kubelet[2666]: I0916 04:37:46.882618 2666 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 16 04:37:46.883112 kubelet[2666]: I0916 04:37:46.882701 2666 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 16 04:37:46.885855 kubelet[2666]: I0916 04:37:46.885821 2666 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 16 04:37:46.886194 kubelet[2666]: I0916 04:37:46.886165 2666 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 16 04:37:46.886604 kubelet[2666]: I0916 04:37:46.886586 2666 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 16 04:37:46.886717 kubelet[2666]: E0916 04:37:46.886699 2666 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 16 04:37:46.887082 kubelet[2666]: I0916 04:37:46.887062 2666 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 16 04:37:46.887200 kubelet[2666]: I0916 04:37:46.887186 2666 reconciler.go:26] "Reconciler: start to sync state" Sep 16 04:37:46.889534 kubelet[2666]: I0916 04:37:46.889503 2666 factory.go:221] Registration of the systemd container factory successfully Sep 16 04:37:46.890513 kubelet[2666]: I0916 04:37:46.889618 2666 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 16 04:37:46.892338 kubelet[2666]: I0916 04:37:46.890913 2666 server.go:479] "Adding debug handlers to kubelet server" Sep 16 04:37:46.894502 kubelet[2666]: I0916 04:37:46.894473 2666 factory.go:221] Registration of the containerd container factory successfully Sep 16 04:37:46.908552 kubelet[2666]: E0916 04:37:46.908445 2666 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 16 04:37:46.914698 kubelet[2666]: I0916 04:37:46.914532 2666 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 16 04:37:46.916532 kubelet[2666]: I0916 04:37:46.916276 2666 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 16 04:37:46.916532 kubelet[2666]: I0916 04:37:46.916428 2666 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 16 04:37:46.916532 kubelet[2666]: I0916 04:37:46.916457 2666 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 16 04:37:46.916532 kubelet[2666]: I0916 04:37:46.916464 2666 kubelet.go:2382] "Starting kubelet main sync loop" Sep 16 04:37:46.916532 kubelet[2666]: E0916 04:37:46.916516 2666 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 16 04:37:46.941723 kubelet[2666]: I0916 04:37:46.941690 2666 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 16 04:37:46.941723 kubelet[2666]: I0916 04:37:46.941707 2666 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 16 04:37:46.941723 kubelet[2666]: I0916 04:37:46.941735 2666 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:37:46.941919 kubelet[2666]: I0916 04:37:46.941900 2666 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 16 04:37:46.941947 kubelet[2666]: I0916 04:37:46.941919 2666 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 16 04:37:46.941947 kubelet[2666]: I0916 04:37:46.941941 2666 policy_none.go:49] "None policy: Start" Sep 16 04:37:46.941996 kubelet[2666]: I0916 04:37:46.941949 2666 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 16 04:37:46.941996 kubelet[2666]: I0916 04:37:46.941958 2666 state_mem.go:35] "Initializing new in-memory state store" Sep 16 04:37:46.942065 kubelet[2666]: I0916 04:37:46.942052 2666 state_mem.go:75] "Updated machine memory state" Sep 16 04:37:46.946366 kubelet[2666]: I0916 04:37:46.945726 2666 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 16 04:37:46.946366 kubelet[2666]: I0916 04:37:46.945905 2666 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 16 04:37:46.946366 kubelet[2666]: I0916 04:37:46.945918 2666 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 16 04:37:46.946366 kubelet[2666]: I0916 04:37:46.946205 2666 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 16 04:37:46.950022 kubelet[2666]: E0916 04:37:46.949412 2666 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 16 04:37:47.017724 kubelet[2666]: I0916 04:37:47.017673 2666 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 16 04:37:47.017858 kubelet[2666]: I0916 04:37:47.017674 2666 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 16 04:37:47.018099 kubelet[2666]: I0916 04:37:47.018065 2666 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:47.049685 kubelet[2666]: I0916 04:37:47.049639 2666 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 16 04:37:47.057183 kubelet[2666]: I0916 04:37:47.057039 2666 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 16 04:37:47.057183 kubelet[2666]: I0916 04:37:47.057126 2666 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 16 04:37:47.188584 kubelet[2666]: I0916 04:37:47.188453 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:47.188584 kubelet[2666]: I0916 04:37:47.188502 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:47.188584 kubelet[2666]: I0916 04:37:47.188525 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:47.188584 kubelet[2666]: I0916 04:37:47.188545 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5632c1dd8e9e103c20d3d7801f8315c6-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"5632c1dd8e9e103c20d3d7801f8315c6\") " pod="kube-system/kube-apiserver-localhost" Sep 16 04:37:47.188584 kubelet[2666]: I0916 04:37:47.188562 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:47.188957 kubelet[2666]: I0916 04:37:47.188636 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 16 04:37:47.188957 kubelet[2666]: I0916 04:37:47.188684 2666 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/72a30db4fc25e4da65a3b99eba43be94-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"72a30db4fc25e4da65a3b99eba43be94\") " pod="kube-system/kube-scheduler-localhost" Sep 16 04:37:47.188957 kubelet[2666]: I0916 04:37:47.188703 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5632c1dd8e9e103c20d3d7801f8315c6-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"5632c1dd8e9e103c20d3d7801f8315c6\") " pod="kube-system/kube-apiserver-localhost" Sep 16 04:37:47.188957 kubelet[2666]: I0916 04:37:47.188721 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5632c1dd8e9e103c20d3d7801f8315c6-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"5632c1dd8e9e103c20d3d7801f8315c6\") " pod="kube-system/kube-apiserver-localhost" Sep 16 04:37:47.876983 kubelet[2666]: I0916 04:37:47.876934 2666 apiserver.go:52] "Watching apiserver" Sep 16 04:37:47.887878 kubelet[2666]: I0916 04:37:47.887842 2666 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 16 04:37:47.930120 kubelet[2666]: I0916 04:37:47.929754 2666 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 16 04:37:47.930120 kubelet[2666]: I0916 04:37:47.929951 2666 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 16 04:37:47.940357 kubelet[2666]: E0916 04:37:47.938397 2666 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 16 04:37:47.940619 kubelet[2666]: E0916 04:37:47.940408 2666 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 16 04:37:47.968887 kubelet[2666]: I0916 04:37:47.968812 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=0.968793678 podStartE2EDuration="968.793678ms" podCreationTimestamp="2025-09-16 04:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:37:47.959274518 +0000 UTC m=+1.145140585" watchObservedRunningTime="2025-09-16 04:37:47.968793678 +0000 UTC m=+1.154659745" Sep 16 04:37:47.969101 kubelet[2666]: I0916 04:37:47.968968 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=0.968961337 podStartE2EDuration="968.961337ms" podCreationTimestamp="2025-09-16 04:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:37:47.968622979 +0000 UTC m=+1.154489046" watchObservedRunningTime="2025-09-16 04:37:47.968961337 +0000 UTC m=+1.154827404" Sep 16 04:37:47.987765 kubelet[2666]: I0916 04:37:47.987704 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=0.98768525 podStartE2EDuration="987.68525ms" podCreationTimestamp="2025-09-16 04:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:37:47.979160491 +0000 UTC m=+1.165026558" watchObservedRunningTime="2025-09-16 04:37:47.98768525 +0000 UTC m=+1.173551317" Sep 16 04:37:52.693270 kubelet[2666]: I0916 04:37:52.693237 2666 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 16 04:37:52.693764 containerd[1528]: time="2025-09-16T04:37:52.693618993Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 16 04:37:52.694704 kubelet[2666]: I0916 04:37:52.694105 2666 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 16 04:37:53.647234 systemd[1]: Created slice kubepods-besteffort-pod73bd20c1_e218_4fa0_8bc8_0cdf8ad51f7e.slice - libcontainer container kubepods-besteffort-pod73bd20c1_e218_4fa0_8bc8_0cdf8ad51f7e.slice. Sep 16 04:37:53.729310 kubelet[2666]: I0916 04:37:53.729086 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/73bd20c1-e218-4fa0-8bc8-0cdf8ad51f7e-kube-proxy\") pod \"kube-proxy-2swlg\" (UID: \"73bd20c1-e218-4fa0-8bc8-0cdf8ad51f7e\") " pod="kube-system/kube-proxy-2swlg" Sep 16 04:37:53.729310 kubelet[2666]: I0916 04:37:53.729154 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/73bd20c1-e218-4fa0-8bc8-0cdf8ad51f7e-xtables-lock\") pod \"kube-proxy-2swlg\" (UID: \"73bd20c1-e218-4fa0-8bc8-0cdf8ad51f7e\") " pod="kube-system/kube-proxy-2swlg" Sep 16 04:37:53.729310 kubelet[2666]: I0916 04:37:53.729175 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzqhg\" (UniqueName: \"kubernetes.io/projected/73bd20c1-e218-4fa0-8bc8-0cdf8ad51f7e-kube-api-access-lzqhg\") pod \"kube-proxy-2swlg\" (UID: \"73bd20c1-e218-4fa0-8bc8-0cdf8ad51f7e\") " pod="kube-system/kube-proxy-2swlg" Sep 16 04:37:53.729310 kubelet[2666]: I0916 04:37:53.729238 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73bd20c1-e218-4fa0-8bc8-0cdf8ad51f7e-lib-modules\") pod \"kube-proxy-2swlg\" (UID: \"73bd20c1-e218-4fa0-8bc8-0cdf8ad51f7e\") " pod="kube-system/kube-proxy-2swlg" Sep 16 04:37:53.879644 systemd[1]: Created slice kubepods-besteffort-podb9890636_a416_4721_ae0d_66e195c54422.slice - libcontainer container kubepods-besteffort-podb9890636_a416_4721_ae0d_66e195c54422.slice. 
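
The reconciler_common.go:251 lines above enumerate the volumes being attached for kube-proxy-2swlg: a "kube-proxy" ConfigMap, xtables-lock and lib-modules host paths, and a projected kube-api-access-lzqhg token. A hedged sketch of that volume set using the corev1 types follows; the concrete host paths are the usual kube-proxy defaults and are assumptions, not read from this pod spec.

    package volsketch

    import corev1 "k8s.io/api/core/v1"

    // kubeProxyVolumes approximates the volumes named in the log; the projected
    // kube-api-access-* token volume is omitted because the kubelet injects it.
    var kubeProxyVolumes = []corev1.Volume{
        {
            Name: "kube-proxy",
            VolumeSource: corev1.VolumeSource{
                ConfigMap: &corev1.ConfigMapVolumeSource{
                    LocalObjectReference: corev1.LocalObjectReference{Name: "kube-proxy"},
                },
            },
        },
        {
            Name: "xtables-lock",
            VolumeSource: corev1.VolumeSource{
                HostPath: &corev1.HostPathVolumeSource{Path: "/run/xtables.lock"}, // assumed path
            },
        },
        {
            Name: "lib-modules",
            VolumeSource: corev1.VolumeSource{
                HostPath: &corev1.HostPathVolumeSource{Path: "/lib/modules"}, // assumed path
            },
        },
    }
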
Sep 16 04:37:53.932151 kubelet[2666]: I0916 04:37:53.931994 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nrcs\" (UniqueName: \"kubernetes.io/projected/b9890636-a416-4721-ae0d-66e195c54422-kube-api-access-8nrcs\") pod \"tigera-operator-755d956888-f2kmd\" (UID: \"b9890636-a416-4721-ae0d-66e195c54422\") " pod="tigera-operator/tigera-operator-755d956888-f2kmd" Sep 16 04:37:53.932151 kubelet[2666]: I0916 04:37:53.932043 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b9890636-a416-4721-ae0d-66e195c54422-var-lib-calico\") pod \"tigera-operator-755d956888-f2kmd\" (UID: \"b9890636-a416-4721-ae0d-66e195c54422\") " pod="tigera-operator/tigera-operator-755d956888-f2kmd" Sep 16 04:37:53.960732 containerd[1528]: time="2025-09-16T04:37:53.960691418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-2swlg,Uid:73bd20c1-e218-4fa0-8bc8-0cdf8ad51f7e,Namespace:kube-system,Attempt:0,}" Sep 16 04:37:53.982928 containerd[1528]: time="2025-09-16T04:37:53.982876598Z" level=info msg="connecting to shim 2e78d175ebcd1fef329bde88a6559ef6a1710362da3893c5cb9ed3c2489376d9" address="unix:///run/containerd/s/ad850c4cdae8099f53ad99e8dadc2dc5a1e88c9f7a8227fc2d81a013024ed249" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:37:54.006537 systemd[1]: Started cri-containerd-2e78d175ebcd1fef329bde88a6559ef6a1710362da3893c5cb9ed3c2489376d9.scope - libcontainer container 2e78d175ebcd1fef329bde88a6559ef6a1710362da3893c5cb9ed3c2489376d9. Sep 16 04:37:54.028568 containerd[1528]: time="2025-09-16T04:37:54.028516159Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-2swlg,Uid:73bd20c1-e218-4fa0-8bc8-0cdf8ad51f7e,Namespace:kube-system,Attempt:0,} returns sandbox id \"2e78d175ebcd1fef329bde88a6559ef6a1710362da3893c5cb9ed3c2489376d9\"" Sep 16 04:37:54.031279 containerd[1528]: time="2025-09-16T04:37:54.031240602Z" level=info msg="CreateContainer within sandbox \"2e78d175ebcd1fef329bde88a6559ef6a1710362da3893c5cb9ed3c2489376d9\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 16 04:37:54.045202 containerd[1528]: time="2025-09-16T04:37:54.045165294Z" level=info msg="Container 8b0551776d6cc7ecfdc517f4ccc419d1bd4965f4151a1c8c532d17d63c1b8308: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:37:54.052162 containerd[1528]: time="2025-09-16T04:37:54.052105300Z" level=info msg="CreateContainer within sandbox \"2e78d175ebcd1fef329bde88a6559ef6a1710362da3893c5cb9ed3c2489376d9\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"8b0551776d6cc7ecfdc517f4ccc419d1bd4965f4151a1c8c532d17d63c1b8308\"" Sep 16 04:37:54.053145 containerd[1528]: time="2025-09-16T04:37:54.053120501Z" level=info msg="StartContainer for \"8b0551776d6cc7ecfdc517f4ccc419d1bd4965f4151a1c8c532d17d63c1b8308\"" Sep 16 04:37:54.054855 containerd[1528]: time="2025-09-16T04:37:54.054748662Z" level=info msg="connecting to shim 8b0551776d6cc7ecfdc517f4ccc419d1bd4965f4151a1c8c532d17d63c1b8308" address="unix:///run/containerd/s/ad850c4cdae8099f53ad99e8dadc2dc5a1e88c9f7a8227fc2d81a013024ed249" protocol=ttrpc version=3 Sep 16 04:37:54.073537 systemd[1]: Started cri-containerd-8b0551776d6cc7ecfdc517f4ccc419d1bd4965f4151a1c8c532d17d63c1b8308.scope - libcontainer container 8b0551776d6cc7ecfdc517f4ccc419d1bd4965f4151a1c8c532d17d63c1b8308. 
Sep 16 04:37:54.106756 containerd[1528]: time="2025-09-16T04:37:54.106714387Z" level=info msg="StartContainer for \"8b0551776d6cc7ecfdc517f4ccc419d1bd4965f4151a1c8c532d17d63c1b8308\" returns successfully" Sep 16 04:37:54.184426 containerd[1528]: time="2025-09-16T04:37:54.184266175Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-f2kmd,Uid:b9890636-a416-4721-ae0d-66e195c54422,Namespace:tigera-operator,Attempt:0,}" Sep 16 04:37:54.200615 containerd[1528]: time="2025-09-16T04:37:54.200571709Z" level=info msg="connecting to shim 4d1e500890b5e5a6b547bb9dadf661a4d7730778b3fc9af6e7c7a5c695f288c4" address="unix:///run/containerd/s/8ea26689b7121851858ff039c13935250409ef24863d6a461db4dcd5d08b68b2" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:37:54.235558 systemd[1]: Started cri-containerd-4d1e500890b5e5a6b547bb9dadf661a4d7730778b3fc9af6e7c7a5c695f288c4.scope - libcontainer container 4d1e500890b5e5a6b547bb9dadf661a4d7730778b3fc9af6e7c7a5c695f288c4. Sep 16 04:37:54.275496 containerd[1528]: time="2025-09-16T04:37:54.275449295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-f2kmd,Uid:b9890636-a416-4721-ae0d-66e195c54422,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4d1e500890b5e5a6b547bb9dadf661a4d7730778b3fc9af6e7c7a5c695f288c4\"" Sep 16 04:37:54.278558 containerd[1528]: time="2025-09-16T04:37:54.278514417Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 16 04:37:54.957388 kubelet[2666]: I0916 04:37:54.957309 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-2swlg" podStartSLOduration=1.95729181 podStartE2EDuration="1.95729181s" podCreationTimestamp="2025-09-16 04:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:37:54.954845568 +0000 UTC m=+8.140711675" watchObservedRunningTime="2025-09-16 04:37:54.95729181 +0000 UTC m=+8.143157877" Sep 16 04:37:55.803231 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1829539762.mount: Deactivated successfully. 
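
The 'PullImage "quay.io/tigera/operator:v1.38.6"' entry above is the kubelet asking containerd's CRI ImageService to fetch the operator image before the container can be created. A minimal sketch of that gRPC call using the published CRI API follows; dialing with grpc.Dial and insecure transport credentials over the local socket is a simplification of how the kubelet actually connects.

    package main

    import (
        "context"
        "log"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        // The CRI endpoint containerd serves for the kubelet (local unix socket,
        // hence the insecure credentials).
        conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        images := runtimeapi.NewImageServiceClient(conn)
        ctx, cancel := context.WithTimeout(context.Background(), 2*time.Minute)
        defer cancel()

        resp, err := images.PullImage(ctx, &runtimeapi.PullImageRequest{
            Image: &runtimeapi.ImageSpec{Image: "quay.io/tigera/operator:v1.38.6"},
        })
        if err != nil {
            log.Fatal(err)
        }
        log.Println("pulled image ref:", resp.ImageRef)
    }
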
Sep 16 04:37:56.314201 containerd[1528]: time="2025-09-16T04:37:56.314149600Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:56.314716 containerd[1528]: time="2025-09-16T04:37:56.314677440Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 16 04:37:56.315345 containerd[1528]: time="2025-09-16T04:37:56.315306041Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:56.317680 containerd[1528]: time="2025-09-16T04:37:56.317643163Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:37:56.318394 containerd[1528]: time="2025-09-16T04:37:56.318367203Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.039811386s" Sep 16 04:37:56.318434 containerd[1528]: time="2025-09-16T04:37:56.318397403Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 16 04:37:56.320428 containerd[1528]: time="2025-09-16T04:37:56.320400765Z" level=info msg="CreateContainer within sandbox \"4d1e500890b5e5a6b547bb9dadf661a4d7730778b3fc9af6e7c7a5c695f288c4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 16 04:37:56.326852 containerd[1528]: time="2025-09-16T04:37:56.326820410Z" level=info msg="Container 08caeebd969e72690c60422d43de4b5a23c1e42ab7110746bfc30cbc6775b3f1: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:37:56.332354 containerd[1528]: time="2025-09-16T04:37:56.332239094Z" level=info msg="CreateContainer within sandbox \"4d1e500890b5e5a6b547bb9dadf661a4d7730778b3fc9af6e7c7a5c695f288c4\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"08caeebd969e72690c60422d43de4b5a23c1e42ab7110746bfc30cbc6775b3f1\"" Sep 16 04:37:56.332767 containerd[1528]: time="2025-09-16T04:37:56.332743815Z" level=info msg="StartContainer for \"08caeebd969e72690c60422d43de4b5a23c1e42ab7110746bfc30cbc6775b3f1\"" Sep 16 04:37:56.333895 containerd[1528]: time="2025-09-16T04:37:56.333647775Z" level=info msg="connecting to shim 08caeebd969e72690c60422d43de4b5a23c1e42ab7110746bfc30cbc6775b3f1" address="unix:///run/containerd/s/8ea26689b7121851858ff039c13935250409ef24863d6a461db4dcd5d08b68b2" protocol=ttrpc version=3 Sep 16 04:37:56.364496 systemd[1]: Started cri-containerd-08caeebd969e72690c60422d43de4b5a23c1e42ab7110746bfc30cbc6775b3f1.scope - libcontainer container 08caeebd969e72690c60422d43de4b5a23c1e42ab7110746bfc30cbc6775b3f1. 
Sep 16 04:37:56.388463 containerd[1528]: time="2025-09-16T04:37:56.388420178Z" level=info msg="StartContainer for \"08caeebd969e72690c60422d43de4b5a23c1e42ab7110746bfc30cbc6775b3f1\" returns successfully" Sep 16 04:37:56.963340 kubelet[2666]: I0916 04:37:56.963204 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-f2kmd" podStartSLOduration=1.92070056 podStartE2EDuration="3.963191348s" podCreationTimestamp="2025-09-16 04:37:53 +0000 UTC" firstStartedPulling="2025-09-16 04:37:54.276803296 +0000 UTC m=+7.462669363" lastFinishedPulling="2025-09-16 04:37:56.319294084 +0000 UTC m=+9.505160151" observedRunningTime="2025-09-16 04:37:56.962968788 +0000 UTC m=+10.148834895" watchObservedRunningTime="2025-09-16 04:37:56.963191348 +0000 UTC m=+10.149057415" Sep 16 04:38:01.529422 update_engine[1511]: I20250916 04:38:01.529368 1511 update_attempter.cc:509] Updating boot flags... Sep 16 04:38:01.529458 sudo[1739]: pam_unix(sudo:session): session closed for user root Sep 16 04:38:01.530913 sshd[1738]: Connection closed by 10.0.0.1 port 33484 Sep 16 04:38:01.531606 sshd-session[1735]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:01.537660 systemd[1]: sshd@6-10.0.0.111:22-10.0.0.1:33484.service: Deactivated successfully. Sep 16 04:38:01.540517 systemd[1]: session-7.scope: Deactivated successfully. Sep 16 04:38:01.540752 systemd[1]: session-7.scope: Consumed 6.909s CPU time, 220.7M memory peak. Sep 16 04:38:01.543700 systemd-logind[1510]: Session 7 logged out. Waiting for processes to exit. Sep 16 04:38:01.545743 systemd-logind[1510]: Removed session 7. Sep 16 04:38:07.401305 systemd[1]: Created slice kubepods-besteffort-pod90b23e7d_70bd_4525_b580_78166a5a66d7.slice - libcontainer container kubepods-besteffort-pod90b23e7d_70bd_4525_b580_78166a5a66d7.slice. Sep 16 04:38:07.427164 kubelet[2666]: I0916 04:38:07.427104 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/90b23e7d-70bd-4525-b580-78166a5a66d7-typha-certs\") pod \"calico-typha-54888b5dd5-9cshs\" (UID: \"90b23e7d-70bd-4525-b580-78166a5a66d7\") " pod="calico-system/calico-typha-54888b5dd5-9cshs" Sep 16 04:38:07.427164 kubelet[2666]: I0916 04:38:07.427167 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfnd5\" (UniqueName: \"kubernetes.io/projected/90b23e7d-70bd-4525-b580-78166a5a66d7-kube-api-access-wfnd5\") pod \"calico-typha-54888b5dd5-9cshs\" (UID: \"90b23e7d-70bd-4525-b580-78166a5a66d7\") " pod="calico-system/calico-typha-54888b5dd5-9cshs" Sep 16 04:38:07.427569 kubelet[2666]: I0916 04:38:07.427213 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b23e7d-70bd-4525-b580-78166a5a66d7-tigera-ca-bundle\") pod \"calico-typha-54888b5dd5-9cshs\" (UID: \"90b23e7d-70bd-4525-b580-78166a5a66d7\") " pod="calico-system/calico-typha-54888b5dd5-9cshs" Sep 16 04:38:07.618205 systemd[1]: Created slice kubepods-besteffort-pod92dcc2e7_d95a_48d3_8244_c50c82e3c3d4.slice - libcontainer container kubepods-besteffort-pod92dcc2e7_d95a_48d3_8244_c50c82e3c3d4.slice. 
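
The pod_startup_latency_tracker line for tigera-operator-755d956888-f2kmd above is self-consistent: the E2E duration is the observed-running time minus the creation time, and the reported SLO duration additionally subtracts the image-pull window. A small Go check of that arithmetic using the timestamps quoted in the log (the monotonic "m=+..." suffixes are dropped):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // layout produced by Go's time.String()
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }

        created := parse("2025-09-16 04:37:53 +0000 UTC")               // podCreationTimestamp
        firstPull := parse("2025-09-16 04:37:54.276803296 +0000 UTC")   // firstStartedPulling
        lastPull := parse("2025-09-16 04:37:56.319294084 +0000 UTC")    // lastFinishedPulling
        running := parse("2025-09-16 04:37:56.963191348 +0000 UTC")     // watchObservedRunningTime

        e2e := running.Sub(created)          // 3.963191348s, the podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // 1.92070056s, the podStartSLOduration
        fmt.Println("E2E:", e2e, "SLO (pull time excluded):", slo)
    }
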
Sep 16 04:38:07.628868 kubelet[2666]: I0916 04:38:07.628543 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/92dcc2e7-d95a-48d3-8244-c50c82e3c3d4-cni-bin-dir\") pod \"calico-node-vdvts\" (UID: \"92dcc2e7-d95a-48d3-8244-c50c82e3c3d4\") " pod="calico-system/calico-node-vdvts" Sep 16 04:38:07.628868 kubelet[2666]: I0916 04:38:07.628592 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/92dcc2e7-d95a-48d3-8244-c50c82e3c3d4-cni-net-dir\") pod \"calico-node-vdvts\" (UID: \"92dcc2e7-d95a-48d3-8244-c50c82e3c3d4\") " pod="calico-system/calico-node-vdvts" Sep 16 04:38:07.628868 kubelet[2666]: I0916 04:38:07.628618 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fwc5\" (UniqueName: \"kubernetes.io/projected/92dcc2e7-d95a-48d3-8244-c50c82e3c3d4-kube-api-access-8fwc5\") pod \"calico-node-vdvts\" (UID: \"92dcc2e7-d95a-48d3-8244-c50c82e3c3d4\") " pod="calico-system/calico-node-vdvts" Sep 16 04:38:07.628868 kubelet[2666]: I0916 04:38:07.628638 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92dcc2e7-d95a-48d3-8244-c50c82e3c3d4-tigera-ca-bundle\") pod \"calico-node-vdvts\" (UID: \"92dcc2e7-d95a-48d3-8244-c50c82e3c3d4\") " pod="calico-system/calico-node-vdvts" Sep 16 04:38:07.628868 kubelet[2666]: I0916 04:38:07.628655 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/92dcc2e7-d95a-48d3-8244-c50c82e3c3d4-cni-log-dir\") pod \"calico-node-vdvts\" (UID: \"92dcc2e7-d95a-48d3-8244-c50c82e3c3d4\") " pod="calico-system/calico-node-vdvts" Sep 16 04:38:07.629085 kubelet[2666]: I0916 04:38:07.628669 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/92dcc2e7-d95a-48d3-8244-c50c82e3c3d4-lib-modules\") pod \"calico-node-vdvts\" (UID: \"92dcc2e7-d95a-48d3-8244-c50c82e3c3d4\") " pod="calico-system/calico-node-vdvts" Sep 16 04:38:07.629085 kubelet[2666]: I0916 04:38:07.628684 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/92dcc2e7-d95a-48d3-8244-c50c82e3c3d4-xtables-lock\") pod \"calico-node-vdvts\" (UID: \"92dcc2e7-d95a-48d3-8244-c50c82e3c3d4\") " pod="calico-system/calico-node-vdvts" Sep 16 04:38:07.629085 kubelet[2666]: I0916 04:38:07.628701 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/92dcc2e7-d95a-48d3-8244-c50c82e3c3d4-var-lib-calico\") pod \"calico-node-vdvts\" (UID: \"92dcc2e7-d95a-48d3-8244-c50c82e3c3d4\") " pod="calico-system/calico-node-vdvts" Sep 16 04:38:07.629085 kubelet[2666]: I0916 04:38:07.628744 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/92dcc2e7-d95a-48d3-8244-c50c82e3c3d4-var-run-calico\") pod \"calico-node-vdvts\" (UID: \"92dcc2e7-d95a-48d3-8244-c50c82e3c3d4\") " pod="calico-system/calico-node-vdvts" Sep 16 04:38:07.629085 kubelet[2666]: I0916 04:38:07.628771 2666 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/92dcc2e7-d95a-48d3-8244-c50c82e3c3d4-node-certs\") pod \"calico-node-vdvts\" (UID: \"92dcc2e7-d95a-48d3-8244-c50c82e3c3d4\") " pod="calico-system/calico-node-vdvts" Sep 16 04:38:07.629195 kubelet[2666]: I0916 04:38:07.628794 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/92dcc2e7-d95a-48d3-8244-c50c82e3c3d4-flexvol-driver-host\") pod \"calico-node-vdvts\" (UID: \"92dcc2e7-d95a-48d3-8244-c50c82e3c3d4\") " pod="calico-system/calico-node-vdvts" Sep 16 04:38:07.629195 kubelet[2666]: I0916 04:38:07.628828 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/92dcc2e7-d95a-48d3-8244-c50c82e3c3d4-policysync\") pod \"calico-node-vdvts\" (UID: \"92dcc2e7-d95a-48d3-8244-c50c82e3c3d4\") " pod="calico-system/calico-node-vdvts" Sep 16 04:38:07.708260 containerd[1528]: time="2025-09-16T04:38:07.708140049Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54888b5dd5-9cshs,Uid:90b23e7d-70bd-4525-b580-78166a5a66d7,Namespace:calico-system,Attempt:0,}" Sep 16 04:38:07.735797 kubelet[2666]: E0916 04:38:07.734606 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.735797 kubelet[2666]: W0916 04:38:07.734631 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.738400 kubelet[2666]: E0916 04:38:07.738353 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.739489 kubelet[2666]: E0916 04:38:07.739433 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.739489 kubelet[2666]: W0916 04:38:07.739449 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.739489 kubelet[2666]: E0916 04:38:07.739465 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.742354 kubelet[2666]: E0916 04:38:07.742316 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.742354 kubelet[2666]: W0916 04:38:07.742350 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.742450 kubelet[2666]: E0916 04:38:07.742364 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.745310 containerd[1528]: time="2025-09-16T04:38:07.745274026Z" level=info msg="connecting to shim 0eba36d49c1a6561e10f7003d9b42f8c3ad5397e0c7f43247dc66491c75a58cc" address="unix:///run/containerd/s/e3d3034efde8c87f5bcc0b443a0ae47fce3b64e3438acaf53bdcd1cbdbee3d0c" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:38:07.784229 kubelet[2666]: E0916 04:38:07.783877 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5zlcv" podUID="f6d01e5d-7118-4f45-86dd-a1a5e54fa711" Sep 16 04:38:07.813560 systemd[1]: Started cri-containerd-0eba36d49c1a6561e10f7003d9b42f8c3ad5397e0c7f43247dc66491c75a58cc.scope - libcontainer container 0eba36d49c1a6561e10f7003d9b42f8c3ad5397e0c7f43247dc66491c75a58cc. Sep 16 04:38:07.815013 kubelet[2666]: E0916 04:38:07.814831 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.815013 kubelet[2666]: W0916 04:38:07.814852 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.815013 kubelet[2666]: E0916 04:38:07.814872 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.815676 kubelet[2666]: E0916 04:38:07.815655 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.815726 kubelet[2666]: W0916 04:38:07.815672 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.815726 kubelet[2666]: E0916 04:38:07.815713 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.816439 kubelet[2666]: E0916 04:38:07.816417 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.816439 kubelet[2666]: W0916 04:38:07.816434 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.816540 kubelet[2666]: E0916 04:38:07.816446 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.817345 kubelet[2666]: E0916 04:38:07.817301 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.817345 kubelet[2666]: W0916 04:38:07.817320 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.817456 kubelet[2666]: E0916 04:38:07.817354 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.818003 kubelet[2666]: E0916 04:38:07.817982 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.818003 kubelet[2666]: W0916 04:38:07.817998 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.818093 kubelet[2666]: E0916 04:38:07.818010 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.818197 kubelet[2666]: E0916 04:38:07.818183 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.818197 kubelet[2666]: W0916 04:38:07.818195 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.818259 kubelet[2666]: E0916 04:38:07.818204 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.819090 kubelet[2666]: E0916 04:38:07.818703 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.819090 kubelet[2666]: W0916 04:38:07.818715 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.819090 kubelet[2666]: E0916 04:38:07.818744 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.819090 kubelet[2666]: E0916 04:38:07.818897 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.819090 kubelet[2666]: W0916 04:38:07.818905 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.819090 kubelet[2666]: E0916 04:38:07.818913 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.819090 kubelet[2666]: E0916 04:38:07.819096 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.819268 kubelet[2666]: W0916 04:38:07.819105 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.819268 kubelet[2666]: E0916 04:38:07.819113 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.819268 kubelet[2666]: E0916 04:38:07.819233 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.819268 kubelet[2666]: W0916 04:38:07.819240 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.819268 kubelet[2666]: E0916 04:38:07.819248 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.819410 kubelet[2666]: E0916 04:38:07.819387 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.819410 kubelet[2666]: W0916 04:38:07.819403 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.819481 kubelet[2666]: E0916 04:38:07.819411 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.819540 kubelet[2666]: E0916 04:38:07.819527 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.819540 kubelet[2666]: W0916 04:38:07.819537 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.819593 kubelet[2666]: E0916 04:38:07.819544 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.819688 kubelet[2666]: E0916 04:38:07.819672 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.819688 kubelet[2666]: W0916 04:38:07.819683 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.819753 kubelet[2666]: E0916 04:38:07.819691 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.819814 kubelet[2666]: E0916 04:38:07.819801 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.819814 kubelet[2666]: W0916 04:38:07.819811 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.819862 kubelet[2666]: E0916 04:38:07.819821 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.819946 kubelet[2666]: E0916 04:38:07.819927 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.819946 kubelet[2666]: W0916 04:38:07.819943 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.820009 kubelet[2666]: E0916 04:38:07.819952 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.820080 kubelet[2666]: E0916 04:38:07.820068 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.820080 kubelet[2666]: W0916 04:38:07.820078 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.820133 kubelet[2666]: E0916 04:38:07.820086 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.820221 kubelet[2666]: E0916 04:38:07.820208 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.820221 kubelet[2666]: W0916 04:38:07.820218 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.820287 kubelet[2666]: E0916 04:38:07.820226 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.820361 kubelet[2666]: E0916 04:38:07.820348 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.820361 kubelet[2666]: W0916 04:38:07.820358 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.820423 kubelet[2666]: E0916 04:38:07.820365 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.820498 kubelet[2666]: E0916 04:38:07.820486 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.820498 kubelet[2666]: W0916 04:38:07.820496 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.820553 kubelet[2666]: E0916 04:38:07.820503 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.820630 kubelet[2666]: E0916 04:38:07.820613 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.820630 kubelet[2666]: W0916 04:38:07.820624 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.820630 kubelet[2666]: E0916 04:38:07.820630 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.830166 kubelet[2666]: E0916 04:38:07.830007 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.830166 kubelet[2666]: W0916 04:38:07.830026 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.830166 kubelet[2666]: E0916 04:38:07.830039 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.830166 kubelet[2666]: I0916 04:38:07.830074 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/f6d01e5d-7118-4f45-86dd-a1a5e54fa711-varrun\") pod \"csi-node-driver-5zlcv\" (UID: \"f6d01e5d-7118-4f45-86dd-a1a5e54fa711\") " pod="calico-system/csi-node-driver-5zlcv" Sep 16 04:38:07.830297 kubelet[2666]: E0916 04:38:07.830258 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.830297 kubelet[2666]: W0916 04:38:07.830268 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.830297 kubelet[2666]: E0916 04:38:07.830290 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.830432 kubelet[2666]: I0916 04:38:07.830307 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6d01e5d-7118-4f45-86dd-a1a5e54fa711-socket-dir\") pod \"csi-node-driver-5zlcv\" (UID: \"f6d01e5d-7118-4f45-86dd-a1a5e54fa711\") " pod="calico-system/csi-node-driver-5zlcv" Sep 16 04:38:07.830580 kubelet[2666]: E0916 04:38:07.830548 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.830580 kubelet[2666]: W0916 04:38:07.830566 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.830660 kubelet[2666]: E0916 04:38:07.830588 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.830753 kubelet[2666]: E0916 04:38:07.830739 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.830783 kubelet[2666]: W0916 04:38:07.830752 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.830783 kubelet[2666]: E0916 04:38:07.830765 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.830995 kubelet[2666]: E0916 04:38:07.830977 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.830995 kubelet[2666]: W0916 04:38:07.830993 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.831048 kubelet[2666]: E0916 04:38:07.831010 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.831350 kubelet[2666]: E0916 04:38:07.831259 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.831350 kubelet[2666]: W0916 04:38:07.831276 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.831350 kubelet[2666]: E0916 04:38:07.831293 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.831350 kubelet[2666]: I0916 04:38:07.831313 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mffrf\" (UniqueName: \"kubernetes.io/projected/f6d01e5d-7118-4f45-86dd-a1a5e54fa711-kube-api-access-mffrf\") pod \"csi-node-driver-5zlcv\" (UID: \"f6d01e5d-7118-4f45-86dd-a1a5e54fa711\") " pod="calico-system/csi-node-driver-5zlcv" Sep 16 04:38:07.832161 kubelet[2666]: E0916 04:38:07.831491 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.832161 kubelet[2666]: W0916 04:38:07.831506 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.832161 kubelet[2666]: E0916 04:38:07.831515 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.832161 kubelet[2666]: E0916 04:38:07.831705 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.832161 kubelet[2666]: W0916 04:38:07.831715 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.832161 kubelet[2666]: E0916 04:38:07.831731 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.832161 kubelet[2666]: E0916 04:38:07.831894 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.832161 kubelet[2666]: W0916 04:38:07.831903 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.832161 kubelet[2666]: E0916 04:38:07.831920 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.832488 kubelet[2666]: E0916 04:38:07.832438 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.832488 kubelet[2666]: W0916 04:38:07.832450 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.832488 kubelet[2666]: E0916 04:38:07.832460 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.832583 kubelet[2666]: I0916 04:38:07.832488 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6d01e5d-7118-4f45-86dd-a1a5e54fa711-kubelet-dir\") pod \"csi-node-driver-5zlcv\" (UID: \"f6d01e5d-7118-4f45-86dd-a1a5e54fa711\") " pod="calico-system/csi-node-driver-5zlcv" Sep 16 04:38:07.832713 kubelet[2666]: E0916 04:38:07.832696 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.832713 kubelet[2666]: W0916 04:38:07.832712 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.832769 kubelet[2666]: E0916 04:38:07.832728 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.832769 kubelet[2666]: I0916 04:38:07.832744 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6d01e5d-7118-4f45-86dd-a1a5e54fa711-registration-dir\") pod \"csi-node-driver-5zlcv\" (UID: \"f6d01e5d-7118-4f45-86dd-a1a5e54fa711\") " pod="calico-system/csi-node-driver-5zlcv" Sep 16 04:38:07.833735 kubelet[2666]: E0916 04:38:07.833706 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.833735 kubelet[2666]: W0916 04:38:07.833726 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.833808 kubelet[2666]: E0916 04:38:07.833748 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.833983 kubelet[2666]: E0916 04:38:07.833964 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.833983 kubelet[2666]: W0916 04:38:07.833978 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.834027 kubelet[2666]: E0916 04:38:07.833998 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.834532 kubelet[2666]: E0916 04:38:07.834514 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.834532 kubelet[2666]: W0916 04:38:07.834530 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.834587 kubelet[2666]: E0916 04:38:07.834541 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.834734 kubelet[2666]: E0916 04:38:07.834719 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.834761 kubelet[2666]: W0916 04:38:07.834733 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.834761 kubelet[2666]: E0916 04:38:07.834742 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.852198 containerd[1528]: time="2025-09-16T04:38:07.852156234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54888b5dd5-9cshs,Uid:90b23e7d-70bd-4525-b580-78166a5a66d7,Namespace:calico-system,Attempt:0,} returns sandbox id \"0eba36d49c1a6561e10f7003d9b42f8c3ad5397e0c7f43247dc66491c75a58cc\"" Sep 16 04:38:07.854931 containerd[1528]: time="2025-09-16T04:38:07.854899995Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 16 04:38:07.924940 containerd[1528]: time="2025-09-16T04:38:07.924880707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vdvts,Uid:92dcc2e7-d95a-48d3-8244-c50c82e3c3d4,Namespace:calico-system,Attempt:0,}" Sep 16 04:38:07.933921 kubelet[2666]: E0916 04:38:07.933835 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.933921 kubelet[2666]: W0916 04:38:07.933862 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.933921 kubelet[2666]: E0916 04:38:07.933881 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.934173 kubelet[2666]: E0916 04:38:07.934133 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.934173 kubelet[2666]: W0916 04:38:07.934143 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.934173 kubelet[2666]: E0916 04:38:07.934160 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.934440 kubelet[2666]: E0916 04:38:07.934369 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.934440 kubelet[2666]: W0916 04:38:07.934382 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.934440 kubelet[2666]: E0916 04:38:07.934396 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.934742 kubelet[2666]: E0916 04:38:07.934560 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.934742 kubelet[2666]: W0916 04:38:07.934574 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.934742 kubelet[2666]: E0916 04:38:07.934593 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.934837 kubelet[2666]: E0916 04:38:07.934773 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.934837 kubelet[2666]: W0916 04:38:07.934782 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.934837 kubelet[2666]: E0916 04:38:07.934801 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.935058 kubelet[2666]: E0916 04:38:07.935033 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.935058 kubelet[2666]: W0916 04:38:07.935047 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.935124 kubelet[2666]: E0916 04:38:07.935068 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.935244 kubelet[2666]: E0916 04:38:07.935226 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.935244 kubelet[2666]: W0916 04:38:07.935239 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.935391 kubelet[2666]: E0916 04:38:07.935271 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.935420 kubelet[2666]: E0916 04:38:07.935397 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.935420 kubelet[2666]: W0916 04:38:07.935406 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.935468 kubelet[2666]: E0916 04:38:07.935433 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.935564 kubelet[2666]: E0916 04:38:07.935548 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.935564 kubelet[2666]: W0916 04:38:07.935560 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.935612 kubelet[2666]: E0916 04:38:07.935583 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.935712 kubelet[2666]: E0916 04:38:07.935698 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.935712 kubelet[2666]: W0916 04:38:07.935710 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.935761 kubelet[2666]: E0916 04:38:07.935733 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.935858 kubelet[2666]: E0916 04:38:07.935844 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.935858 kubelet[2666]: W0916 04:38:07.935851 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.935858 kubelet[2666]: E0916 04:38:07.935871 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.936128 kubelet[2666]: E0916 04:38:07.936048 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.936128 kubelet[2666]: W0916 04:38:07.936073 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.936128 kubelet[2666]: E0916 04:38:07.936093 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.936298 kubelet[2666]: E0916 04:38:07.936273 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.936298 kubelet[2666]: W0916 04:38:07.936287 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.936449 kubelet[2666]: E0916 04:38:07.936312 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.936731 kubelet[2666]: E0916 04:38:07.936690 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.936731 kubelet[2666]: W0916 04:38:07.936709 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.936731 kubelet[2666]: E0916 04:38:07.936728 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.939359 kubelet[2666]: E0916 04:38:07.939315 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.939359 kubelet[2666]: W0916 04:38:07.939342 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.940340 kubelet[2666]: E0916 04:38:07.939660 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.940340 kubelet[2666]: E0916 04:38:07.939745 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.940340 kubelet[2666]: W0916 04:38:07.939756 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.940340 kubelet[2666]: E0916 04:38:07.939787 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.940340 kubelet[2666]: E0916 04:38:07.940036 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.940340 kubelet[2666]: W0916 04:38:07.940046 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.940340 kubelet[2666]: E0916 04:38:07.940078 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.940524 kubelet[2666]: E0916 04:38:07.940365 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.940524 kubelet[2666]: W0916 04:38:07.940374 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.940524 kubelet[2666]: E0916 04:38:07.940458 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.940627 kubelet[2666]: E0916 04:38:07.940607 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.940627 kubelet[2666]: W0916 04:38:07.940620 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.940676 kubelet[2666]: E0916 04:38:07.940641 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.940950 kubelet[2666]: E0916 04:38:07.940904 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.941376 kubelet[2666]: W0916 04:38:07.941355 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.941430 kubelet[2666]: E0916 04:38:07.941384 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.942154 kubelet[2666]: E0916 04:38:07.942119 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.942154 kubelet[2666]: W0916 04:38:07.942134 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.942615 kubelet[2666]: E0916 04:38:07.942565 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.943135 kubelet[2666]: E0916 04:38:07.943113 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.943135 kubelet[2666]: W0916 04:38:07.943129 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.943252 kubelet[2666]: E0916 04:38:07.943226 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.943633 kubelet[2666]: E0916 04:38:07.943612 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.943633 kubelet[2666]: W0916 04:38:07.943629 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.944419 kubelet[2666]: E0916 04:38:07.944389 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:07.944913 kubelet[2666]: E0916 04:38:07.944874 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.944913 kubelet[2666]: W0916 04:38:07.944895 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.944913 kubelet[2666]: E0916 04:38:07.944912 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.945544 kubelet[2666]: E0916 04:38:07.945515 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.945544 kubelet[2666]: W0916 04:38:07.945531 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.945544 kubelet[2666]: E0916 04:38:07.945545 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.947800 containerd[1528]: time="2025-09-16T04:38:07.947472837Z" level=info msg="connecting to shim f97fd80de518b1b1ac72d5c218439ee6fef8cc8d98acaeec5047fcb3788b4139" address="unix:///run/containerd/s/0701a3afc69199a184763d73fb9acb7dcffb4cfb3d6f8459ab390f573301954f" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:38:07.958924 kubelet[2666]: E0916 04:38:07.958833 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:07.958924 kubelet[2666]: W0916 04:38:07.958853 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:07.958924 kubelet[2666]: E0916 04:38:07.958870 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:07.993536 systemd[1]: Started cri-containerd-f97fd80de518b1b1ac72d5c218439ee6fef8cc8d98acaeec5047fcb3788b4139.scope - libcontainer container f97fd80de518b1b1ac72d5c218439ee6fef8cc8d98acaeec5047fcb3788b4139. Sep 16 04:38:08.097846 containerd[1528]: time="2025-09-16T04:38:08.097809143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vdvts,Uid:92dcc2e7-d95a-48d3-8244-c50c82e3c3d4,Namespace:calico-system,Attempt:0,} returns sandbox id \"f97fd80de518b1b1ac72d5c218439ee6fef8cc8d98acaeec5047fcb3788b4139\"" Sep 16 04:38:09.098227 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3285664383.mount: Deactivated successfully. 
Sep 16 04:38:09.654238 containerd[1528]: time="2025-09-16T04:38:09.654130800Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:09.655397 containerd[1528]: time="2025-09-16T04:38:09.654879520Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 16 04:38:09.656013 containerd[1528]: time="2025-09-16T04:38:09.655970921Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:09.658374 containerd[1528]: time="2025-09-16T04:38:09.658061762Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:09.659269 containerd[1528]: time="2025-09-16T04:38:09.659241242Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.804094407s" Sep 16 04:38:09.659395 containerd[1528]: time="2025-09-16T04:38:09.659373322Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 16 04:38:09.660991 containerd[1528]: time="2025-09-16T04:38:09.660913323Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 16 04:38:09.683177 containerd[1528]: time="2025-09-16T04:38:09.683087892Z" level=info msg="CreateContainer within sandbox \"0eba36d49c1a6561e10f7003d9b42f8c3ad5397e0c7f43247dc66491c75a58cc\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 16 04:38:09.699503 containerd[1528]: time="2025-09-16T04:38:09.699396819Z" level=info msg="Container b0b48a1094642e4990374938ae780f3ff4878947635a47f94ff57cfaf5e57c7c: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:09.700060 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3629449440.mount: Deactivated successfully. Sep 16 04:38:09.708627 containerd[1528]: time="2025-09-16T04:38:09.708577382Z" level=info msg="CreateContainer within sandbox \"0eba36d49c1a6561e10f7003d9b42f8c3ad5397e0c7f43247dc66491c75a58cc\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"b0b48a1094642e4990374938ae780f3ff4878947635a47f94ff57cfaf5e57c7c\"" Sep 16 04:38:09.709470 containerd[1528]: time="2025-09-16T04:38:09.709322583Z" level=info msg="StartContainer for \"b0b48a1094642e4990374938ae780f3ff4878947635a47f94ff57cfaf5e57c7c\"" Sep 16 04:38:09.710672 containerd[1528]: time="2025-09-16T04:38:09.710634263Z" level=info msg="connecting to shim b0b48a1094642e4990374938ae780f3ff4878947635a47f94ff57cfaf5e57c7c" address="unix:///run/containerd/s/e3d3034efde8c87f5bcc0b443a0ae47fce3b64e3438acaf53bdcd1cbdbee3d0c" protocol=ttrpc version=3 Sep 16 04:38:09.737567 systemd[1]: Started cri-containerd-b0b48a1094642e4990374938ae780f3ff4878947635a47f94ff57cfaf5e57c7c.scope - libcontainer container b0b48a1094642e4990374938ae780f3ff4878947635a47f94ff57cfaf5e57c7c. 
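As a quick sanity check on the typha pull reported just above, the two figures containerd gives (bytes read=33105775 and a pull duration of 1.804094407s) work out to roughly 17.5 MiB/s. The arithmetic, with both values copied from the log and "bytes read" assumed to mean the total bytes transferred for this image:

# Values copied verbatim from the containerd messages above.
bytes_read = 33_105_775      # "active requests=0, bytes read=33105775"
pull_seconds = 1.804094407   # "... in 1.804094407s"

# Average transfer rate for ghcr.io/flatcar/calico/typha:v3.30.3 during this pull.
print(f"{bytes_read / pull_seconds / 2**20:.1f} MiB/s")  # ~17.5 MiB/s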
Sep 16 04:38:09.786951 containerd[1528]: time="2025-09-16T04:38:09.786901455Z" level=info msg="StartContainer for \"b0b48a1094642e4990374938ae780f3ff4878947635a47f94ff57cfaf5e57c7c\" returns successfully" Sep 16 04:38:09.917159 kubelet[2666]: E0916 04:38:09.917044 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5zlcv" podUID="f6d01e5d-7118-4f45-86dd-a1a5e54fa711" Sep 16 04:38:10.018589 kubelet[2666]: I0916 04:38:10.018521 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-54888b5dd5-9cshs" podStartSLOduration=1.211396822 podStartE2EDuration="3.01850491s" podCreationTimestamp="2025-09-16 04:38:07 +0000 UTC" firstStartedPulling="2025-09-16 04:38:07.853362595 +0000 UTC m=+21.039228662" lastFinishedPulling="2025-09-16 04:38:09.660470683 +0000 UTC m=+22.846336750" observedRunningTime="2025-09-16 04:38:10.01847623 +0000 UTC m=+23.204342297" watchObservedRunningTime="2025-09-16 04:38:10.01850491 +0000 UTC m=+23.204370977" Sep 16 04:38:10.035488 kubelet[2666]: E0916 04:38:10.035455 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.035488 kubelet[2666]: W0916 04:38:10.035481 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.035710 kubelet[2666]: E0916 04:38:10.035504 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.035774 kubelet[2666]: E0916 04:38:10.035725 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.035774 kubelet[2666]: W0916 04:38:10.035735 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.035774 kubelet[2666]: E0916 04:38:10.035745 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.037451 kubelet[2666]: E0916 04:38:10.037428 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.037451 kubelet[2666]: W0916 04:38:10.037446 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.037540 kubelet[2666]: E0916 04:38:10.037461 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:10.038351 kubelet[2666]: E0916 04:38:10.038296 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.039482 kubelet[2666]: W0916 04:38:10.038314 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.039482 kubelet[2666]: E0916 04:38:10.039392 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.040441 kubelet[2666]: E0916 04:38:10.040412 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.040441 kubelet[2666]: W0916 04:38:10.040435 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.040540 kubelet[2666]: E0916 04:38:10.040449 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.041057 kubelet[2666]: E0916 04:38:10.041037 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.041057 kubelet[2666]: W0916 04:38:10.041052 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.041362 kubelet[2666]: E0916 04:38:10.041065 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.041362 kubelet[2666]: E0916 04:38:10.041242 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.041362 kubelet[2666]: W0916 04:38:10.041251 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.041362 kubelet[2666]: E0916 04:38:10.041259 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.041568 kubelet[2666]: E0916 04:38:10.041553 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.041568 kubelet[2666]: W0916 04:38:10.041565 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.041634 kubelet[2666]: E0916 04:38:10.041576 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:10.041810 kubelet[2666]: E0916 04:38:10.041779 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.041810 kubelet[2666]: W0916 04:38:10.041791 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.041990 kubelet[2666]: E0916 04:38:10.041804 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.042242 kubelet[2666]: E0916 04:38:10.042226 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.042242 kubelet[2666]: W0916 04:38:10.042239 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.042314 kubelet[2666]: E0916 04:38:10.042250 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.042717 kubelet[2666]: E0916 04:38:10.042672 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.042717 kubelet[2666]: W0916 04:38:10.042710 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.042795 kubelet[2666]: E0916 04:38:10.042722 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.042897 kubelet[2666]: E0916 04:38:10.042872 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.042897 kubelet[2666]: W0916 04:38:10.042884 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.042897 kubelet[2666]: E0916 04:38:10.042892 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.043405 kubelet[2666]: E0916 04:38:10.043114 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.043405 kubelet[2666]: W0916 04:38:10.043124 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.043405 kubelet[2666]: E0916 04:38:10.043133 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:10.043957 kubelet[2666]: E0916 04:38:10.043935 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.043957 kubelet[2666]: W0916 04:38:10.043950 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.044048 kubelet[2666]: E0916 04:38:10.043969 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.044132 kubelet[2666]: E0916 04:38:10.044118 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.044132 kubelet[2666]: W0916 04:38:10.044129 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.044192 kubelet[2666]: E0916 04:38:10.044137 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.051051 kubelet[2666]: E0916 04:38:10.050752 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.051051 kubelet[2666]: W0916 04:38:10.050768 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.051051 kubelet[2666]: E0916 04:38:10.050780 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.052178 kubelet[2666]: E0916 04:38:10.052049 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.052178 kubelet[2666]: W0916 04:38:10.052064 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.052178 kubelet[2666]: E0916 04:38:10.052080 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.052429 kubelet[2666]: E0916 04:38:10.052383 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.052541 kubelet[2666]: W0916 04:38:10.052506 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.052682 kubelet[2666]: E0916 04:38:10.052669 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:10.053246 kubelet[2666]: E0916 04:38:10.053107 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.053246 kubelet[2666]: W0916 04:38:10.053128 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.053246 kubelet[2666]: E0916 04:38:10.053146 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.053465 kubelet[2666]: E0916 04:38:10.053453 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.053524 kubelet[2666]: W0916 04:38:10.053513 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.053682 kubelet[2666]: E0916 04:38:10.053661 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.053919 kubelet[2666]: E0916 04:38:10.053888 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.053919 kubelet[2666]: W0916 04:38:10.053904 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.054366 kubelet[2666]: E0916 04:38:10.054323 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.054366 kubelet[2666]: W0916 04:38:10.054348 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.054366 kubelet[2666]: E0916 04:38:10.054376 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.054366 kubelet[2666]: E0916 04:38:10.054385 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.054686 kubelet[2666]: E0916 04:38:10.054487 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.054686 kubelet[2666]: W0916 04:38:10.054495 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.054686 kubelet[2666]: E0916 04:38:10.054511 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:10.054893 kubelet[2666]: E0916 04:38:10.054880 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.054893 kubelet[2666]: W0916 04:38:10.054892 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.055385 kubelet[2666]: E0916 04:38:10.055351 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.055670 kubelet[2666]: E0916 04:38:10.055607 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.056066 kubelet[2666]: W0916 04:38:10.055984 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.056294 kubelet[2666]: E0916 04:38:10.056143 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.056626 kubelet[2666]: E0916 04:38:10.056542 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.057370 kubelet[2666]: W0916 04:38:10.056710 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.057370 kubelet[2666]: E0916 04:38:10.056854 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.057648 kubelet[2666]: E0916 04:38:10.057618 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.057648 kubelet[2666]: W0916 04:38:10.057633 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.057837 kubelet[2666]: E0916 04:38:10.057800 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.057970 kubelet[2666]: E0916 04:38:10.057958 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.058399 kubelet[2666]: W0916 04:38:10.058379 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.058542 kubelet[2666]: E0916 04:38:10.058515 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:10.058785 kubelet[2666]: E0916 04:38:10.058732 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.058785 kubelet[2666]: W0916 04:38:10.058746 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.058847 kubelet[2666]: E0916 04:38:10.058785 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.059131 kubelet[2666]: E0916 04:38:10.059054 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.059469 kubelet[2666]: W0916 04:38:10.059371 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.059469 kubelet[2666]: E0916 04:38:10.059402 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.060570 kubelet[2666]: E0916 04:38:10.060391 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.060570 kubelet[2666]: W0916 04:38:10.060408 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.060570 kubelet[2666]: E0916 04:38:10.060430 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.060819 kubelet[2666]: E0916 04:38:10.060801 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.060819 kubelet[2666]: W0916 04:38:10.060818 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.060909 kubelet[2666]: E0916 04:38:10.060835 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:38:10.061286 kubelet[2666]: E0916 04:38:10.061266 2666 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:38:10.061286 kubelet[2666]: W0916 04:38:10.061285 2666 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:38:10.061374 kubelet[2666]: E0916 04:38:10.061298 2666 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:38:10.768086 containerd[1528]: time="2025-09-16T04:38:10.768029605Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:10.768566 containerd[1528]: time="2025-09-16T04:38:10.768523245Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 16 04:38:10.769310 containerd[1528]: time="2025-09-16T04:38:10.769279645Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:10.771208 containerd[1528]: time="2025-09-16T04:38:10.771170806Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:10.772084 containerd[1528]: time="2025-09-16T04:38:10.771677166Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.110694283s" Sep 16 04:38:10.772084 containerd[1528]: time="2025-09-16T04:38:10.771703286Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 16 04:38:10.779627 containerd[1528]: time="2025-09-16T04:38:10.779572329Z" level=info msg="CreateContainer within sandbox \"f97fd80de518b1b1ac72d5c218439ee6fef8cc8d98acaeec5047fcb3788b4139\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 16 04:38:10.786495 containerd[1528]: time="2025-09-16T04:38:10.786049092Z" level=info msg="Container cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:10.793145 containerd[1528]: time="2025-09-16T04:38:10.793093574Z" level=info msg="CreateContainer within sandbox \"f97fd80de518b1b1ac72d5c218439ee6fef8cc8d98acaeec5047fcb3788b4139\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76\"" Sep 16 04:38:10.793681 containerd[1528]: time="2025-09-16T04:38:10.793544935Z" level=info msg="StartContainer for \"cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76\"" Sep 16 04:38:10.795828 containerd[1528]: time="2025-09-16T04:38:10.795787815Z" level=info msg="connecting to shim cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76" address="unix:///run/containerd/s/0701a3afc69199a184763d73fb9acb7dcffb4cfb3d6f8459ab390f573301954f" protocol=ttrpc version=3 Sep 16 04:38:10.824562 systemd[1]: Started cri-containerd-cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76.scope - libcontainer container cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76. Sep 16 04:38:10.870477 systemd[1]: cri-containerd-cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76.scope: Deactivated successfully. 
Sep 16 04:38:10.901820 containerd[1528]: time="2025-09-16T04:38:10.901638297Z" level=info msg="StartContainer for \"cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76\" returns successfully" Sep 16 04:38:10.908322 containerd[1528]: time="2025-09-16T04:38:10.908238180Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76\" id:\"cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76\" pid:3366 exited_at:{seconds:1757997490 nanos:907860660}" Sep 16 04:38:10.913525 containerd[1528]: time="2025-09-16T04:38:10.913483902Z" level=info msg="received exit event container_id:\"cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76\" id:\"cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76\" pid:3366 exited_at:{seconds:1757997490 nanos:907860660}" Sep 16 04:38:10.949714 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cb4b75de2409173609ece55b81852d1118e6cb0821de969bc0bc09d9ced09f76-rootfs.mount: Deactivated successfully. Sep 16 04:38:11.031815 kubelet[2666]: I0916 04:38:11.031707 2666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:38:11.917680 kubelet[2666]: E0916 04:38:11.917635 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5zlcv" podUID="f6d01e5d-7118-4f45-86dd-a1a5e54fa711" Sep 16 04:38:12.015754 containerd[1528]: time="2025-09-16T04:38:12.015712999Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 16 04:38:13.918299 kubelet[2666]: E0916 04:38:13.917865 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5zlcv" podUID="f6d01e5d-7118-4f45-86dd-a1a5e54fa711" Sep 16 04:38:15.388568 containerd[1528]: time="2025-09-16T04:38:15.388525119Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:15.389609 containerd[1528]: time="2025-09-16T04:38:15.389584440Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 16 04:38:15.390431 containerd[1528]: time="2025-09-16T04:38:15.390384840Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:15.392275 containerd[1528]: time="2025-09-16T04:38:15.392231841Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:15.393353 containerd[1528]: time="2025-09-16T04:38:15.392803321Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.377050802s" Sep 16 04:38:15.393353 containerd[1528]: 
time="2025-09-16T04:38:15.392833041Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 16 04:38:15.394712 containerd[1528]: time="2025-09-16T04:38:15.394684881Z" level=info msg="CreateContainer within sandbox \"f97fd80de518b1b1ac72d5c218439ee6fef8cc8d98acaeec5047fcb3788b4139\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 16 04:38:15.402859 containerd[1528]: time="2025-09-16T04:38:15.401709724Z" level=info msg="Container 63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:15.413578 containerd[1528]: time="2025-09-16T04:38:15.413534207Z" level=info msg="CreateContainer within sandbox \"f97fd80de518b1b1ac72d5c218439ee6fef8cc8d98acaeec5047fcb3788b4139\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7\"" Sep 16 04:38:15.414999 containerd[1528]: time="2025-09-16T04:38:15.414042488Z" level=info msg="StartContainer for \"63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7\"" Sep 16 04:38:15.416753 containerd[1528]: time="2025-09-16T04:38:15.416716488Z" level=info msg="connecting to shim 63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7" address="unix:///run/containerd/s/0701a3afc69199a184763d73fb9acb7dcffb4cfb3d6f8459ab390f573301954f" protocol=ttrpc version=3 Sep 16 04:38:15.445516 systemd[1]: Started cri-containerd-63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7.scope - libcontainer container 63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7. Sep 16 04:38:15.480396 containerd[1528]: time="2025-09-16T04:38:15.480360469Z" level=info msg="StartContainer for \"63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7\" returns successfully" Sep 16 04:38:15.917231 kubelet[2666]: E0916 04:38:15.917173 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5zlcv" podUID="f6d01e5d-7118-4f45-86dd-a1a5e54fa711" Sep 16 04:38:16.064515 systemd[1]: cri-containerd-63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7.scope: Deactivated successfully. Sep 16 04:38:16.066417 systemd[1]: cri-containerd-63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7.scope: Consumed 475ms CPU time, 176.2M memory peak, 3.1M read from disk, 165.8M written to disk. Sep 16 04:38:16.066566 containerd[1528]: time="2025-09-16T04:38:16.066526416Z" level=info msg="received exit event container_id:\"63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7\" id:\"63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7\" pid:3426 exited_at:{seconds:1757997496 nanos:65546376}" Sep 16 04:38:16.066974 containerd[1528]: time="2025-09-16T04:38:16.066767096Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7\" id:\"63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7\" pid:3426 exited_at:{seconds:1757997496 nanos:65546376}" Sep 16 04:38:16.085263 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-63ddd520efd125d77f7a96d1e1114b4b58c02b3219b8b332db341e8f240caae7-rootfs.mount: Deactivated successfully. 
Sep 16 04:38:16.100962 kubelet[2666]: I0916 04:38:16.100925 2666 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 16 04:38:16.197121 kubelet[2666]: I0916 04:38:16.196999 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp4q8\" (UniqueName: \"kubernetes.io/projected/42a76947-1770-41d9-ba2c-8e5f1b8c14bc-kube-api-access-xp4q8\") pod \"coredns-668d6bf9bc-x2hwg\" (UID: \"42a76947-1770-41d9-ba2c-8e5f1b8c14bc\") " pod="kube-system/coredns-668d6bf9bc-x2hwg" Sep 16 04:38:16.197121 kubelet[2666]: I0916 04:38:16.197060 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn5tp\" (UniqueName: \"kubernetes.io/projected/9922ce3d-2d4e-4090-a23f-cae50ccd6bc0-kube-api-access-rn5tp\") pod \"calico-kube-controllers-779d9c954d-c96p4\" (UID: \"9922ce3d-2d4e-4090-a23f-cae50ccd6bc0\") " pod="calico-system/calico-kube-controllers-779d9c954d-c96p4" Sep 16 04:38:16.197121 kubelet[2666]: I0916 04:38:16.197083 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9922ce3d-2d4e-4090-a23f-cae50ccd6bc0-tigera-ca-bundle\") pod \"calico-kube-controllers-779d9c954d-c96p4\" (UID: \"9922ce3d-2d4e-4090-a23f-cae50ccd6bc0\") " pod="calico-system/calico-kube-controllers-779d9c954d-c96p4" Sep 16 04:38:16.197121 kubelet[2666]: I0916 04:38:16.197099 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42a76947-1770-41d9-ba2c-8e5f1b8c14bc-config-volume\") pod \"coredns-668d6bf9bc-x2hwg\" (UID: \"42a76947-1770-41d9-ba2c-8e5f1b8c14bc\") " pod="kube-system/coredns-668d6bf9bc-x2hwg" Sep 16 04:38:16.228310 systemd[1]: Created slice kubepods-burstable-pod42a76947_1770_41d9_ba2c_8e5f1b8c14bc.slice - libcontainer container kubepods-burstable-pod42a76947_1770_41d9_ba2c_8e5f1b8c14bc.slice. Sep 16 04:38:16.234526 systemd[1]: Created slice kubepods-besteffort-pod9922ce3d_2d4e_4090_a23f_cae50ccd6bc0.slice - libcontainer container kubepods-besteffort-pod9922ce3d_2d4e_4090_a23f_cae50ccd6bc0.slice. Sep 16 04:38:16.239444 systemd[1]: Created slice kubepods-besteffort-pod7187ad0a_ae2a_4b74_a73b_72d3fb6e0068.slice - libcontainer container kubepods-besteffort-pod7187ad0a_ae2a_4b74_a73b_72d3fb6e0068.slice. Sep 16 04:38:16.243726 systemd[1]: Created slice kubepods-besteffort-pod191ab5cb_a255_43f8_99f4_8b4ccf3f8a34.slice - libcontainer container kubepods-besteffort-pod191ab5cb_a255_43f8_99f4_8b4ccf3f8a34.slice. Sep 16 04:38:16.247939 systemd[1]: Created slice kubepods-besteffort-pod9dfceabc_d3cc_47a6_863c_141ab1a44e04.slice - libcontainer container kubepods-besteffort-pod9dfceabc_d3cc_47a6_863c_141ab1a44e04.slice. Sep 16 04:38:16.257834 systemd[1]: Created slice kubepods-besteffort-pod51090495_1f2e_452d_b64d_ba0a752d16a3.slice - libcontainer container kubepods-besteffort-pod51090495_1f2e_452d_b64d_ba0a752d16a3.slice. Sep 16 04:38:16.266089 systemd[1]: Created slice kubepods-burstable-pod610c18ac_cdd1_464e_aa63_481268b282c8.slice - libcontainer container kubepods-burstable-pod610c18ac_cdd1_464e_aa63_481268b282c8.slice. 
Sep 16 04:38:16.297968 kubelet[2666]: I0916 04:38:16.297840 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6nlj\" (UniqueName: \"kubernetes.io/projected/9dfceabc-d3cc-47a6-863c-141ab1a44e04-kube-api-access-x6nlj\") pod \"calico-apiserver-6cfbc464b6-lx4r7\" (UID: \"9dfceabc-d3cc-47a6-863c-141ab1a44e04\") " pod="calico-apiserver/calico-apiserver-6cfbc464b6-lx4r7" Sep 16 04:38:16.297968 kubelet[2666]: I0916 04:38:16.297882 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/610c18ac-cdd1-464e-aa63-481268b282c8-config-volume\") pod \"coredns-668d6bf9bc-lv566\" (UID: \"610c18ac-cdd1-464e-aa63-481268b282c8\") " pod="kube-system/coredns-668d6bf9bc-lv566" Sep 16 04:38:16.297968 kubelet[2666]: I0916 04:38:16.297921 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/191ab5cb-a255-43f8-99f4-8b4ccf3f8a34-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-wczbt\" (UID: \"191ab5cb-a255-43f8-99f4-8b4ccf3f8a34\") " pod="calico-system/goldmane-54d579b49d-wczbt" Sep 16 04:38:16.297968 kubelet[2666]: I0916 04:38:16.297938 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/191ab5cb-a255-43f8-99f4-8b4ccf3f8a34-goldmane-key-pair\") pod \"goldmane-54d579b49d-wczbt\" (UID: \"191ab5cb-a255-43f8-99f4-8b4ccf3f8a34\") " pod="calico-system/goldmane-54d579b49d-wczbt" Sep 16 04:38:16.297968 kubelet[2666]: I0916 04:38:16.297957 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/51090495-1f2e-452d-b64d-ba0a752d16a3-whisker-backend-key-pair\") pod \"whisker-5c5587676-9btsf\" (UID: \"51090495-1f2e-452d-b64d-ba0a752d16a3\") " pod="calico-system/whisker-5c5587676-9btsf" Sep 16 04:38:16.298203 kubelet[2666]: I0916 04:38:16.297987 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvvh2\" (UniqueName: \"kubernetes.io/projected/7187ad0a-ae2a-4b74-a73b-72d3fb6e0068-kube-api-access-fvvh2\") pod \"calico-apiserver-6cfbc464b6-x6s7t\" (UID: \"7187ad0a-ae2a-4b74-a73b-72d3fb6e0068\") " pod="calico-apiserver/calico-apiserver-6cfbc464b6-x6s7t" Sep 16 04:38:16.298203 kubelet[2666]: I0916 04:38:16.298005 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55rtr\" (UniqueName: \"kubernetes.io/projected/610c18ac-cdd1-464e-aa63-481268b282c8-kube-api-access-55rtr\") pod \"coredns-668d6bf9bc-lv566\" (UID: \"610c18ac-cdd1-464e-aa63-481268b282c8\") " pod="kube-system/coredns-668d6bf9bc-lv566" Sep 16 04:38:16.298203 kubelet[2666]: I0916 04:38:16.298021 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lqnz\" (UniqueName: \"kubernetes.io/projected/191ab5cb-a255-43f8-99f4-8b4ccf3f8a34-kube-api-access-6lqnz\") pod \"goldmane-54d579b49d-wczbt\" (UID: \"191ab5cb-a255-43f8-99f4-8b4ccf3f8a34\") " pod="calico-system/goldmane-54d579b49d-wczbt" Sep 16 04:38:16.298203 kubelet[2666]: I0916 04:38:16.298035 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/51090495-1f2e-452d-b64d-ba0a752d16a3-whisker-ca-bundle\") pod \"whisker-5c5587676-9btsf\" (UID: \"51090495-1f2e-452d-b64d-ba0a752d16a3\") " pod="calico-system/whisker-5c5587676-9btsf" Sep 16 04:38:16.298203 kubelet[2666]: I0916 04:38:16.298050 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj74l\" (UniqueName: \"kubernetes.io/projected/51090495-1f2e-452d-b64d-ba0a752d16a3-kube-api-access-cj74l\") pod \"whisker-5c5587676-9btsf\" (UID: \"51090495-1f2e-452d-b64d-ba0a752d16a3\") " pod="calico-system/whisker-5c5587676-9btsf" Sep 16 04:38:16.298311 kubelet[2666]: I0916 04:38:16.298079 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/191ab5cb-a255-43f8-99f4-8b4ccf3f8a34-config\") pod \"goldmane-54d579b49d-wczbt\" (UID: \"191ab5cb-a255-43f8-99f4-8b4ccf3f8a34\") " pod="calico-system/goldmane-54d579b49d-wczbt" Sep 16 04:38:16.298311 kubelet[2666]: I0916 04:38:16.298106 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9dfceabc-d3cc-47a6-863c-141ab1a44e04-calico-apiserver-certs\") pod \"calico-apiserver-6cfbc464b6-lx4r7\" (UID: \"9dfceabc-d3cc-47a6-863c-141ab1a44e04\") " pod="calico-apiserver/calico-apiserver-6cfbc464b6-lx4r7" Sep 16 04:38:16.298311 kubelet[2666]: I0916 04:38:16.298122 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7187ad0a-ae2a-4b74-a73b-72d3fb6e0068-calico-apiserver-certs\") pod \"calico-apiserver-6cfbc464b6-x6s7t\" (UID: \"7187ad0a-ae2a-4b74-a73b-72d3fb6e0068\") " pod="calico-apiserver/calico-apiserver-6cfbc464b6-x6s7t" Sep 16 04:38:16.532258 containerd[1528]: time="2025-09-16T04:38:16.532212640Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x2hwg,Uid:42a76947-1770-41d9-ba2c-8e5f1b8c14bc,Namespace:kube-system,Attempt:0,}" Sep 16 04:38:16.538048 containerd[1528]: time="2025-09-16T04:38:16.537822321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-779d9c954d-c96p4,Uid:9922ce3d-2d4e-4090-a23f-cae50ccd6bc0,Namespace:calico-system,Attempt:0,}" Sep 16 04:38:16.542702 containerd[1528]: time="2025-09-16T04:38:16.542665363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cfbc464b6-x6s7t,Uid:7187ad0a-ae2a-4b74-a73b-72d3fb6e0068,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:38:16.546731 containerd[1528]: time="2025-09-16T04:38:16.546616404Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-wczbt,Uid:191ab5cb-a255-43f8-99f4-8b4ccf3f8a34,Namespace:calico-system,Attempt:0,}" Sep 16 04:38:16.551497 containerd[1528]: time="2025-09-16T04:38:16.551470526Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cfbc464b6-lx4r7,Uid:9dfceabc-d3cc-47a6-863c-141ab1a44e04,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:38:16.564149 containerd[1528]: time="2025-09-16T04:38:16.564117289Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c5587676-9btsf,Uid:51090495-1f2e-452d-b64d-ba0a752d16a3,Namespace:calico-system,Attempt:0,}" Sep 16 04:38:16.588370 containerd[1528]: time="2025-09-16T04:38:16.588317977Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-lv566,Uid:610c18ac-cdd1-464e-aa63-481268b282c8,Namespace:kube-system,Attempt:0,}" Sep 16 04:38:16.644940 containerd[1528]: time="2025-09-16T04:38:16.644884474Z" level=error msg="Failed to destroy network for sandbox \"84de0a52602b363f706aa9042b88a9415fc2c71d82986563df8a132bc454e9c1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.647761 containerd[1528]: time="2025-09-16T04:38:16.647613555Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cfbc464b6-lx4r7,Uid:9dfceabc-d3cc-47a6-863c-141ab1a44e04,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"84de0a52602b363f706aa9042b88a9415fc2c71d82986563df8a132bc454e9c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.648599 containerd[1528]: time="2025-09-16T04:38:16.647957275Z" level=error msg="Failed to destroy network for sandbox \"0d20a2a3b18ba378a5f7cbd6ce7afd0aebc93c7b5669815a1b16620fb3391315\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.650344 containerd[1528]: time="2025-09-16T04:38:16.650293436Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x2hwg,Uid:42a76947-1770-41d9-ba2c-8e5f1b8c14bc,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d20a2a3b18ba378a5f7cbd6ce7afd0aebc93c7b5669815a1b16620fb3391315\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.650865 kubelet[2666]: E0916 04:38:16.650811 2666 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d20a2a3b18ba378a5f7cbd6ce7afd0aebc93c7b5669815a1b16620fb3391315\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.650962 kubelet[2666]: E0916 04:38:16.650934 2666 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d20a2a3b18ba378a5f7cbd6ce7afd0aebc93c7b5669815a1b16620fb3391315\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x2hwg" Sep 16 04:38:16.651092 kubelet[2666]: E0916 04:38:16.650960 2666 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d20a2a3b18ba378a5f7cbd6ce7afd0aebc93c7b5669815a1b16620fb3391315\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x2hwg" Sep 16 04:38:16.651092 kubelet[2666]: E0916 
04:38:16.651005 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-x2hwg_kube-system(42a76947-1770-41d9-ba2c-8e5f1b8c14bc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-x2hwg_kube-system(42a76947-1770-41d9-ba2c-8e5f1b8c14bc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0d20a2a3b18ba378a5f7cbd6ce7afd0aebc93c7b5669815a1b16620fb3391315\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-x2hwg" podUID="42a76947-1770-41d9-ba2c-8e5f1b8c14bc" Sep 16 04:38:16.653171 kubelet[2666]: E0916 04:38:16.653121 2666 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84de0a52602b363f706aa9042b88a9415fc2c71d82986563df8a132bc454e9c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.653370 kubelet[2666]: E0916 04:38:16.653183 2666 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84de0a52602b363f706aa9042b88a9415fc2c71d82986563df8a132bc454e9c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cfbc464b6-lx4r7" Sep 16 04:38:16.653370 kubelet[2666]: E0916 04:38:16.653203 2666 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84de0a52602b363f706aa9042b88a9415fc2c71d82986563df8a132bc454e9c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cfbc464b6-lx4r7" Sep 16 04:38:16.653370 kubelet[2666]: E0916 04:38:16.653290 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6cfbc464b6-lx4r7_calico-apiserver(9dfceabc-d3cc-47a6-863c-141ab1a44e04)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6cfbc464b6-lx4r7_calico-apiserver(9dfceabc-d3cc-47a6-863c-141ab1a44e04)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"84de0a52602b363f706aa9042b88a9415fc2c71d82986563df8a132bc454e9c1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6cfbc464b6-lx4r7" podUID="9dfceabc-d3cc-47a6-863c-141ab1a44e04" Sep 16 04:38:16.673059 containerd[1528]: time="2025-09-16T04:38:16.673013803Z" level=error msg="Failed to destroy network for sandbox \"d8ba94c5db0c074f51c47399bdef1f0b189120b4b1e1a993fef0c8c9149395a5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.674455 containerd[1528]: time="2025-09-16T04:38:16.674373043Z" level=error msg="Failed to destroy 
network for sandbox \"60b8307859d2c06e0c678b2687394674c3919c991ce2f3cbb7d952dceeaed863\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.675466 containerd[1528]: time="2025-09-16T04:38:16.675415404Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-779d9c954d-c96p4,Uid:9922ce3d-2d4e-4090-a23f-cae50ccd6bc0,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8ba94c5db0c074f51c47399bdef1f0b189120b4b1e1a993fef0c8c9149395a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.675675 kubelet[2666]: E0916 04:38:16.675636 2666 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8ba94c5db0c074f51c47399bdef1f0b189120b4b1e1a993fef0c8c9149395a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.675738 kubelet[2666]: E0916 04:38:16.675694 2666 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8ba94c5db0c074f51c47399bdef1f0b189120b4b1e1a993fef0c8c9149395a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-779d9c954d-c96p4" Sep 16 04:38:16.675738 kubelet[2666]: E0916 04:38:16.675718 2666 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8ba94c5db0c074f51c47399bdef1f0b189120b4b1e1a993fef0c8c9149395a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-779d9c954d-c96p4" Sep 16 04:38:16.675893 kubelet[2666]: E0916 04:38:16.675755 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-779d9c954d-c96p4_calico-system(9922ce3d-2d4e-4090-a23f-cae50ccd6bc0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-779d9c954d-c96p4_calico-system(9922ce3d-2d4e-4090-a23f-cae50ccd6bc0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d8ba94c5db0c074f51c47399bdef1f0b189120b4b1e1a993fef0c8c9149395a5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-779d9c954d-c96p4" podUID="9922ce3d-2d4e-4090-a23f-cae50ccd6bc0" Sep 16 04:38:16.677821 containerd[1528]: time="2025-09-16T04:38:16.677235524Z" level=error msg="Failed to destroy network for sandbox \"0a2e846c22dfd11c713b05a4869c5dde3e08a1c474a46a6d8867925347bdd6b6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Sep 16 04:38:16.678243 containerd[1528]: time="2025-09-16T04:38:16.678173365Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c5587676-9btsf,Uid:51090495-1f2e-452d-b64d-ba0a752d16a3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a2e846c22dfd11c713b05a4869c5dde3e08a1c474a46a6d8867925347bdd6b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.678549 kubelet[2666]: E0916 04:38:16.678518 2666 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a2e846c22dfd11c713b05a4869c5dde3e08a1c474a46a6d8867925347bdd6b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.678604 kubelet[2666]: E0916 04:38:16.678565 2666 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a2e846c22dfd11c713b05a4869c5dde3e08a1c474a46a6d8867925347bdd6b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5c5587676-9btsf" Sep 16 04:38:16.678604 kubelet[2666]: E0916 04:38:16.678584 2666 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a2e846c22dfd11c713b05a4869c5dde3e08a1c474a46a6d8867925347bdd6b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5c5587676-9btsf" Sep 16 04:38:16.678669 kubelet[2666]: E0916 04:38:16.678622 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5c5587676-9btsf_calico-system(51090495-1f2e-452d-b64d-ba0a752d16a3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5c5587676-9btsf_calico-system(51090495-1f2e-452d-b64d-ba0a752d16a3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0a2e846c22dfd11c713b05a4869c5dde3e08a1c474a46a6d8867925347bdd6b6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5c5587676-9btsf" podUID="51090495-1f2e-452d-b64d-ba0a752d16a3" Sep 16 04:38:16.679305 kubelet[2666]: E0916 04:38:16.678974 2666 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60b8307859d2c06e0c678b2687394674c3919c991ce2f3cbb7d952dceeaed863\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.679305 kubelet[2666]: E0916 04:38:16.679002 2666 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60b8307859d2c06e0c678b2687394674c3919c991ce2f3cbb7d952dceeaed863\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cfbc464b6-x6s7t" Sep 16 04:38:16.679305 kubelet[2666]: E0916 04:38:16.679015 2666 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60b8307859d2c06e0c678b2687394674c3919c991ce2f3cbb7d952dceeaed863\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cfbc464b6-x6s7t" Sep 16 04:38:16.679468 containerd[1528]: time="2025-09-16T04:38:16.678829525Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cfbc464b6-x6s7t,Uid:7187ad0a-ae2a-4b74-a73b-72d3fb6e0068,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"60b8307859d2c06e0c678b2687394674c3919c991ce2f3cbb7d952dceeaed863\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.679567 kubelet[2666]: E0916 04:38:16.679043 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6cfbc464b6-x6s7t_calico-apiserver(7187ad0a-ae2a-4b74-a73b-72d3fb6e0068)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6cfbc464b6-x6s7t_calico-apiserver(7187ad0a-ae2a-4b74-a73b-72d3fb6e0068)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"60b8307859d2c06e0c678b2687394674c3919c991ce2f3cbb7d952dceeaed863\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6cfbc464b6-x6s7t" podUID="7187ad0a-ae2a-4b74-a73b-72d3fb6e0068" Sep 16 04:38:16.681774 containerd[1528]: time="2025-09-16T04:38:16.681739246Z" level=error msg="Failed to destroy network for sandbox \"47d5dd44b689db4c0d5b21a57b33921580baa103d8b04420cc46465c8189fc63\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.682702 containerd[1528]: time="2025-09-16T04:38:16.682636406Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-lv566,Uid:610c18ac-cdd1-464e-aa63-481268b282c8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"47d5dd44b689db4c0d5b21a57b33921580baa103d8b04420cc46465c8189fc63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.682851 kubelet[2666]: E0916 04:38:16.682819 2666 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47d5dd44b689db4c0d5b21a57b33921580baa103d8b04420cc46465c8189fc63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 16 04:38:16.682851 kubelet[2666]: E0916 04:38:16.682852 2666 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47d5dd44b689db4c0d5b21a57b33921580baa103d8b04420cc46465c8189fc63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-lv566" Sep 16 04:38:16.682979 kubelet[2666]: E0916 04:38:16.682867 2666 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47d5dd44b689db4c0d5b21a57b33921580baa103d8b04420cc46465c8189fc63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-lv566" Sep 16 04:38:16.682979 kubelet[2666]: E0916 04:38:16.682907 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-lv566_kube-system(610c18ac-cdd1-464e-aa63-481268b282c8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-lv566_kube-system(610c18ac-cdd1-464e-aa63-481268b282c8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"47d5dd44b689db4c0d5b21a57b33921580baa103d8b04420cc46465c8189fc63\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-lv566" podUID="610c18ac-cdd1-464e-aa63-481268b282c8" Sep 16 04:38:16.687693 containerd[1528]: time="2025-09-16T04:38:16.687658168Z" level=error msg="Failed to destroy network for sandbox \"1d22e2b9c5779c7a263f24f3f2220f9f113da22c1fdc458231c287083773ad2a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.688650 containerd[1528]: time="2025-09-16T04:38:16.688586768Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-wczbt,Uid:191ab5cb-a255-43f8-99f4-8b4ccf3f8a34,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d22e2b9c5779c7a263f24f3f2220f9f113da22c1fdc458231c287083773ad2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.688783 kubelet[2666]: E0916 04:38:16.688752 2666 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d22e2b9c5779c7a263f24f3f2220f9f113da22c1fdc458231c287083773ad2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:16.688818 kubelet[2666]: E0916 04:38:16.688793 2666 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d22e2b9c5779c7a263f24f3f2220f9f113da22c1fdc458231c287083773ad2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-wczbt" Sep 16 04:38:16.688818 kubelet[2666]: E0916 04:38:16.688809 2666 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d22e2b9c5779c7a263f24f3f2220f9f113da22c1fdc458231c287083773ad2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-wczbt" Sep 16 04:38:16.688879 kubelet[2666]: E0916 04:38:16.688854 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-wczbt_calico-system(191ab5cb-a255-43f8-99f4-8b4ccf3f8a34)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-wczbt_calico-system(191ab5cb-a255-43f8-99f4-8b4ccf3f8a34)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1d22e2b9c5779c7a263f24f3f2220f9f113da22c1fdc458231c287083773ad2a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-wczbt" podUID="191ab5cb-a255-43f8-99f4-8b4ccf3f8a34" Sep 16 04:38:17.045742 containerd[1528]: time="2025-09-16T04:38:17.045509677Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 16 04:38:17.327741 kubelet[2666]: I0916 04:38:17.327618 2666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:38:17.922504 systemd[1]: Created slice kubepods-besteffort-podf6d01e5d_7118_4f45_86dd_a1a5e54fa711.slice - libcontainer container kubepods-besteffort-podf6d01e5d_7118_4f45_86dd_a1a5e54fa711.slice. Sep 16 04:38:17.924694 containerd[1528]: time="2025-09-16T04:38:17.924653259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5zlcv,Uid:f6d01e5d-7118-4f45-86dd-a1a5e54fa711,Namespace:calico-system,Attempt:0,}" Sep 16 04:38:17.973231 containerd[1528]: time="2025-09-16T04:38:17.973187833Z" level=error msg="Failed to destroy network for sandbox \"6109d1bd9766fb6a2ed82d6f2893237e141575f36f4ac3cda686aa3eb59c3e4c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:17.975155 systemd[1]: run-netns-cni\x2d59ee7ef4\x2db4d3\x2dd700\x2d1295\x2d8f850729ab73.mount: Deactivated successfully. 
Sep 16 04:38:17.976271 containerd[1528]: time="2025-09-16T04:38:17.976208394Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5zlcv,Uid:f6d01e5d-7118-4f45-86dd-a1a5e54fa711,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6109d1bd9766fb6a2ed82d6f2893237e141575f36f4ac3cda686aa3eb59c3e4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:17.977269 kubelet[2666]: E0916 04:38:17.977236 2666 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6109d1bd9766fb6a2ed82d6f2893237e141575f36f4ac3cda686aa3eb59c3e4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:38:17.977433 kubelet[2666]: E0916 04:38:17.977386 2666 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6109d1bd9766fb6a2ed82d6f2893237e141575f36f4ac3cda686aa3eb59c3e4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5zlcv" Sep 16 04:38:17.977433 kubelet[2666]: E0916 04:38:17.977410 2666 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6109d1bd9766fb6a2ed82d6f2893237e141575f36f4ac3cda686aa3eb59c3e4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5zlcv" Sep 16 04:38:17.977586 kubelet[2666]: E0916 04:38:17.977555 2666 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-5zlcv_calico-system(f6d01e5d-7118-4f45-86dd-a1a5e54fa711)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-5zlcv_calico-system(f6d01e5d-7118-4f45-86dd-a1a5e54fa711)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6109d1bd9766fb6a2ed82d6f2893237e141575f36f4ac3cda686aa3eb59c3e4c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-5zlcv" podUID="f6d01e5d-7118-4f45-86dd-a1a5e54fa711" Sep 16 04:38:21.074042 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3245921617.mount: Deactivated successfully. 
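Every sandbox failure above (coredns, goldmane, csi-node-driver) has the same root cause: the Calico CNI plugin stats /var/lib/calico/nodename, and that file only exists once the calico/node container is running and has mounted /var/lib/calico into place. A minimal sketch of that readiness check in Go (illustrative only, not the plugin's actual source; the error text mirrors the log lines above):

```go
package main

import (
	"errors"
	"fmt"
	"os"
	"strings"
)

// nodenameFile is written by calico/node once it is up and has
// bind-mounted /var/lib/calico where the CNI plugin can see it.
const nodenameFile = "/var/lib/calico/nodename"

// readNodename mirrors the failure mode in the log: if the file is
// absent, calico/node is not ready and any CNI ADD/DEL must fail.
func readNodename() (string, error) {
	if _, err := os.Stat(nodenameFile); errors.Is(err, os.ErrNotExist) {
		return "", fmt.Errorf("stat %s: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/", nodenameFile)
	}
	b, err := os.ReadFile(nodenameFile)
	if err != nil {
		return "", err
	}
	return strings.TrimSpace(string(b)), nil
}

func main() {
	name, err := readNodename()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("calico nodename:", name)
}
```

Once calico-node starts (the image pull and StartContainer at 04:38:21 immediately below), the file appears and the later CNI ADDs in this log go through.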
Sep 16 04:38:21.344550 containerd[1528]: time="2025-09-16T04:38:21.322073636Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 16 04:38:21.344550 containerd[1528]: time="2025-09-16T04:38:21.325131997Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.279374279s" Sep 16 04:38:21.344550 containerd[1528]: time="2025-09-16T04:38:21.344514922Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 16 04:38:21.345096 containerd[1528]: time="2025-09-16T04:38:21.325374677Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:21.345249 containerd[1528]: time="2025-09-16T04:38:21.345136642Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:21.345656 containerd[1528]: time="2025-09-16T04:38:21.345618682Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:21.352122 containerd[1528]: time="2025-09-16T04:38:21.352088763Z" level=info msg="CreateContainer within sandbox \"f97fd80de518b1b1ac72d5c218439ee6fef8cc8d98acaeec5047fcb3788b4139\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 16 04:38:21.365490 containerd[1528]: time="2025-09-16T04:38:21.365448407Z" level=info msg="Container a7460af17ed3f3f09445393609deb801bfc1529536d29d6ae6285906770bc708: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:21.386805 containerd[1528]: time="2025-09-16T04:38:21.386682532Z" level=info msg="CreateContainer within sandbox \"f97fd80de518b1b1ac72d5c218439ee6fef8cc8d98acaeec5047fcb3788b4139\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"a7460af17ed3f3f09445393609deb801bfc1529536d29d6ae6285906770bc708\"" Sep 16 04:38:21.387779 containerd[1528]: time="2025-09-16T04:38:21.387201173Z" level=info msg="StartContainer for \"a7460af17ed3f3f09445393609deb801bfc1529536d29d6ae6285906770bc708\"" Sep 16 04:38:21.389670 containerd[1528]: time="2025-09-16T04:38:21.389635453Z" level=info msg="connecting to shim a7460af17ed3f3f09445393609deb801bfc1529536d29d6ae6285906770bc708" address="unix:///run/containerd/s/0701a3afc69199a184763d73fb9acb7dcffb4cfb3d6f8459ab390f573301954f" protocol=ttrpc version=3 Sep 16 04:38:21.413483 systemd[1]: Started cri-containerd-a7460af17ed3f3f09445393609deb801bfc1529536d29d6ae6285906770bc708.scope - libcontainer container a7460af17ed3f3f09445393609deb801bfc1529536d29d6ae6285906770bc708. Sep 16 04:38:21.449504 containerd[1528]: time="2025-09-16T04:38:21.449400709Z" level=info msg="StartContainer for \"a7460af17ed3f3f09445393609deb801bfc1529536d29d6ae6285906770bc708\" returns successfully" Sep 16 04:38:21.561060 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 16 04:38:21.561166 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Sep 16 04:38:21.739769 kubelet[2666]: I0916 04:38:21.739650 2666 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51090495-1f2e-452d-b64d-ba0a752d16a3-whisker-ca-bundle\") pod \"51090495-1f2e-452d-b64d-ba0a752d16a3\" (UID: \"51090495-1f2e-452d-b64d-ba0a752d16a3\") " Sep 16 04:38:21.741918 kubelet[2666]: I0916 04:38:21.739797 2666 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj74l\" (UniqueName: \"kubernetes.io/projected/51090495-1f2e-452d-b64d-ba0a752d16a3-kube-api-access-cj74l\") pod \"51090495-1f2e-452d-b64d-ba0a752d16a3\" (UID: \"51090495-1f2e-452d-b64d-ba0a752d16a3\") " Sep 16 04:38:21.741918 kubelet[2666]: I0916 04:38:21.739827 2666 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/51090495-1f2e-452d-b64d-ba0a752d16a3-whisker-backend-key-pair\") pod \"51090495-1f2e-452d-b64d-ba0a752d16a3\" (UID: \"51090495-1f2e-452d-b64d-ba0a752d16a3\") " Sep 16 04:38:21.747544 kubelet[2666]: I0916 04:38:21.747478 2666 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51090495-1f2e-452d-b64d-ba0a752d16a3-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "51090495-1f2e-452d-b64d-ba0a752d16a3" (UID: "51090495-1f2e-452d-b64d-ba0a752d16a3"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 16 04:38:21.747639 kubelet[2666]: I0916 04:38:21.747599 2666 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51090495-1f2e-452d-b64d-ba0a752d16a3-kube-api-access-cj74l" (OuterVolumeSpecName: "kube-api-access-cj74l") pod "51090495-1f2e-452d-b64d-ba0a752d16a3" (UID: "51090495-1f2e-452d-b64d-ba0a752d16a3"). InnerVolumeSpecName "kube-api-access-cj74l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 16 04:38:21.747769 kubelet[2666]: I0916 04:38:21.747738 2666 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51090495-1f2e-452d-b64d-ba0a752d16a3-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "51090495-1f2e-452d-b64d-ba0a752d16a3" (UID: "51090495-1f2e-452d-b64d-ba0a752d16a3"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 16 04:38:21.841058 kubelet[2666]: I0916 04:38:21.841005 2666 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51090495-1f2e-452d-b64d-ba0a752d16a3-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 16 04:38:21.841058 kubelet[2666]: I0916 04:38:21.841040 2666 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cj74l\" (UniqueName: \"kubernetes.io/projected/51090495-1f2e-452d-b64d-ba0a752d16a3-kube-api-access-cj74l\") on node \"localhost\" DevicePath \"\"" Sep 16 04:38:21.841058 kubelet[2666]: I0916 04:38:21.841051 2666 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/51090495-1f2e-452d-b64d-ba0a752d16a3-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 16 04:38:22.074734 systemd[1]: var-lib-kubelet-pods-51090495\x2d1f2e\x2d452d\x2db64d\x2dba0a752d16a3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dcj74l.mount: Deactivated successfully. 
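The mount units systemd reports here, such as var-lib-kubelet-pods-51090495\x2d…-kube\x2dapi\x2daccess\x2dcj74l.mount, are kubelet volume paths encoded with systemd's unit-name escaping: "/" becomes "-", while literal "-" and "~" become \x2d and \x7e. A small decoder that recovers the path from the unit name (a rough sketch of what systemd-escape --unescape --path does, not systemd's implementation):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// unescapeUnitPath reverses systemd's path escaping as seen in the mount
// unit names above: "-" separates path components and "\xNN" encodes the
// literal byte NN (so "\x2d" is a real "-" and "\x7e" a "~").
func unescapeUnitPath(name string) string {
	name = strings.TrimSuffix(name, ".mount")
	var out strings.Builder
	out.WriteByte('/')
	for i := 0; i < len(name); i++ {
		switch {
		case name[i] == '-':
			out.WriteByte('/')
		case name[i] == '\\' && i+3 < len(name) && name[i+1] == 'x':
			if b, err := strconv.ParseUint(name[i+2:i+4], 16, 8); err == nil {
				out.WriteByte(byte(b))
				i += 3 // the loop's i++ completes the 4-character escape
				continue
			}
			out.WriteByte(name[i])
		default:
			out.WriteByte(name[i])
		}
	}
	return out.String()
}

func main() {
	unit := `var-lib-kubelet-pods-51090495\x2d1f2e\x2d452d\x2db64d\x2dba0a752d16a3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dcj74l.mount`
	fmt.Println(unescapeUnitPath(unit))
}
```

Decoding the unit above yields /var/lib/kubelet/pods/51090495-1f2e-452d-b64d-ba0a752d16a3/volumes/kubernetes.io~projected/kube-api-access-cj74l, i.e. the kube-api-access volume the reconciler just unmounted.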
Sep 16 04:38:22.074823 systemd[1]: var-lib-kubelet-pods-51090495\x2d1f2e\x2d452d\x2db64d\x2dba0a752d16a3-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 16 04:38:22.078796 systemd[1]: Removed slice kubepods-besteffort-pod51090495_1f2e_452d_b64d_ba0a752d16a3.slice - libcontainer container kubepods-besteffort-pod51090495_1f2e_452d_b64d_ba0a752d16a3.slice. Sep 16 04:38:22.090420 kubelet[2666]: I0916 04:38:22.090351 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-vdvts" podStartSLOduration=1.845864336 podStartE2EDuration="15.090309714s" podCreationTimestamp="2025-09-16 04:38:07 +0000 UTC" firstStartedPulling="2025-09-16 04:38:08.101416344 +0000 UTC m=+21.287282411" lastFinishedPulling="2025-09-16 04:38:21.345861762 +0000 UTC m=+34.531727789" observedRunningTime="2025-09-16 04:38:22.089770954 +0000 UTC m=+35.275637061" watchObservedRunningTime="2025-09-16 04:38:22.090309714 +0000 UTC m=+35.276175741" Sep 16 04:38:22.154163 systemd[1]: Created slice kubepods-besteffort-pod2d588d46_316f_4970_abfa_f9b3337c51ff.slice - libcontainer container kubepods-besteffort-pod2d588d46_316f_4970_abfa_f9b3337c51ff.slice. Sep 16 04:38:22.243857 kubelet[2666]: I0916 04:38:22.243820 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfmqx\" (UniqueName: \"kubernetes.io/projected/2d588d46-316f-4970-abfa-f9b3337c51ff-kube-api-access-tfmqx\") pod \"whisker-67899b97c9-lvw9t\" (UID: \"2d588d46-316f-4970-abfa-f9b3337c51ff\") " pod="calico-system/whisker-67899b97c9-lvw9t" Sep 16 04:38:22.243857 kubelet[2666]: I0916 04:38:22.243867 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d588d46-316f-4970-abfa-f9b3337c51ff-whisker-ca-bundle\") pod \"whisker-67899b97c9-lvw9t\" (UID: \"2d588d46-316f-4970-abfa-f9b3337c51ff\") " pod="calico-system/whisker-67899b97c9-lvw9t" Sep 16 04:38:22.244020 kubelet[2666]: I0916 04:38:22.243930 2666 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2d588d46-316f-4970-abfa-f9b3337c51ff-whisker-backend-key-pair\") pod \"whisker-67899b97c9-lvw9t\" (UID: \"2d588d46-316f-4970-abfa-f9b3337c51ff\") " pod="calico-system/whisker-67899b97c9-lvw9t" Sep 16 04:38:22.458944 containerd[1528]: time="2025-09-16T04:38:22.458584926Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-67899b97c9-lvw9t,Uid:2d588d46-316f-4970-abfa-f9b3337c51ff,Namespace:calico-system,Attempt:0,}" Sep 16 04:38:22.603916 systemd-networkd[1456]: calia044cff72ba: Link UP Sep 16 04:38:22.604342 systemd-networkd[1456]: calia044cff72ba: Gained carrier Sep 16 04:38:22.621513 containerd[1528]: 2025-09-16 04:38:22.479 [INFO][3804] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 16 04:38:22.621513 containerd[1528]: 2025-09-16 04:38:22.506 [INFO][3804] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--67899b97c9--lvw9t-eth0 whisker-67899b97c9- calico-system 2d588d46-316f-4970-abfa-f9b3337c51ff 859 0 2025-09-16 04:38:22 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:67899b97c9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} 
{k8s localhost whisker-67899b97c9-lvw9t eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calia044cff72ba [] [] }} ContainerID="41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" Namespace="calico-system" Pod="whisker-67899b97c9-lvw9t" WorkloadEndpoint="localhost-k8s-whisker--67899b97c9--lvw9t-" Sep 16 04:38:22.621513 containerd[1528]: 2025-09-16 04:38:22.506 [INFO][3804] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" Namespace="calico-system" Pod="whisker-67899b97c9-lvw9t" WorkloadEndpoint="localhost-k8s-whisker--67899b97c9--lvw9t-eth0" Sep 16 04:38:22.621513 containerd[1528]: 2025-09-16 04:38:22.561 [INFO][3818] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" HandleID="k8s-pod-network.41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" Workload="localhost-k8s-whisker--67899b97c9--lvw9t-eth0" Sep 16 04:38:22.621720 containerd[1528]: 2025-09-16 04:38:22.561 [INFO][3818] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" HandleID="k8s-pod-network.41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" Workload="localhost-k8s-whisker--67899b97c9--lvw9t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137ba0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-67899b97c9-lvw9t", "timestamp":"2025-09-16 04:38:22.561706232 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:38:22.621720 containerd[1528]: 2025-09-16 04:38:22.561 [INFO][3818] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:38:22.621720 containerd[1528]: 2025-09-16 04:38:22.562 [INFO][3818] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:38:22.621720 containerd[1528]: 2025-09-16 04:38:22.562 [INFO][3818] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 16 04:38:22.621720 containerd[1528]: 2025-09-16 04:38:22.572 [INFO][3818] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" host="localhost" Sep 16 04:38:22.621720 containerd[1528]: 2025-09-16 04:38:22.577 [INFO][3818] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 16 04:38:22.621720 containerd[1528]: 2025-09-16 04:38:22.581 [INFO][3818] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 16 04:38:22.621720 containerd[1528]: 2025-09-16 04:38:22.582 [INFO][3818] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:22.621720 containerd[1528]: 2025-09-16 04:38:22.584 [INFO][3818] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:22.621720 containerd[1528]: 2025-09-16 04:38:22.584 [INFO][3818] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" host="localhost" Sep 16 04:38:22.621925 containerd[1528]: 2025-09-16 04:38:22.586 [INFO][3818] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7 Sep 16 04:38:22.621925 containerd[1528]: 2025-09-16 04:38:22.589 [INFO][3818] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" host="localhost" Sep 16 04:38:22.621925 containerd[1528]: 2025-09-16 04:38:22.594 [INFO][3818] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" host="localhost" Sep 16 04:38:22.621925 containerd[1528]: 2025-09-16 04:38:22.594 [INFO][3818] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" host="localhost" Sep 16 04:38:22.621925 containerd[1528]: 2025-09-16 04:38:22.594 [INFO][3818] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
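The IPAM trace above is Calico's normal per-node flow: take the host-wide lock, confirm this host's affinity to block 192.168.88.128/26, then claim the next free address from it. This pod gets 192.168.88.129, and the pods created later in this log get .130 and .131 from the same block. A /26 holds 64 addresses; the nth one can be computed as in this sketch (IPv4 only, illustrative, not Calico's allocator):

```go
package main

import (
	"fmt"
	"net/netip"
)

// nthAddr returns the nth address (0-based) inside an IPv4 prefix, e.g.
// nthAddr("192.168.88.128/26", 1) == 192.168.88.129.
func nthAddr(prefix string, n int) (netip.Addr, error) {
	p, err := netip.ParsePrefix(prefix)
	if err != nil {
		return netip.Addr{}, err
	}
	size := 1 << (32 - p.Bits()) // 64 addresses for a /26
	if n < 0 || n >= size {
		return netip.Addr{}, fmt.Errorf("ordinal %d outside block of %d addresses", n, size)
	}
	a := p.Masked().Addr()
	for i := 0; i < n; i++ {
		a = a.Next()
	}
	return a, nil
}

func main() {
	for _, n := range []int{1, 2, 3} {
		a, _ := nthAddr("192.168.88.128/26", n)
		fmt.Printf("ordinal %d -> %s\n", n, a) // .129, .130, .131 as in this log
	}
}
```

The block's first address (.128) is plausibly held by the node itself for the vxlan.calico tunnel that comes up at 04:38:23, which would explain why pod allocations start at .129.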
Sep 16 04:38:22.621925 containerd[1528]: 2025-09-16 04:38:22.594 [INFO][3818] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" HandleID="k8s-pod-network.41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" Workload="localhost-k8s-whisker--67899b97c9--lvw9t-eth0" Sep 16 04:38:22.622029 containerd[1528]: 2025-09-16 04:38:22.596 [INFO][3804] cni-plugin/k8s.go 418: Populated endpoint ContainerID="41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" Namespace="calico-system" Pod="whisker-67899b97c9-lvw9t" WorkloadEndpoint="localhost-k8s-whisker--67899b97c9--lvw9t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--67899b97c9--lvw9t-eth0", GenerateName:"whisker-67899b97c9-", Namespace:"calico-system", SelfLink:"", UID:"2d588d46-316f-4970-abfa-f9b3337c51ff", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 38, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"67899b97c9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-67899b97c9-lvw9t", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calia044cff72ba", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:22.622029 containerd[1528]: 2025-09-16 04:38:22.596 [INFO][3804] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" Namespace="calico-system" Pod="whisker-67899b97c9-lvw9t" WorkloadEndpoint="localhost-k8s-whisker--67899b97c9--lvw9t-eth0" Sep 16 04:38:22.622095 containerd[1528]: 2025-09-16 04:38:22.597 [INFO][3804] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia044cff72ba ContainerID="41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" Namespace="calico-system" Pod="whisker-67899b97c9-lvw9t" WorkloadEndpoint="localhost-k8s-whisker--67899b97c9--lvw9t-eth0" Sep 16 04:38:22.622095 containerd[1528]: 2025-09-16 04:38:22.605 [INFO][3804] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" Namespace="calico-system" Pod="whisker-67899b97c9-lvw9t" WorkloadEndpoint="localhost-k8s-whisker--67899b97c9--lvw9t-eth0" Sep 16 04:38:22.622132 containerd[1528]: 2025-09-16 04:38:22.605 [INFO][3804] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" Namespace="calico-system" Pod="whisker-67899b97c9-lvw9t" WorkloadEndpoint="localhost-k8s-whisker--67899b97c9--lvw9t-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--67899b97c9--lvw9t-eth0", GenerateName:"whisker-67899b97c9-", Namespace:"calico-system", SelfLink:"", UID:"2d588d46-316f-4970-abfa-f9b3337c51ff", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 38, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"67899b97c9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7", Pod:"whisker-67899b97c9-lvw9t", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calia044cff72ba", MAC:"8e:e5:bc:7e:f7:5c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:22.622179 containerd[1528]: 2025-09-16 04:38:22.617 [INFO][3804] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" Namespace="calico-system" Pod="whisker-67899b97c9-lvw9t" WorkloadEndpoint="localhost-k8s-whisker--67899b97c9--lvw9t-eth0" Sep 16 04:38:22.681266 containerd[1528]: time="2025-09-16T04:38:22.681045462Z" level=info msg="connecting to shim 41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7" address="unix:///run/containerd/s/6567df4d4183a24352aee00eff77c89ca189b39685596ef7368a2525b34ecd94" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:38:22.709489 systemd[1]: Started cri-containerd-41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7.scope - libcontainer container 41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7. 
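"connecting to shim … protocol=ttrpc version=3" followed by systemd starting a cri-containerd-<id>.scope is containerd's usual sandbox start path under the CRI. The resulting containers can also be inspected out of band with containerd's Go client; a minimal sketch, assuming the containerd 1.x client module and read access to /run/containerd/containerd.sock:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Kubernetes-managed sandboxes and containers live in the "k8s.io" namespace,
	// which is why the shim connection above carries namespace=k8s.io.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	containers, err := client.Containers(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range containers {
		if img, err := c.Image(ctx); err == nil {
			fmt.Println(c.ID(), img.Name())
		} else {
			fmt.Println(c.ID())
		}
	}
}
```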
Sep 16 04:38:22.719846 systemd-resolved[1349]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 16 04:38:22.745355 containerd[1528]: time="2025-09-16T04:38:22.745219238Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-67899b97c9-lvw9t,Uid:2d588d46-316f-4970-abfa-f9b3337c51ff,Namespace:calico-system,Attempt:0,} returns sandbox id \"41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7\"" Sep 16 04:38:22.750855 containerd[1528]: time="2025-09-16T04:38:22.750407519Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 16 04:38:22.920163 kubelet[2666]: I0916 04:38:22.920114 2666 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51090495-1f2e-452d-b64d-ba0a752d16a3" path="/var/lib/kubelet/pods/51090495-1f2e-452d-b64d-ba0a752d16a3/volumes" Sep 16 04:38:23.075166 kubelet[2666]: I0916 04:38:23.075129 2666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:38:23.339103 systemd-networkd[1456]: vxlan.calico: Link UP Sep 16 04:38:23.339111 systemd-networkd[1456]: vxlan.calico: Gained carrier Sep 16 04:38:23.911349 containerd[1528]: time="2025-09-16T04:38:23.911290323Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:23.912241 containerd[1528]: time="2025-09-16T04:38:23.912195604Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 16 04:38:23.912970 containerd[1528]: time="2025-09-16T04:38:23.912940284Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:23.915186 containerd[1528]: time="2025-09-16T04:38:23.915153844Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:23.915806 containerd[1528]: time="2025-09-16T04:38:23.915772685Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.165228526s" Sep 16 04:38:23.915806 containerd[1528]: time="2025-09-16T04:38:23.915803365Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 16 04:38:23.917844 containerd[1528]: time="2025-09-16T04:38:23.917817165Z" level=info msg="CreateContainer within sandbox \"41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 16 04:38:23.925168 containerd[1528]: time="2025-09-16T04:38:23.923616246Z" level=info msg="Container 33a3b989964c0fdb22ba6e98f05578be266db8d4f78f06172eb98780f531fb7e: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:23.934086 containerd[1528]: time="2025-09-16T04:38:23.933963929Z" level=info msg="CreateContainer within sandbox \"41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id 
\"33a3b989964c0fdb22ba6e98f05578be266db8d4f78f06172eb98780f531fb7e\"" Sep 16 04:38:23.934443 containerd[1528]: time="2025-09-16T04:38:23.934418609Z" level=info msg="StartContainer for \"33a3b989964c0fdb22ba6e98f05578be266db8d4f78f06172eb98780f531fb7e\"" Sep 16 04:38:23.935616 containerd[1528]: time="2025-09-16T04:38:23.935591729Z" level=info msg="connecting to shim 33a3b989964c0fdb22ba6e98f05578be266db8d4f78f06172eb98780f531fb7e" address="unix:///run/containerd/s/6567df4d4183a24352aee00eff77c89ca189b39685596ef7368a2525b34ecd94" protocol=ttrpc version=3 Sep 16 04:38:23.958495 systemd[1]: Started cri-containerd-33a3b989964c0fdb22ba6e98f05578be266db8d4f78f06172eb98780f531fb7e.scope - libcontainer container 33a3b989964c0fdb22ba6e98f05578be266db8d4f78f06172eb98780f531fb7e. Sep 16 04:38:24.006632 containerd[1528]: time="2025-09-16T04:38:24.006574947Z" level=info msg="StartContainer for \"33a3b989964c0fdb22ba6e98f05578be266db8d4f78f06172eb98780f531fb7e\" returns successfully" Sep 16 04:38:24.007559 containerd[1528]: time="2025-09-16T04:38:24.007527827Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 16 04:38:24.547552 systemd-networkd[1456]: calia044cff72ba: Gained IPv6LL Sep 16 04:38:24.611470 systemd-networkd[1456]: vxlan.calico: Gained IPv6LL Sep 16 04:38:25.545468 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1333433968.mount: Deactivated successfully. Sep 16 04:38:25.565134 containerd[1528]: time="2025-09-16T04:38:25.565090631Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:25.566047 containerd[1528]: time="2025-09-16T04:38:25.565887151Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 16 04:38:25.566918 containerd[1528]: time="2025-09-16T04:38:25.566882191Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:25.569144 containerd[1528]: time="2025-09-16T04:38:25.569098991Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:25.569818 containerd[1528]: time="2025-09-16T04:38:25.569777432Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.562203245s" Sep 16 04:38:25.569922 containerd[1528]: time="2025-09-16T04:38:25.569906072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 16 04:38:25.572402 containerd[1528]: time="2025-09-16T04:38:25.571852952Z" level=info msg="CreateContainer within sandbox \"41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 16 04:38:25.583352 containerd[1528]: time="2025-09-16T04:38:25.583168555Z" level=info msg="Container 
f4aaff15a546348b6d21856c13a852f402b4106c06acd96ffe059e4a1691affa: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:25.590461 containerd[1528]: time="2025-09-16T04:38:25.590423996Z" level=info msg="CreateContainer within sandbox \"41a8f1eac80b53515109402749a3a8e485888319e31252631a2fd1e370eb30c7\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"f4aaff15a546348b6d21856c13a852f402b4106c06acd96ffe059e4a1691affa\"" Sep 16 04:38:25.591253 containerd[1528]: time="2025-09-16T04:38:25.591172036Z" level=info msg="StartContainer for \"f4aaff15a546348b6d21856c13a852f402b4106c06acd96ffe059e4a1691affa\"" Sep 16 04:38:25.592840 containerd[1528]: time="2025-09-16T04:38:25.592393157Z" level=info msg="connecting to shim f4aaff15a546348b6d21856c13a852f402b4106c06acd96ffe059e4a1691affa" address="unix:///run/containerd/s/6567df4d4183a24352aee00eff77c89ca189b39685596ef7368a2525b34ecd94" protocol=ttrpc version=3 Sep 16 04:38:25.620550 systemd[1]: Started cri-containerd-f4aaff15a546348b6d21856c13a852f402b4106c06acd96ffe059e4a1691affa.scope - libcontainer container f4aaff15a546348b6d21856c13a852f402b4106c06acd96ffe059e4a1691affa. Sep 16 04:38:25.654842 containerd[1528]: time="2025-09-16T04:38:25.654806611Z" level=info msg="StartContainer for \"f4aaff15a546348b6d21856c13a852f402b4106c06acd96ffe059e4a1691affa\" returns successfully" Sep 16 04:38:26.096200 kubelet[2666]: I0916 04:38:26.096125 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-67899b97c9-lvw9t" podStartSLOduration=1.275739599 podStartE2EDuration="4.096108312s" podCreationTimestamp="2025-09-16 04:38:22 +0000 UTC" firstStartedPulling="2025-09-16 04:38:22.750218559 +0000 UTC m=+35.936084626" lastFinishedPulling="2025-09-16 04:38:25.570587312 +0000 UTC m=+38.756453339" observedRunningTime="2025-09-16 04:38:26.095794752 +0000 UTC m=+39.281660819" watchObservedRunningTime="2025-09-16 04:38:26.096108312 +0000 UTC m=+39.281974379" Sep 16 04:38:27.917978 containerd[1528]: time="2025-09-16T04:38:27.917930672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-779d9c954d-c96p4,Uid:9922ce3d-2d4e-4090-a23f-cae50ccd6bc0,Namespace:calico-system,Attempt:0,}" Sep 16 04:38:28.068585 systemd-networkd[1456]: cali25b44902991: Link UP Sep 16 04:38:28.068754 systemd-networkd[1456]: cali25b44902991: Gained carrier Sep 16 04:38:28.085588 containerd[1528]: 2025-09-16 04:38:27.972 [INFO][4166] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0 calico-kube-controllers-779d9c954d- calico-system 9922ce3d-2d4e-4090-a23f-cae50ccd6bc0 784 0 2025-09-16 04:38:07 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:779d9c954d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-779d9c954d-c96p4 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali25b44902991 [] [] }} ContainerID="4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" Namespace="calico-system" Pod="calico-kube-controllers-779d9c954d-c96p4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-" Sep 16 04:38:28.085588 containerd[1528]: 2025-09-16 04:38:27.972 [INFO][4166] cni-plugin/k8s.go 74: Extracted identifiers 
for CmdAddK8s ContainerID="4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" Namespace="calico-system" Pod="calico-kube-controllers-779d9c954d-c96p4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0" Sep 16 04:38:28.085588 containerd[1528]: 2025-09-16 04:38:28.003 [INFO][4181] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" HandleID="k8s-pod-network.4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" Workload="localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0" Sep 16 04:38:28.085803 containerd[1528]: 2025-09-16 04:38:28.003 [INFO][4181] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" HandleID="k8s-pod-network.4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" Workload="localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000596fb0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-779d9c954d-c96p4", "timestamp":"2025-09-16 04:38:28.003814291 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:38:28.085803 containerd[1528]: 2025-09-16 04:38:28.004 [INFO][4181] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:38:28.085803 containerd[1528]: 2025-09-16 04:38:28.004 [INFO][4181] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:38:28.085803 containerd[1528]: 2025-09-16 04:38:28.004 [INFO][4181] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 16 04:38:28.085803 containerd[1528]: 2025-09-16 04:38:28.014 [INFO][4181] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" host="localhost" Sep 16 04:38:28.085803 containerd[1528]: 2025-09-16 04:38:28.039 [INFO][4181] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 16 04:38:28.085803 containerd[1528]: 2025-09-16 04:38:28.044 [INFO][4181] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 16 04:38:28.085803 containerd[1528]: 2025-09-16 04:38:28.046 [INFO][4181] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:28.085803 containerd[1528]: 2025-09-16 04:38:28.048 [INFO][4181] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:28.085803 containerd[1528]: 2025-09-16 04:38:28.048 [INFO][4181] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" host="localhost" Sep 16 04:38:28.086057 containerd[1528]: 2025-09-16 04:38:28.050 [INFO][4181] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1 Sep 16 04:38:28.086057 containerd[1528]: 2025-09-16 04:38:28.059 [INFO][4181] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" host="localhost" Sep 16 
04:38:28.086057 containerd[1528]: 2025-09-16 04:38:28.064 [INFO][4181] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" host="localhost" Sep 16 04:38:28.086057 containerd[1528]: 2025-09-16 04:38:28.064 [INFO][4181] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" host="localhost" Sep 16 04:38:28.086057 containerd[1528]: 2025-09-16 04:38:28.064 [INFO][4181] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:38:28.086057 containerd[1528]: 2025-09-16 04:38:28.064 [INFO][4181] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" HandleID="k8s-pod-network.4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" Workload="localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0" Sep 16 04:38:28.086170 containerd[1528]: 2025-09-16 04:38:28.066 [INFO][4166] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" Namespace="calico-system" Pod="calico-kube-controllers-779d9c954d-c96p4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0", GenerateName:"calico-kube-controllers-779d9c954d-", Namespace:"calico-system", SelfLink:"", UID:"9922ce3d-2d4e-4090-a23f-cae50ccd6bc0", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 38, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"779d9c954d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-779d9c954d-c96p4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali25b44902991", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:28.086227 containerd[1528]: 2025-09-16 04:38:28.066 [INFO][4166] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" Namespace="calico-system" Pod="calico-kube-controllers-779d9c954d-c96p4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0" Sep 16 04:38:28.086227 containerd[1528]: 2025-09-16 04:38:28.066 [INFO][4166] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali25b44902991 
ContainerID="4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" Namespace="calico-system" Pod="calico-kube-controllers-779d9c954d-c96p4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0" Sep 16 04:38:28.086227 containerd[1528]: 2025-09-16 04:38:28.068 [INFO][4166] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" Namespace="calico-system" Pod="calico-kube-controllers-779d9c954d-c96p4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0" Sep 16 04:38:28.086412 containerd[1528]: 2025-09-16 04:38:28.069 [INFO][4166] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" Namespace="calico-system" Pod="calico-kube-controllers-779d9c954d-c96p4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0", GenerateName:"calico-kube-controllers-779d9c954d-", Namespace:"calico-system", SelfLink:"", UID:"9922ce3d-2d4e-4090-a23f-cae50ccd6bc0", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 38, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"779d9c954d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1", Pod:"calico-kube-controllers-779d9c954d-c96p4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali25b44902991", MAC:"22:29:2a:a8:3c:11", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:28.086470 containerd[1528]: 2025-09-16 04:38:28.080 [INFO][4166] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" Namespace="calico-system" Pod="calico-kube-controllers-779d9c954d-c96p4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--779d9c954d--c96p4-eth0" Sep 16 04:38:28.125755 containerd[1528]: time="2025-09-16T04:38:28.125708237Z" level=info msg="connecting to shim 4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1" address="unix:///run/containerd/s/6dbb6841318cacc42fe1666f9315761be7ab6351a657e92e0a56ba1d37760baa" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:38:28.155494 systemd[1]: Started cri-containerd-4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1.scope - libcontainer container 4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1. 
Sep 16 04:38:28.167788 systemd-resolved[1349]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 16 04:38:28.221434 containerd[1528]: time="2025-09-16T04:38:28.221306017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-779d9c954d-c96p4,Uid:9922ce3d-2d4e-4090-a23f-cae50ccd6bc0,Namespace:calico-system,Attempt:0,} returns sandbox id \"4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1\"" Sep 16 04:38:28.227057 containerd[1528]: time="2025-09-16T04:38:28.226988378Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 16 04:38:28.676569 systemd[1]: Started sshd@7-10.0.0.111:22-10.0.0.1:54866.service - OpenSSH per-connection server daemon (10.0.0.1:54866). Sep 16 04:38:28.728394 sshd[4257]: Accepted publickey for core from 10.0.0.1 port 54866 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:38:28.729940 sshd-session[4257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:28.734414 systemd-logind[1510]: New session 8 of user core. Sep 16 04:38:28.741520 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 16 04:38:28.924868 sshd[4260]: Connection closed by 10.0.0.1 port 54866 Sep 16 04:38:28.925582 sshd-session[4257]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:28.929725 systemd[1]: sshd@7-10.0.0.111:22-10.0.0.1:54866.service: Deactivated successfully. Sep 16 04:38:28.932376 systemd[1]: session-8.scope: Deactivated successfully. Sep 16 04:38:28.933418 systemd-logind[1510]: Session 8 logged out. Waiting for processes to exit. Sep 16 04:38:28.934621 systemd-logind[1510]: Removed session 8. Sep 16 04:38:29.795564 systemd-networkd[1456]: cali25b44902991: Gained IPv6LL Sep 16 04:38:29.918569 containerd[1528]: time="2025-09-16T04:38:29.918422171Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-wczbt,Uid:191ab5cb-a255-43f8-99f4-8b4ccf3f8a34,Namespace:calico-system,Attempt:0,}" Sep 16 04:38:29.918569 containerd[1528]: time="2025-09-16T04:38:29.918465171Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cfbc464b6-x6s7t,Uid:7187ad0a-ae2a-4b74-a73b-72d3fb6e0068,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:38:29.985591 containerd[1528]: time="2025-09-16T04:38:29.985547865Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:29.986714 containerd[1528]: time="2025-09-16T04:38:29.986683505Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 16 04:38:29.987438 containerd[1528]: time="2025-09-16T04:38:29.987395745Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:29.992647 containerd[1528]: time="2025-09-16T04:38:29.991713506Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:29.994786 containerd[1528]: time="2025-09-16T04:38:29.994735067Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 1.767656329s" Sep 16 04:38:29.994786 containerd[1528]: time="2025-09-16T04:38:29.994772427Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 16 04:38:30.017450 containerd[1528]: time="2025-09-16T04:38:30.017410671Z" level=info msg="CreateContainer within sandbox \"4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 16 04:38:30.026571 containerd[1528]: time="2025-09-16T04:38:30.026523793Z" level=info msg="Container ee49afc90ba0420064cafd2f9751138e4672ef961c988712964cae1ae7faaab0: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:30.036385 containerd[1528]: time="2025-09-16T04:38:30.035029075Z" level=info msg="CreateContainer within sandbox \"4aa0a81da07bb782131b5e555473aeca6268285bbe6af064f88bfebbd99322c1\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"ee49afc90ba0420064cafd2f9751138e4672ef961c988712964cae1ae7faaab0\"" Sep 16 04:38:30.036385 containerd[1528]: time="2025-09-16T04:38:30.035813355Z" level=info msg="StartContainer for \"ee49afc90ba0420064cafd2f9751138e4672ef961c988712964cae1ae7faaab0\"" Sep 16 04:38:30.037567 containerd[1528]: time="2025-09-16T04:38:30.037533555Z" level=info msg="connecting to shim ee49afc90ba0420064cafd2f9751138e4672ef961c988712964cae1ae7faaab0" address="unix:///run/containerd/s/6dbb6841318cacc42fe1666f9315761be7ab6351a657e92e0a56ba1d37760baa" protocol=ttrpc version=3 Sep 16 04:38:30.065550 systemd[1]: Started cri-containerd-ee49afc90ba0420064cafd2f9751138e4672ef961c988712964cae1ae7faaab0.scope - libcontainer container ee49afc90ba0420064cafd2f9751138e4672ef961c988712964cae1ae7faaab0. 
Sep 16 04:38:30.088363 systemd-networkd[1456]: calif80647bf8a5: Link UP Sep 16 04:38:30.088738 systemd-networkd[1456]: calif80647bf8a5: Gained carrier Sep 16 04:38:30.113942 containerd[1528]: 2025-09-16 04:38:29.989 [INFO][4282] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0 calico-apiserver-6cfbc464b6- calico-apiserver 7187ad0a-ae2a-4b74-a73b-72d3fb6e0068 789 0 2025-09-16 04:38:02 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6cfbc464b6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6cfbc464b6-x6s7t eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif80647bf8a5 [] [] }} ContainerID="19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-x6s7t" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-" Sep 16 04:38:30.113942 containerd[1528]: 2025-09-16 04:38:29.989 [INFO][4282] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-x6s7t" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0" Sep 16 04:38:30.113942 containerd[1528]: 2025-09-16 04:38:30.037 [INFO][4311] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" HandleID="k8s-pod-network.19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" Workload="localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0" Sep 16 04:38:30.114137 containerd[1528]: 2025-09-16 04:38:30.037 [INFO][4311] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" HandleID="k8s-pod-network.19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" Workload="localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400012f5f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6cfbc464b6-x6s7t", "timestamp":"2025-09-16 04:38:30.037355595 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:38:30.114137 containerd[1528]: 2025-09-16 04:38:30.037 [INFO][4311] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:38:30.114137 containerd[1528]: 2025-09-16 04:38:30.037 [INFO][4311] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:38:30.114137 containerd[1528]: 2025-09-16 04:38:30.037 [INFO][4311] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 16 04:38:30.114137 containerd[1528]: 2025-09-16 04:38:30.048 [INFO][4311] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" host="localhost" Sep 16 04:38:30.114137 containerd[1528]: 2025-09-16 04:38:30.057 [INFO][4311] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 16 04:38:30.114137 containerd[1528]: 2025-09-16 04:38:30.063 [INFO][4311] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 16 04:38:30.114137 containerd[1528]: 2025-09-16 04:38:30.065 [INFO][4311] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:30.114137 containerd[1528]: 2025-09-16 04:38:30.068 [INFO][4311] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:30.114137 containerd[1528]: 2025-09-16 04:38:30.068 [INFO][4311] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" host="localhost" Sep 16 04:38:30.114354 containerd[1528]: 2025-09-16 04:38:30.071 [INFO][4311] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9 Sep 16 04:38:30.114354 containerd[1528]: 2025-09-16 04:38:30.074 [INFO][4311] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" host="localhost" Sep 16 04:38:30.114354 containerd[1528]: 2025-09-16 04:38:30.079 [INFO][4311] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" host="localhost" Sep 16 04:38:30.114354 containerd[1528]: 2025-09-16 04:38:30.079 [INFO][4311] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" host="localhost" Sep 16 04:38:30.114354 containerd[1528]: 2025-09-16 04:38:30.080 [INFO][4311] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:38:30.114354 containerd[1528]: 2025-09-16 04:38:30.080 [INFO][4311] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" HandleID="k8s-pod-network.19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" Workload="localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0" Sep 16 04:38:30.114475 containerd[1528]: 2025-09-16 04:38:30.086 [INFO][4282] cni-plugin/k8s.go 418: Populated endpoint ContainerID="19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-x6s7t" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0", GenerateName:"calico-apiserver-6cfbc464b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"7187ad0a-ae2a-4b74-a73b-72d3fb6e0068", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 38, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6cfbc464b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6cfbc464b6-x6s7t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif80647bf8a5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:30.114527 containerd[1528]: 2025-09-16 04:38:30.086 [INFO][4282] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-x6s7t" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0" Sep 16 04:38:30.114527 containerd[1528]: 2025-09-16 04:38:30.086 [INFO][4282] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif80647bf8a5 ContainerID="19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-x6s7t" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0" Sep 16 04:38:30.114527 containerd[1528]: 2025-09-16 04:38:30.088 [INFO][4282] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-x6s7t" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0" Sep 16 04:38:30.114587 containerd[1528]: 2025-09-16 04:38:30.092 [INFO][4282] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-x6s7t" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0", GenerateName:"calico-apiserver-6cfbc464b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"7187ad0a-ae2a-4b74-a73b-72d3fb6e0068", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 38, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6cfbc464b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9", Pod:"calico-apiserver-6cfbc464b6-x6s7t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif80647bf8a5", MAC:"6a:bb:35:70:e8:a5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:30.114647 containerd[1528]: 2025-09-16 04:38:30.108 [INFO][4282] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-x6s7t" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--x6s7t-eth0" Sep 16 04:38:30.123934 containerd[1528]: time="2025-09-16T04:38:30.123890973Z" level=info msg="StartContainer for \"ee49afc90ba0420064cafd2f9751138e4672ef961c988712964cae1ae7faaab0\" returns successfully" Sep 16 04:38:30.136268 containerd[1528]: time="2025-09-16T04:38:30.136180575Z" level=info msg="connecting to shim 19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9" address="unix:///run/containerd/s/c81a3220a8d9d5f5cfdf14962987fcc17234bbc08c09c0218f8c90cd92500061" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:38:30.165591 systemd[1]: Started cri-containerd-19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9.scope - libcontainer container 19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9. 
Sep 16 04:38:30.184872 systemd-resolved[1349]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 16 04:38:30.187362 systemd-networkd[1456]: califc2e123d3cb: Link UP Sep 16 04:38:30.187749 systemd-networkd[1456]: califc2e123d3cb: Gained carrier Sep 16 04:38:30.211087 containerd[1528]: 2025-09-16 04:38:29.997 [INFO][4279] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--wczbt-eth0 goldmane-54d579b49d- calico-system 191ab5cb-a255-43f8-99f4-8b4ccf3f8a34 791 0 2025-09-16 04:38:07 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-wczbt eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] califc2e123d3cb [] [] }} ContainerID="9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" Namespace="calico-system" Pod="goldmane-54d579b49d-wczbt" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wczbt-" Sep 16 04:38:30.211087 containerd[1528]: 2025-09-16 04:38:29.997 [INFO][4279] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" Namespace="calico-system" Pod="goldmane-54d579b49d-wczbt" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wczbt-eth0" Sep 16 04:38:30.211087 containerd[1528]: 2025-09-16 04:38:30.051 [INFO][4319] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" HandleID="k8s-pod-network.9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" Workload="localhost-k8s-goldmane--54d579b49d--wczbt-eth0" Sep 16 04:38:30.214012 containerd[1528]: 2025-09-16 04:38:30.051 [INFO][4319] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" HandleID="k8s-pod-network.9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" Workload="localhost-k8s-goldmane--54d579b49d--wczbt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000580f10), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-wczbt", "timestamp":"2025-09-16 04:38:30.051118678 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:38:30.214012 containerd[1528]: 2025-09-16 04:38:30.051 [INFO][4319] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:38:30.214012 containerd[1528]: 2025-09-16 04:38:30.080 [INFO][4319] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:38:30.214012 containerd[1528]: 2025-09-16 04:38:30.080 [INFO][4319] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 16 04:38:30.214012 containerd[1528]: 2025-09-16 04:38:30.148 [INFO][4319] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" host="localhost" Sep 16 04:38:30.214012 containerd[1528]: 2025-09-16 04:38:30.156 [INFO][4319] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 16 04:38:30.214012 containerd[1528]: 2025-09-16 04:38:30.164 [INFO][4319] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 16 04:38:30.214012 containerd[1528]: 2025-09-16 04:38:30.166 [INFO][4319] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:30.214012 containerd[1528]: 2025-09-16 04:38:30.169 [INFO][4319] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:30.214012 containerd[1528]: 2025-09-16 04:38:30.169 [INFO][4319] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" host="localhost" Sep 16 04:38:30.214215 containerd[1528]: 2025-09-16 04:38:30.171 [INFO][4319] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4 Sep 16 04:38:30.214215 containerd[1528]: 2025-09-16 04:38:30.175 [INFO][4319] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" host="localhost" Sep 16 04:38:30.214215 containerd[1528]: 2025-09-16 04:38:30.181 [INFO][4319] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" host="localhost" Sep 16 04:38:30.214215 containerd[1528]: 2025-09-16 04:38:30.181 [INFO][4319] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" host="localhost" Sep 16 04:38:30.214215 containerd[1528]: 2025-09-16 04:38:30.181 [INFO][4319] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:38:30.214215 containerd[1528]: 2025-09-16 04:38:30.181 [INFO][4319] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" HandleID="k8s-pod-network.9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" Workload="localhost-k8s-goldmane--54d579b49d--wczbt-eth0" Sep 16 04:38:30.214348 containerd[1528]: 2025-09-16 04:38:30.184 [INFO][4279] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" Namespace="calico-system" Pod="goldmane-54d579b49d-wczbt" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wczbt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--wczbt-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"191ab5cb-a255-43f8-99f4-8b4ccf3f8a34", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 38, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-wczbt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califc2e123d3cb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:30.214348 containerd[1528]: 2025-09-16 04:38:30.184 [INFO][4279] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" Namespace="calico-system" Pod="goldmane-54d579b49d-wczbt" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wczbt-eth0" Sep 16 04:38:30.214592 containerd[1528]: 2025-09-16 04:38:30.184 [INFO][4279] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califc2e123d3cb ContainerID="9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" Namespace="calico-system" Pod="goldmane-54d579b49d-wczbt" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wczbt-eth0" Sep 16 04:38:30.214592 containerd[1528]: 2025-09-16 04:38:30.188 [INFO][4279] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" Namespace="calico-system" Pod="goldmane-54d579b49d-wczbt" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wczbt-eth0" Sep 16 04:38:30.214645 containerd[1528]: 2025-09-16 04:38:30.190 [INFO][4279] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" Namespace="calico-system" Pod="goldmane-54d579b49d-wczbt" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wczbt-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--wczbt-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"191ab5cb-a255-43f8-99f4-8b4ccf3f8a34", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 38, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4", Pod:"goldmane-54d579b49d-wczbt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califc2e123d3cb", MAC:"ea:d3:ca:38:d6:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:30.214702 containerd[1528]: 2025-09-16 04:38:30.207 [INFO][4279] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" Namespace="calico-system" Pod="goldmane-54d579b49d-wczbt" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--wczbt-eth0" Sep 16 04:38:30.228868 containerd[1528]: time="2025-09-16T04:38:30.228822954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cfbc464b6-x6s7t,Uid:7187ad0a-ae2a-4b74-a73b-72d3fb6e0068,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9\"" Sep 16 04:38:30.230286 containerd[1528]: time="2025-09-16T04:38:30.230246554Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 16 04:38:30.242784 containerd[1528]: time="2025-09-16T04:38:30.242731037Z" level=info msg="connecting to shim 9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4" address="unix:///run/containerd/s/c0e2cd838e80bbaf60c19e430ec1c72bbb79222884d2fc90fb426a6b75f30cb1" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:38:30.275493 systemd[1]: Started cri-containerd-9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4.scope - libcontainer container 9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4. 
Sep 16 04:38:30.286397 systemd-resolved[1349]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 16 04:38:30.307811 containerd[1528]: time="2025-09-16T04:38:30.307758130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-wczbt,Uid:191ab5cb-a255-43f8-99f4-8b4ccf3f8a34,Namespace:calico-system,Attempt:0,} returns sandbox id \"9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4\"" Sep 16 04:38:30.919769 containerd[1528]: time="2025-09-16T04:38:30.919620733Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-lv566,Uid:610c18ac-cdd1-464e-aa63-481268b282c8,Namespace:kube-system,Attempt:0,}" Sep 16 04:38:30.920158 containerd[1528]: time="2025-09-16T04:38:30.919942093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cfbc464b6-lx4r7,Uid:9dfceabc-d3cc-47a6-863c-141ab1a44e04,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:38:31.051997 systemd-networkd[1456]: cali263b75f44e3: Link UP Sep 16 04:38:31.052532 systemd-networkd[1456]: cali263b75f44e3: Gained carrier Sep 16 04:38:31.098462 containerd[1528]: 2025-09-16 04:38:30.973 [INFO][4488] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0 calico-apiserver-6cfbc464b6- calico-apiserver 9dfceabc-d3cc-47a6-863c-141ab1a44e04 793 0 2025-09-16 04:38:02 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6cfbc464b6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6cfbc464b6-lx4r7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali263b75f44e3 [] [] }} ContainerID="4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-lx4r7" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-" Sep 16 04:38:31.098462 containerd[1528]: 2025-09-16 04:38:30.974 [INFO][4488] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-lx4r7" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0" Sep 16 04:38:31.098462 containerd[1528]: 2025-09-16 04:38:31.006 [INFO][4504] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" HandleID="k8s-pod-network.4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" Workload="localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0" Sep 16 04:38:31.098699 containerd[1528]: 2025-09-16 04:38:31.006 [INFO][4504] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" HandleID="k8s-pod-network.4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" Workload="localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d7b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6cfbc464b6-lx4r7", "timestamp":"2025-09-16 04:38:31.006002031 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:38:31.098699 containerd[1528]: 2025-09-16 04:38:31.006 [INFO][4504] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:38:31.098699 containerd[1528]: 2025-09-16 04:38:31.006 [INFO][4504] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:38:31.098699 containerd[1528]: 2025-09-16 04:38:31.006 [INFO][4504] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 16 04:38:31.098699 containerd[1528]: 2025-09-16 04:38:31.017 [INFO][4504] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" host="localhost" Sep 16 04:38:31.098699 containerd[1528]: 2025-09-16 04:38:31.022 [INFO][4504] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 16 04:38:31.098699 containerd[1528]: 2025-09-16 04:38:31.027 [INFO][4504] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 16 04:38:31.098699 containerd[1528]: 2025-09-16 04:38:31.029 [INFO][4504] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:31.098699 containerd[1528]: 2025-09-16 04:38:31.032 [INFO][4504] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:31.098699 containerd[1528]: 2025-09-16 04:38:31.032 [INFO][4504] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" host="localhost" Sep 16 04:38:31.098928 containerd[1528]: 2025-09-16 04:38:31.034 [INFO][4504] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0 Sep 16 04:38:31.098928 containerd[1528]: 2025-09-16 04:38:31.039 [INFO][4504] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" host="localhost" Sep 16 04:38:31.098928 containerd[1528]: 2025-09-16 04:38:31.045 [INFO][4504] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" host="localhost" Sep 16 04:38:31.098928 containerd[1528]: 2025-09-16 04:38:31.045 [INFO][4504] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" host="localhost" Sep 16 04:38:31.098928 containerd[1528]: 2025-09-16 04:38:31.045 [INFO][4504] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:38:31.098928 containerd[1528]: 2025-09-16 04:38:31.045 [INFO][4504] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" HandleID="k8s-pod-network.4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" Workload="localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0" Sep 16 04:38:31.099285 containerd[1528]: 2025-09-16 04:38:31.048 [INFO][4488] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-lx4r7" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0", GenerateName:"calico-apiserver-6cfbc464b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"9dfceabc-d3cc-47a6-863c-141ab1a44e04", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 38, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6cfbc464b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6cfbc464b6-lx4r7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali263b75f44e3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:31.099410 containerd[1528]: 2025-09-16 04:38:31.048 [INFO][4488] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-lx4r7" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0" Sep 16 04:38:31.099410 containerd[1528]: 2025-09-16 04:38:31.048 [INFO][4488] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali263b75f44e3 ContainerID="4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-lx4r7" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0" Sep 16 04:38:31.099410 containerd[1528]: 2025-09-16 04:38:31.052 [INFO][4488] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-lx4r7" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0" Sep 16 04:38:31.099483 containerd[1528]: 2025-09-16 04:38:31.053 [INFO][4488] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-lx4r7" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0", GenerateName:"calico-apiserver-6cfbc464b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"9dfceabc-d3cc-47a6-863c-141ab1a44e04", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 38, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6cfbc464b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0", Pod:"calico-apiserver-6cfbc464b6-lx4r7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali263b75f44e3", MAC:"ae:98:a6:8e:0c:9f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:31.099538 containerd[1528]: 2025-09-16 04:38:31.095 [INFO][4488] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" Namespace="calico-apiserver" Pod="calico-apiserver-6cfbc464b6-lx4r7" WorkloadEndpoint="localhost-k8s-calico--apiserver--6cfbc464b6--lx4r7-eth0" Sep 16 04:38:31.137068 kubelet[2666]: I0916 04:38:31.136953 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-779d9c954d-c96p4" podStartSLOduration=22.364215287 podStartE2EDuration="24.136933496s" podCreationTimestamp="2025-09-16 04:38:07 +0000 UTC" firstStartedPulling="2025-09-16 04:38:28.224535378 +0000 UTC m=+41.410401445" lastFinishedPulling="2025-09-16 04:38:29.997253587 +0000 UTC m=+43.183119654" observedRunningTime="2025-09-16 04:38:31.135423856 +0000 UTC m=+44.321289923" watchObservedRunningTime="2025-09-16 04:38:31.136933496 +0000 UTC m=+44.322799563" Sep 16 04:38:31.139004 containerd[1528]: time="2025-09-16T04:38:31.138964417Z" level=info msg="connecting to shim 4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0" address="unix:///run/containerd/s/c91efa7cb4232418585e53ca83a35d0f3399bbbb5c865963c6d1de1a881a76f2" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:38:31.173381 systemd-networkd[1456]: cali2c4f2484259: Link UP Sep 16 04:38:31.173765 systemd-networkd[1456]: cali2c4f2484259: Gained carrier Sep 16 04:38:31.180289 systemd[1]: Started cri-containerd-4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0.scope - libcontainer container 4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0. 
Sep 16 04:38:31.195428 containerd[1528]: 2025-09-16 04:38:30.994 [INFO][4477] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--lv566-eth0 coredns-668d6bf9bc- kube-system 610c18ac-cdd1-464e-aa63-481268b282c8 792 0 2025-09-16 04:37:53 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-lv566 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2c4f2484259 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" Namespace="kube-system" Pod="coredns-668d6bf9bc-lv566" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--lv566-" Sep 16 04:38:31.195428 containerd[1528]: 2025-09-16 04:38:30.994 [INFO][4477] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" Namespace="kube-system" Pod="coredns-668d6bf9bc-lv566" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--lv566-eth0" Sep 16 04:38:31.195428 containerd[1528]: 2025-09-16 04:38:31.024 [INFO][4512] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" HandleID="k8s-pod-network.093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" Workload="localhost-k8s-coredns--668d6bf9bc--lv566-eth0" Sep 16 04:38:31.195633 containerd[1528]: 2025-09-16 04:38:31.024 [INFO][4512] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" HandleID="k8s-pod-network.093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" Workload="localhost-k8s-coredns--668d6bf9bc--lv566-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400035df20), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-lv566", "timestamp":"2025-09-16 04:38:31.024030954 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:38:31.195633 containerd[1528]: 2025-09-16 04:38:31.024 [INFO][4512] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:38:31.195633 containerd[1528]: 2025-09-16 04:38:31.046 [INFO][4512] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:38:31.195633 containerd[1528]: 2025-09-16 04:38:31.047 [INFO][4512] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 16 04:38:31.195633 containerd[1528]: 2025-09-16 04:38:31.117 [INFO][4512] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" host="localhost" Sep 16 04:38:31.195633 containerd[1528]: 2025-09-16 04:38:31.125 [INFO][4512] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 16 04:38:31.195633 containerd[1528]: 2025-09-16 04:38:31.141 [INFO][4512] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 16 04:38:31.195633 containerd[1528]: 2025-09-16 04:38:31.144 [INFO][4512] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:31.195633 containerd[1528]: 2025-09-16 04:38:31.146 [INFO][4512] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:31.195633 containerd[1528]: 2025-09-16 04:38:31.146 [INFO][4512] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" host="localhost" Sep 16 04:38:31.195894 containerd[1528]: 2025-09-16 04:38:31.147 [INFO][4512] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83 Sep 16 04:38:31.195894 containerd[1528]: 2025-09-16 04:38:31.155 [INFO][4512] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" host="localhost" Sep 16 04:38:31.195894 containerd[1528]: 2025-09-16 04:38:31.166 [INFO][4512] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" host="localhost" Sep 16 04:38:31.195894 containerd[1528]: 2025-09-16 04:38:31.166 [INFO][4512] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" host="localhost" Sep 16 04:38:31.195894 containerd[1528]: 2025-09-16 04:38:31.166 [INFO][4512] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:38:31.195894 containerd[1528]: 2025-09-16 04:38:31.166 [INFO][4512] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" HandleID="k8s-pod-network.093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" Workload="localhost-k8s-coredns--668d6bf9bc--lv566-eth0" Sep 16 04:38:31.196588 containerd[1528]: 2025-09-16 04:38:31.170 [INFO][4477] cni-plugin/k8s.go 418: Populated endpoint ContainerID="093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" Namespace="kube-system" Pod="coredns-668d6bf9bc-lv566" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--lv566-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--lv566-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"610c18ac-cdd1-464e-aa63-481268b282c8", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 37, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-lv566", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2c4f2484259", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:31.196671 containerd[1528]: 2025-09-16 04:38:31.170 [INFO][4477] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" Namespace="kube-system" Pod="coredns-668d6bf9bc-lv566" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--lv566-eth0" Sep 16 04:38:31.196671 containerd[1528]: 2025-09-16 04:38:31.170 [INFO][4477] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2c4f2484259 ContainerID="093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" Namespace="kube-system" Pod="coredns-668d6bf9bc-lv566" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--lv566-eth0" Sep 16 04:38:31.196671 containerd[1528]: 2025-09-16 04:38:31.174 [INFO][4477] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" Namespace="kube-system" Pod="coredns-668d6bf9bc-lv566" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--lv566-eth0" Sep 16 04:38:31.196740 
containerd[1528]: 2025-09-16 04:38:31.175 [INFO][4477] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" Namespace="kube-system" Pod="coredns-668d6bf9bc-lv566" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--lv566-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--lv566-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"610c18ac-cdd1-464e-aa63-481268b282c8", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 37, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83", Pod:"coredns-668d6bf9bc-lv566", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2c4f2484259", MAC:"22:fc:7d:c8:f4:1c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:31.196740 containerd[1528]: 2025-09-16 04:38:31.191 [INFO][4477] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" Namespace="kube-system" Pod="coredns-668d6bf9bc-lv566" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--lv566-eth0" Sep 16 04:38:31.207255 systemd-resolved[1349]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 16 04:38:31.225798 containerd[1528]: time="2025-09-16T04:38:31.225747874Z" level=info msg="connecting to shim 093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83" address="unix:///run/containerd/s/9cd78f3a0f7cd1ef6c4978c41997c8eaf6a15051cd43e4a966c71e491804a84f" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:38:31.249148 containerd[1528]: time="2025-09-16T04:38:31.249031198Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cfbc464b6-lx4r7,Uid:9dfceabc-d3cc-47a6-863c-141ab1a44e04,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0\"" Sep 16 04:38:31.267509 systemd[1]: Started cri-containerd-093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83.scope - libcontainer container 093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83. 
Sep 16 04:38:31.279006 systemd-resolved[1349]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 16 04:38:31.298985 containerd[1528]: time="2025-09-16T04:38:31.298922528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-lv566,Uid:610c18ac-cdd1-464e-aa63-481268b282c8,Namespace:kube-system,Attempt:0,} returns sandbox id \"093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83\"" Sep 16 04:38:31.302669 containerd[1528]: time="2025-09-16T04:38:31.302541329Z" level=info msg="CreateContainer within sandbox \"093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 16 04:38:31.311711 containerd[1528]: time="2025-09-16T04:38:31.311562371Z" level=info msg="Container e718f493d264d98546ba771eeecf519dc685c5693a47fc14a37ca2234d8d0df2: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:31.316947 containerd[1528]: time="2025-09-16T04:38:31.316896612Z" level=info msg="CreateContainer within sandbox \"093206025283b196b623857d2b8fbc27173cf9b9f2a949d2e65918238eba7d83\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e718f493d264d98546ba771eeecf519dc685c5693a47fc14a37ca2234d8d0df2\"" Sep 16 04:38:31.318505 containerd[1528]: time="2025-09-16T04:38:31.318472052Z" level=info msg="StartContainer for \"e718f493d264d98546ba771eeecf519dc685c5693a47fc14a37ca2234d8d0df2\"" Sep 16 04:38:31.319352 containerd[1528]: time="2025-09-16T04:38:31.319290132Z" level=info msg="connecting to shim e718f493d264d98546ba771eeecf519dc685c5693a47fc14a37ca2234d8d0df2" address="unix:///run/containerd/s/9cd78f3a0f7cd1ef6c4978c41997c8eaf6a15051cd43e4a966c71e491804a84f" protocol=ttrpc version=3 Sep 16 04:38:31.353534 systemd[1]: Started cri-containerd-e718f493d264d98546ba771eeecf519dc685c5693a47fc14a37ca2234d8d0df2.scope - libcontainer container e718f493d264d98546ba771eeecf519dc685c5693a47fc14a37ca2234d8d0df2. 
Sep 16 04:38:31.381710 containerd[1528]: time="2025-09-16T04:38:31.381439864Z" level=info msg="StartContainer for \"e718f493d264d98546ba771eeecf519dc685c5693a47fc14a37ca2234d8d0df2\" returns successfully" Sep 16 04:38:31.779652 systemd-networkd[1456]: calif80647bf8a5: Gained IPv6LL Sep 16 04:38:31.907536 systemd-networkd[1456]: califc2e123d3cb: Gained IPv6LL Sep 16 04:38:31.918434 containerd[1528]: time="2025-09-16T04:38:31.918390450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x2hwg,Uid:42a76947-1770-41d9-ba2c-8e5f1b8c14bc,Namespace:kube-system,Attempt:0,}" Sep 16 04:38:32.087687 systemd-networkd[1456]: califc3457b7a25: Link UP Sep 16 04:38:32.087952 systemd-networkd[1456]: califc3457b7a25: Gained carrier Sep 16 04:38:32.122647 kubelet[2666]: I0916 04:38:32.122465 2666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:31.971 [INFO][4674] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0 coredns-668d6bf9bc- kube-system 42a76947-1770-41d9-ba2c-8e5f1b8c14bc 781 0 2025-09-16 04:37:53 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-x2hwg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califc3457b7a25 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" Namespace="kube-system" Pod="coredns-668d6bf9bc-x2hwg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x2hwg-" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:31.971 [INFO][4674] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" Namespace="kube-system" Pod="coredns-668d6bf9bc-x2hwg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.010 [INFO][4690] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" HandleID="k8s-pod-network.6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" Workload="localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.010 [INFO][4690] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" HandleID="k8s-pod-network.6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" Workload="localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004ded0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-x2hwg", "timestamp":"2025-09-16 04:38:32.010510148 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.010 [INFO][4690] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.010 [INFO][4690] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.010 [INFO][4690] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.026 [INFO][4690] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" host="localhost" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.032 [INFO][4690] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.039 [INFO][4690] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.044 [INFO][4690] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.050 [INFO][4690] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.050 [INFO][4690] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" host="localhost" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.055 [INFO][4690] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59 Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.059 [INFO][4690] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" host="localhost" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.073 [INFO][4690] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" host="localhost" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.073 [INFO][4690] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" host="localhost" Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.073 [INFO][4690] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:38:32.137757 containerd[1528]: 2025-09-16 04:38:32.073 [INFO][4690] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" HandleID="k8s-pod-network.6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" Workload="localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0" Sep 16 04:38:32.138974 containerd[1528]: 2025-09-16 04:38:32.081 [INFO][4674] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" Namespace="kube-system" Pod="coredns-668d6bf9bc-x2hwg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"42a76947-1770-41d9-ba2c-8e5f1b8c14bc", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 37, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-x2hwg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califc3457b7a25", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:32.138974 containerd[1528]: 2025-09-16 04:38:32.081 [INFO][4674] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" Namespace="kube-system" Pod="coredns-668d6bf9bc-x2hwg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0" Sep 16 04:38:32.138974 containerd[1528]: 2025-09-16 04:38:32.083 [INFO][4674] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califc3457b7a25 ContainerID="6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" Namespace="kube-system" Pod="coredns-668d6bf9bc-x2hwg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0" Sep 16 04:38:32.138974 containerd[1528]: 2025-09-16 04:38:32.088 [INFO][4674] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" Namespace="kube-system" Pod="coredns-668d6bf9bc-x2hwg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0" Sep 16 04:38:32.138974 
containerd[1528]: 2025-09-16 04:38:32.089 [INFO][4674] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" Namespace="kube-system" Pod="coredns-668d6bf9bc-x2hwg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"42a76947-1770-41d9-ba2c-8e5f1b8c14bc", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 37, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59", Pod:"coredns-668d6bf9bc-x2hwg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califc3457b7a25", MAC:"fe:1b:30:4c:b3:23", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:32.138974 containerd[1528]: 2025-09-16 04:38:32.129 [INFO][4674] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" Namespace="kube-system" Pod="coredns-668d6bf9bc-x2hwg" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x2hwg-eth0" Sep 16 04:38:32.193783 kubelet[2666]: I0916 04:38:32.193681 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-lv566" podStartSLOduration=39.193663824 podStartE2EDuration="39.193663824s" podCreationTimestamp="2025-09-16 04:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:38:32.153678816 +0000 UTC m=+45.339544963" watchObservedRunningTime="2025-09-16 04:38:32.193663824 +0000 UTC m=+45.379529891" Sep 16 04:38:32.215682 containerd[1528]: time="2025-09-16T04:38:32.214986708Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:32.217546 containerd[1528]: time="2025-09-16T04:38:32.217472188Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 16 04:38:32.219532 containerd[1528]: 
time="2025-09-16T04:38:32.219496309Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:32.222432 containerd[1528]: time="2025-09-16T04:38:32.222397069Z" level=info msg="connecting to shim 6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59" address="unix:///run/containerd/s/0116083fca8120fe07ea1035c3afa448a94ca0131cd619d4150c4c3967c00ae8" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:38:32.224167 containerd[1528]: time="2025-09-16T04:38:32.224132189Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:32.225487 containerd[1528]: time="2025-09-16T04:38:32.225452550Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 1.995164036s" Sep 16 04:38:32.225580 containerd[1528]: time="2025-09-16T04:38:32.225499910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 16 04:38:32.228548 containerd[1528]: time="2025-09-16T04:38:32.228402950Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 16 04:38:32.230170 containerd[1528]: time="2025-09-16T04:38:32.230062471Z" level=info msg="CreateContainer within sandbox \"19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 16 04:38:32.239119 containerd[1528]: time="2025-09-16T04:38:32.239072392Z" level=info msg="Container 54b47be7114855040d55725bbc4d900dceeee621869cee6acdbef22503fc8ef4: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:32.252152 containerd[1528]: time="2025-09-16T04:38:32.252108395Z" level=info msg="CreateContainer within sandbox \"19d8bb3cb86da60d74701eb5350b0f60d5761c36e32b9c7777b58aad5887d0c9\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"54b47be7114855040d55725bbc4d900dceeee621869cee6acdbef22503fc8ef4\"" Sep 16 04:38:32.253894 containerd[1528]: time="2025-09-16T04:38:32.253821635Z" level=info msg="StartContainer for \"54b47be7114855040d55725bbc4d900dceeee621869cee6acdbef22503fc8ef4\"" Sep 16 04:38:32.256450 containerd[1528]: time="2025-09-16T04:38:32.256399996Z" level=info msg="connecting to shim 54b47be7114855040d55725bbc4d900dceeee621869cee6acdbef22503fc8ef4" address="unix:///run/containerd/s/c81a3220a8d9d5f5cfdf14962987fcc17234bbc08c09c0218f8c90cd92500061" protocol=ttrpc version=3 Sep 16 04:38:32.257533 systemd[1]: Started cri-containerd-6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59.scope - libcontainer container 6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59. Sep 16 04:38:32.277655 systemd[1]: Started cri-containerd-54b47be7114855040d55725bbc4d900dceeee621869cee6acdbef22503fc8ef4.scope - libcontainer container 54b47be7114855040d55725bbc4d900dceeee621869cee6acdbef22503fc8ef4. 
Sep 16 04:38:32.282054 systemd-resolved[1349]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 16 04:38:32.291517 systemd-networkd[1456]: cali263b75f44e3: Gained IPv6LL Sep 16 04:38:32.309267 containerd[1528]: time="2025-09-16T04:38:32.309230726Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x2hwg,Uid:42a76947-1770-41d9-ba2c-8e5f1b8c14bc,Namespace:kube-system,Attempt:0,} returns sandbox id \"6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59\"" Sep 16 04:38:32.319483 containerd[1528]: time="2025-09-16T04:38:32.319436728Z" level=info msg="CreateContainer within sandbox \"6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 16 04:38:32.324225 containerd[1528]: time="2025-09-16T04:38:32.324184569Z" level=info msg="StartContainer for \"54b47be7114855040d55725bbc4d900dceeee621869cee6acdbef22503fc8ef4\" returns successfully" Sep 16 04:38:32.332179 containerd[1528]: time="2025-09-16T04:38:32.331530250Z" level=info msg="Container e826febdc3eead7178b70982861f582fc68c5a611daf3311bf1d63c8d4e850e9: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:32.339122 containerd[1528]: time="2025-09-16T04:38:32.339066292Z" level=info msg="CreateContainer within sandbox \"6e1aa249671f3acb2a43d24025489ef3cbce986ca4fac19b7116f178ed1eca59\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e826febdc3eead7178b70982861f582fc68c5a611daf3311bf1d63c8d4e850e9\"" Sep 16 04:38:32.339807 containerd[1528]: time="2025-09-16T04:38:32.339692972Z" level=info msg="StartContainer for \"e826febdc3eead7178b70982861f582fc68c5a611daf3311bf1d63c8d4e850e9\"" Sep 16 04:38:32.342154 containerd[1528]: time="2025-09-16T04:38:32.342125532Z" level=info msg="connecting to shim e826febdc3eead7178b70982861f582fc68c5a611daf3311bf1d63c8d4e850e9" address="unix:///run/containerd/s/0116083fca8120fe07ea1035c3afa448a94ca0131cd619d4150c4c3967c00ae8" protocol=ttrpc version=3 Sep 16 04:38:32.367744 systemd[1]: Started cri-containerd-e826febdc3eead7178b70982861f582fc68c5a611daf3311bf1d63c8d4e850e9.scope - libcontainer container e826febdc3eead7178b70982861f582fc68c5a611daf3311bf1d63c8d4e850e9. 
Sep 16 04:38:32.412554 containerd[1528]: time="2025-09-16T04:38:32.412515386Z" level=info msg="StartContainer for \"e826febdc3eead7178b70982861f582fc68c5a611daf3311bf1d63c8d4e850e9\" returns successfully" Sep 16 04:38:32.921688 containerd[1528]: time="2025-09-16T04:38:32.921642484Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5zlcv,Uid:f6d01e5d-7118-4f45-86dd-a1a5e54fa711,Namespace:calico-system,Attempt:0,}" Sep 16 04:38:32.931528 systemd-networkd[1456]: cali2c4f2484259: Gained IPv6LL Sep 16 04:38:33.053988 systemd-networkd[1456]: cali8a2878d9108: Link UP Sep 16 04:38:33.054305 systemd-networkd[1456]: cali8a2878d9108: Gained carrier Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:32.982 [INFO][4835] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--5zlcv-eth0 csi-node-driver- calico-system f6d01e5d-7118-4f45-86dd-a1a5e54fa711 664 0 2025-09-16 04:38:07 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-5zlcv eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali8a2878d9108 [] [] }} ContainerID="ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" Namespace="calico-system" Pod="csi-node-driver-5zlcv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5zlcv-" Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:32.982 [INFO][4835] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" Namespace="calico-system" Pod="csi-node-driver-5zlcv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5zlcv-eth0" Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.010 [INFO][4849] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" HandleID="k8s-pod-network.ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" Workload="localhost-k8s-csi--node--driver--5zlcv-eth0" Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.010 [INFO][4849] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" HandleID="k8s-pod-network.ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" Workload="localhost-k8s-csi--node--driver--5zlcv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000323490), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-5zlcv", "timestamp":"2025-09-16 04:38:33.010724541 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.010 [INFO][4849] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.011 [INFO][4849] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.011 [INFO][4849] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.020 [INFO][4849] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" host="localhost" Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.026 [INFO][4849] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.031 [INFO][4849] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.033 [INFO][4849] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.036 [INFO][4849] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.036 [INFO][4849] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" host="localhost" Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.038 [INFO][4849] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7 Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.041 [INFO][4849] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" host="localhost" Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.048 [INFO][4849] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" host="localhost" Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.049 [INFO][4849] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" host="localhost" Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.049 [INFO][4849] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:38:33.070888 containerd[1528]: 2025-09-16 04:38:33.049 [INFO][4849] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" HandleID="k8s-pod-network.ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" Workload="localhost-k8s-csi--node--driver--5zlcv-eth0" Sep 16 04:38:33.071748 containerd[1528]: 2025-09-16 04:38:33.051 [INFO][4835] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" Namespace="calico-system" Pod="csi-node-driver-5zlcv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5zlcv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--5zlcv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f6d01e5d-7118-4f45-86dd-a1a5e54fa711", ResourceVersion:"664", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 38, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-5zlcv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8a2878d9108", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:33.071748 containerd[1528]: 2025-09-16 04:38:33.051 [INFO][4835] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" Namespace="calico-system" Pod="csi-node-driver-5zlcv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5zlcv-eth0" Sep 16 04:38:33.071748 containerd[1528]: 2025-09-16 04:38:33.052 [INFO][4835] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8a2878d9108 ContainerID="ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" Namespace="calico-system" Pod="csi-node-driver-5zlcv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5zlcv-eth0" Sep 16 04:38:33.071748 containerd[1528]: 2025-09-16 04:38:33.053 [INFO][4835] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" Namespace="calico-system" Pod="csi-node-driver-5zlcv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5zlcv-eth0" Sep 16 04:38:33.071748 containerd[1528]: 2025-09-16 04:38:33.054 [INFO][4835] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" Namespace="calico-system" Pod="csi-node-driver-5zlcv" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--5zlcv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--5zlcv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f6d01e5d-7118-4f45-86dd-a1a5e54fa711", ResourceVersion:"664", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 38, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7", Pod:"csi-node-driver-5zlcv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8a2878d9108", MAC:"c6:ff:c2:c1:7d:95", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:38:33.071748 containerd[1528]: 2025-09-16 04:38:33.068 [INFO][4835] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" Namespace="calico-system" Pod="csi-node-driver-5zlcv" WorkloadEndpoint="localhost-k8s-csi--node--driver--5zlcv-eth0" Sep 16 04:38:33.184656 kubelet[2666]: I0916 04:38:33.184517 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6cfbc464b6-x6s7t" podStartSLOduration=29.186861578 podStartE2EDuration="31.184489854s" podCreationTimestamp="2025-09-16 04:38:02 +0000 UTC" firstStartedPulling="2025-09-16 04:38:30.230004514 +0000 UTC m=+43.415870541" lastFinishedPulling="2025-09-16 04:38:32.22763275 +0000 UTC m=+45.413498817" observedRunningTime="2025-09-16 04:38:33.183826054 +0000 UTC m=+46.369692121" watchObservedRunningTime="2025-09-16 04:38:33.184489854 +0000 UTC m=+46.370355921" Sep 16 04:38:33.206342 kubelet[2666]: I0916 04:38:33.206179 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-x2hwg" podStartSLOduration=40.206155978 podStartE2EDuration="40.206155978s" podCreationTimestamp="2025-09-16 04:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:38:33.200239617 +0000 UTC m=+46.386105684" watchObservedRunningTime="2025-09-16 04:38:33.206155978 +0000 UTC m=+46.392022205" Sep 16 04:38:33.217299 containerd[1528]: time="2025-09-16T04:38:33.217248980Z" level=info msg="connecting to shim ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7" address="unix:///run/containerd/s/38688916bfafb2ec56ee29d80050e555ff7e44915d2532d585fe6453ccd30201" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:38:33.263633 systemd[1]: Started 
cri-containerd-ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7.scope - libcontainer container ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7. Sep 16 04:38:33.275413 systemd-resolved[1349]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 16 04:38:33.301689 containerd[1528]: time="2025-09-16T04:38:33.301636956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5zlcv,Uid:f6d01e5d-7118-4f45-86dd-a1a5e54fa711,Namespace:calico-system,Attempt:0,} returns sandbox id \"ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7\"" Sep 16 04:38:33.315509 systemd-networkd[1456]: califc3457b7a25: Gained IPv6LL Sep 16 04:38:33.947425 systemd[1]: Started sshd@8-10.0.0.111:22-10.0.0.1:60880.service - OpenSSH per-connection server daemon (10.0.0.1:60880). Sep 16 04:38:34.048174 sshd[4923]: Accepted publickey for core from 10.0.0.1 port 60880 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:38:34.049297 sshd-session[4923]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:34.056909 systemd-logind[1510]: New session 9 of user core. Sep 16 04:38:34.061576 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 16 04:38:34.328848 sshd[4930]: Connection closed by 10.0.0.1 port 60880 Sep 16 04:38:34.330179 sshd-session[4923]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:34.335680 systemd[1]: sshd@8-10.0.0.111:22-10.0.0.1:60880.service: Deactivated successfully. Sep 16 04:38:34.339686 systemd[1]: session-9.scope: Deactivated successfully. Sep 16 04:38:34.340895 systemd-logind[1510]: Session 9 logged out. Waiting for processes to exit. Sep 16 04:38:34.344208 systemd-logind[1510]: Removed session 9. Sep 16 04:38:34.467545 systemd-networkd[1456]: cali8a2878d9108: Gained IPv6LL Sep 16 04:38:34.502056 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2609839697.mount: Deactivated successfully. 
Sep 16 04:38:34.732749 kubelet[2666]: I0916 04:38:34.732563 2666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:38:34.902529 kubelet[2666]: I0916 04:38:34.902490 2666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:38:34.993684 containerd[1528]: time="2025-09-16T04:38:34.993546791Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee49afc90ba0420064cafd2f9751138e4672ef961c988712964cae1ae7faaab0\" id:\"95fe038ecfbeffcb4114245b90a10c7e0de0425dc67d845278f0c712b6eda620\" pid:4988 exited_at:{seconds:1757997514 nanos:971304027}" Sep 16 04:38:35.033699 containerd[1528]: time="2025-09-16T04:38:35.033623958Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7460af17ed3f3f09445393609deb801bfc1529536d29d6ae6285906770bc708\" id:\"4d185e69e028ff24b68b9519f4fa74244054d8f030f0da83b2f6b3452af491f7\" pid:4964 exited_at:{seconds:1757997515 nanos:33070318}" Sep 16 04:38:35.068481 containerd[1528]: time="2025-09-16T04:38:35.068409885Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee49afc90ba0420064cafd2f9751138e4672ef961c988712964cae1ae7faaab0\" id:\"c0db32390d50b690bc097834f0ead03b578441d545c0f33e60d49cff5d3e5069\" pid:5013 exited_at:{seconds:1757997515 nanos:67741805}" Sep 16 04:38:35.246561 containerd[1528]: time="2025-09-16T04:38:35.246435117Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:35.247269 containerd[1528]: time="2025-09-16T04:38:35.247133317Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 16 04:38:35.248406 containerd[1528]: time="2025-09-16T04:38:35.248369637Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:35.251146 containerd[1528]: time="2025-09-16T04:38:35.250770438Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:35.251622 containerd[1528]: time="2025-09-16T04:38:35.251593398Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.023048768s" Sep 16 04:38:35.251705 containerd[1528]: time="2025-09-16T04:38:35.251690438Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 16 04:38:35.252765 containerd[1528]: time="2025-09-16T04:38:35.252729798Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 16 04:38:35.255222 containerd[1528]: time="2025-09-16T04:38:35.255183639Z" level=info msg="CreateContainer within sandbox \"9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 16 04:38:35.263035 containerd[1528]: time="2025-09-16T04:38:35.262766360Z" level=info msg="Container 128c6c546ca9318c151d878750ac02c5274a5d326cf67668c15bbafc5e5da9f2: CDI devices 
from CRI Config.CDIDevices: []" Sep 16 04:38:35.278187 containerd[1528]: time="2025-09-16T04:38:35.278140283Z" level=info msg="CreateContainer within sandbox \"9ae62e94ace2f8280921084de37f5e5cd4defc080781d24ed074cc3634e249f4\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"128c6c546ca9318c151d878750ac02c5274a5d326cf67668c15bbafc5e5da9f2\"" Sep 16 04:38:35.278848 containerd[1528]: time="2025-09-16T04:38:35.278810603Z" level=info msg="StartContainer for \"128c6c546ca9318c151d878750ac02c5274a5d326cf67668c15bbafc5e5da9f2\"" Sep 16 04:38:35.280868 containerd[1528]: time="2025-09-16T04:38:35.280833723Z" level=info msg="connecting to shim 128c6c546ca9318c151d878750ac02c5274a5d326cf67668c15bbafc5e5da9f2" address="unix:///run/containerd/s/c0e2cd838e80bbaf60c19e430ec1c72bbb79222884d2fc90fb426a6b75f30cb1" protocol=ttrpc version=3 Sep 16 04:38:35.288365 containerd[1528]: time="2025-09-16T04:38:35.288058565Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7460af17ed3f3f09445393609deb801bfc1529536d29d6ae6285906770bc708\" id:\"2bc0a6ef64e0569d03f3111a234be15945b752c54d0dbbd529a9450225609559\" pid:5038 exited_at:{seconds:1757997515 nanos:287470404}" Sep 16 04:38:35.304619 systemd[1]: Started cri-containerd-128c6c546ca9318c151d878750ac02c5274a5d326cf67668c15bbafc5e5da9f2.scope - libcontainer container 128c6c546ca9318c151d878750ac02c5274a5d326cf67668c15bbafc5e5da9f2. Sep 16 04:38:35.344737 containerd[1528]: time="2025-09-16T04:38:35.344690055Z" level=info msg="StartContainer for \"128c6c546ca9318c151d878750ac02c5274a5d326cf67668c15bbafc5e5da9f2\" returns successfully" Sep 16 04:38:35.527349 containerd[1528]: time="2025-09-16T04:38:35.526510448Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:35.527674 containerd[1528]: time="2025-09-16T04:38:35.527634608Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 16 04:38:35.529954 containerd[1528]: time="2025-09-16T04:38:35.529918528Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 277.01729ms" Sep 16 04:38:35.530056 containerd[1528]: time="2025-09-16T04:38:35.529964808Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 16 04:38:35.531563 containerd[1528]: time="2025-09-16T04:38:35.531519889Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 16 04:38:35.533348 containerd[1528]: time="2025-09-16T04:38:35.533154889Z" level=info msg="CreateContainer within sandbox \"4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 16 04:38:35.544607 containerd[1528]: time="2025-09-16T04:38:35.544552731Z" level=info msg="Container 84383828c418d8d668b6bd44cf8d2538d7f89d95132dc4224d266ed3583bd889: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:35.551138 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4270445254.mount: Deactivated successfully. 
Sep 16 04:38:35.555096 containerd[1528]: time="2025-09-16T04:38:35.555028133Z" level=info msg="CreateContainer within sandbox \"4f92802d8aaf3c1dee76a466ef4b45c13325257e4c6adbee316b2b70758eaeb0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"84383828c418d8d668b6bd44cf8d2538d7f89d95132dc4224d266ed3583bd889\"" Sep 16 04:38:35.555914 containerd[1528]: time="2025-09-16T04:38:35.555881533Z" level=info msg="StartContainer for \"84383828c418d8d668b6bd44cf8d2538d7f89d95132dc4224d266ed3583bd889\"" Sep 16 04:38:35.557152 containerd[1528]: time="2025-09-16T04:38:35.557111853Z" level=info msg="connecting to shim 84383828c418d8d668b6bd44cf8d2538d7f89d95132dc4224d266ed3583bd889" address="unix:///run/containerd/s/c91efa7cb4232418585e53ca83a35d0f3399bbbb5c865963c6d1de1a881a76f2" protocol=ttrpc version=3 Sep 16 04:38:35.596541 systemd[1]: Started cri-containerd-84383828c418d8d668b6bd44cf8d2538d7f89d95132dc4224d266ed3583bd889.scope - libcontainer container 84383828c418d8d668b6bd44cf8d2538d7f89d95132dc4224d266ed3583bd889. Sep 16 04:38:35.651110 containerd[1528]: time="2025-09-16T04:38:35.651059870Z" level=info msg="StartContainer for \"84383828c418d8d668b6bd44cf8d2538d7f89d95132dc4224d266ed3583bd889\" returns successfully" Sep 16 04:38:36.180510 kubelet[2666]: I0916 04:38:36.180438 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-wczbt" podStartSLOduration=24.236656658 podStartE2EDuration="29.180119166s" podCreationTimestamp="2025-09-16 04:38:07 +0000 UTC" firstStartedPulling="2025-09-16 04:38:30.30894429 +0000 UTC m=+43.494810317" lastFinishedPulling="2025-09-16 04:38:35.252406798 +0000 UTC m=+48.438272825" observedRunningTime="2025-09-16 04:38:36.159010402 +0000 UTC m=+49.344876469" watchObservedRunningTime="2025-09-16 04:38:36.180119166 +0000 UTC m=+49.365985193" Sep 16 04:38:36.631564 containerd[1528]: time="2025-09-16T04:38:36.631513166Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:36.632604 containerd[1528]: time="2025-09-16T04:38:36.632514246Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 16 04:38:36.634421 containerd[1528]: time="2025-09-16T04:38:36.634392647Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:36.636833 containerd[1528]: time="2025-09-16T04:38:36.636787567Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:36.637484 containerd[1528]: time="2025-09-16T04:38:36.637458367Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.105892318s" Sep 16 04:38:36.637525 containerd[1528]: time="2025-09-16T04:38:36.637488767Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 16 04:38:36.641017 containerd[1528]: 
time="2025-09-16T04:38:36.640975168Z" level=info msg="CreateContainer within sandbox \"ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 16 04:38:36.651170 containerd[1528]: time="2025-09-16T04:38:36.651125290Z" level=info msg="Container eb816307651373fa49c2c21d6bc4bed6472c29ed58734a21eddd35720e041d1d: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:36.672596 containerd[1528]: time="2025-09-16T04:38:36.672447613Z" level=info msg="CreateContainer within sandbox \"ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"eb816307651373fa49c2c21d6bc4bed6472c29ed58734a21eddd35720e041d1d\"" Sep 16 04:38:36.674112 containerd[1528]: time="2025-09-16T04:38:36.674076934Z" level=info msg="StartContainer for \"eb816307651373fa49c2c21d6bc4bed6472c29ed58734a21eddd35720e041d1d\"" Sep 16 04:38:36.677641 containerd[1528]: time="2025-09-16T04:38:36.677598774Z" level=info msg="connecting to shim eb816307651373fa49c2c21d6bc4bed6472c29ed58734a21eddd35720e041d1d" address="unix:///run/containerd/s/38688916bfafb2ec56ee29d80050e555ff7e44915d2532d585fe6453ccd30201" protocol=ttrpc version=3 Sep 16 04:38:36.704558 systemd[1]: Started cri-containerd-eb816307651373fa49c2c21d6bc4bed6472c29ed58734a21eddd35720e041d1d.scope - libcontainer container eb816307651373fa49c2c21d6bc4bed6472c29ed58734a21eddd35720e041d1d. Sep 16 04:38:36.741431 containerd[1528]: time="2025-09-16T04:38:36.741386466Z" level=info msg="StartContainer for \"eb816307651373fa49c2c21d6bc4bed6472c29ed58734a21eddd35720e041d1d\" returns successfully" Sep 16 04:38:36.747224 containerd[1528]: time="2025-09-16T04:38:36.747186027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 16 04:38:37.158621 kubelet[2666]: I0916 04:38:37.158576 2666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:38:37.846248 kubelet[2666]: I0916 04:38:37.846169 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6cfbc464b6-lx4r7" podStartSLOduration=31.56589753 podStartE2EDuration="35.8461483s" podCreationTimestamp="2025-09-16 04:38:02 +0000 UTC" firstStartedPulling="2025-09-16 04:38:31.250597639 +0000 UTC m=+44.436463706" lastFinishedPulling="2025-09-16 04:38:35.530848409 +0000 UTC m=+48.716714476" observedRunningTime="2025-09-16 04:38:36.177532085 +0000 UTC m=+49.363398152" watchObservedRunningTime="2025-09-16 04:38:37.8461483 +0000 UTC m=+51.032014367" Sep 16 04:38:38.382058 kubelet[2666]: I0916 04:38:38.382016 2666 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:38:38.472850 containerd[1528]: time="2025-09-16T04:38:38.472791848Z" level=info msg="TaskExit event in podsandbox handler container_id:\"128c6c546ca9318c151d878750ac02c5274a5d326cf67668c15bbafc5e5da9f2\" id:\"d4da67017d6f1fe0c64d821cc365612debd22a50aebb56f7b7f52864ada78bfd\" pid:5185 exit_status:1 exited_at:{seconds:1757997518 nanos:471873328}" Sep 16 04:38:38.545970 containerd[1528]: time="2025-09-16T04:38:38.545925181Z" level=info msg="TaskExit event in podsandbox handler container_id:\"128c6c546ca9318c151d878750ac02c5274a5d326cf67668c15bbafc5e5da9f2\" id:\"8115f0bc96c6efbd92d776ef92352e3f5160511233473932f3822dd2ed28c24f\" pid:5208 exit_status:1 exited_at:{seconds:1757997518 nanos:545589420}" Sep 16 04:38:38.747226 containerd[1528]: time="2025-09-16T04:38:38.746921495Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:38.748013 containerd[1528]: time="2025-09-16T04:38:38.747977535Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 16 04:38:38.748827 containerd[1528]: time="2025-09-16T04:38:38.748736375Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:38.754655 containerd[1528]: time="2025-09-16T04:38:38.754592296Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:38:38.755544 containerd[1528]: time="2025-09-16T04:38:38.755491217Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 2.00826179s" Sep 16 04:38:38.755544 containerd[1528]: time="2025-09-16T04:38:38.755534657Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 16 04:38:38.771387 containerd[1528]: time="2025-09-16T04:38:38.771226219Z" level=info msg="CreateContainer within sandbox \"ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 16 04:38:38.780094 containerd[1528]: time="2025-09-16T04:38:38.780060581Z" level=info msg="Container 7d3b9bc49e5f517b6fea8d82c05e374c060cb44e6fa643792253fb0eae7e5899: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:38:38.793853 containerd[1528]: time="2025-09-16T04:38:38.793781983Z" level=info msg="CreateContainer within sandbox \"ec80f7690455680c4af9677b8c4c90bfee7965c459bcfccc350eb38ffe29c2f7\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"7d3b9bc49e5f517b6fea8d82c05e374c060cb44e6fa643792253fb0eae7e5899\"" Sep 16 04:38:38.795871 containerd[1528]: time="2025-09-16T04:38:38.795832384Z" level=info msg="StartContainer for \"7d3b9bc49e5f517b6fea8d82c05e374c060cb44e6fa643792253fb0eae7e5899\"" Sep 16 04:38:38.798192 containerd[1528]: time="2025-09-16T04:38:38.798157224Z" level=info msg="connecting to shim 7d3b9bc49e5f517b6fea8d82c05e374c060cb44e6fa643792253fb0eae7e5899" address="unix:///run/containerd/s/38688916bfafb2ec56ee29d80050e555ff7e44915d2532d585fe6453ccd30201" protocol=ttrpc version=3 Sep 16 04:38:38.821541 systemd[1]: Started cri-containerd-7d3b9bc49e5f517b6fea8d82c05e374c060cb44e6fa643792253fb0eae7e5899.scope - libcontainer container 7d3b9bc49e5f517b6fea8d82c05e374c060cb44e6fa643792253fb0eae7e5899. 
Sep 16 04:38:38.857381 containerd[1528]: time="2025-09-16T04:38:38.857309794Z" level=info msg="StartContainer for \"7d3b9bc49e5f517b6fea8d82c05e374c060cb44e6fa643792253fb0eae7e5899\" returns successfully" Sep 16 04:38:38.985322 kubelet[2666]: I0916 04:38:38.985260 2666 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 16 04:38:38.987659 kubelet[2666]: I0916 04:38:38.987627 2666 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 16 04:38:39.185617 kubelet[2666]: I0916 04:38:39.185319 2666 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-5zlcv" podStartSLOduration=26.729990789 podStartE2EDuration="32.18530317s" podCreationTimestamp="2025-09-16 04:38:07 +0000 UTC" firstStartedPulling="2025-09-16 04:38:33.302918076 +0000 UTC m=+46.488784143" lastFinishedPulling="2025-09-16 04:38:38.758230457 +0000 UTC m=+51.944096524" observedRunningTime="2025-09-16 04:38:39.18469993 +0000 UTC m=+52.370565997" watchObservedRunningTime="2025-09-16 04:38:39.18530317 +0000 UTC m=+52.371169237" Sep 16 04:38:39.343564 systemd[1]: Started sshd@9-10.0.0.111:22-10.0.0.1:60890.service - OpenSSH per-connection server daemon (10.0.0.1:60890). Sep 16 04:38:39.420700 sshd[5255]: Accepted publickey for core from 10.0.0.1 port 60890 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:38:39.422270 sshd-session[5255]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:39.426409 systemd-logind[1510]: New session 10 of user core. Sep 16 04:38:39.433527 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 16 04:38:39.645089 sshd[5261]: Connection closed by 10.0.0.1 port 60890 Sep 16 04:38:39.645465 sshd-session[5255]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:39.655676 systemd[1]: sshd@9-10.0.0.111:22-10.0.0.1:60890.service: Deactivated successfully. Sep 16 04:38:39.659182 systemd[1]: session-10.scope: Deactivated successfully. Sep 16 04:38:39.661429 systemd-logind[1510]: Session 10 logged out. Waiting for processes to exit. Sep 16 04:38:39.665236 systemd[1]: Started sshd@10-10.0.0.111:22-10.0.0.1:60902.service - OpenSSH per-connection server daemon (10.0.0.1:60902). Sep 16 04:38:39.667575 systemd-logind[1510]: Removed session 10. Sep 16 04:38:39.739867 sshd[5277]: Accepted publickey for core from 10.0.0.1 port 60902 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:38:39.742688 sshd-session[5277]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:39.748826 systemd-logind[1510]: New session 11 of user core. Sep 16 04:38:39.760572 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 16 04:38:40.015929 sshd[5280]: Connection closed by 10.0.0.1 port 60902 Sep 16 04:38:40.016769 sshd-session[5277]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:40.029504 systemd[1]: sshd@10-10.0.0.111:22-10.0.0.1:60902.service: Deactivated successfully. Sep 16 04:38:40.034766 systemd[1]: session-11.scope: Deactivated successfully. Sep 16 04:38:40.037135 systemd-logind[1510]: Session 11 logged out. Waiting for processes to exit. Sep 16 04:38:40.044780 systemd[1]: Started sshd@11-10.0.0.111:22-10.0.0.1:50498.service - OpenSSH per-connection server daemon (10.0.0.1:50498). 
Sep 16 04:38:40.045583 systemd-logind[1510]: Removed session 11. Sep 16 04:38:40.099045 sshd[5292]: Accepted publickey for core from 10.0.0.1 port 50498 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:38:40.101241 sshd-session[5292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:40.106072 systemd-logind[1510]: New session 12 of user core. Sep 16 04:38:40.115552 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 16 04:38:40.270200 sshd[5295]: Connection closed by 10.0.0.1 port 50498 Sep 16 04:38:40.270664 sshd-session[5292]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:40.275537 systemd[1]: sshd@11-10.0.0.111:22-10.0.0.1:50498.service: Deactivated successfully. Sep 16 04:38:40.281647 systemd[1]: session-12.scope: Deactivated successfully. Sep 16 04:38:40.283410 systemd-logind[1510]: Session 12 logged out. Waiting for processes to exit. Sep 16 04:38:40.285028 systemd-logind[1510]: Removed session 12. Sep 16 04:38:45.281705 systemd[1]: Started sshd@12-10.0.0.111:22-10.0.0.1:50636.service - OpenSSH per-connection server daemon (10.0.0.1:50636). Sep 16 04:38:45.353599 sshd[5318]: Accepted publickey for core from 10.0.0.1 port 50636 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:38:45.354818 sshd-session[5318]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:45.358683 systemd-logind[1510]: New session 13 of user core. Sep 16 04:38:45.369515 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 16 04:38:45.512304 sshd[5321]: Connection closed by 10.0.0.1 port 50636 Sep 16 04:38:45.511759 sshd-session[5318]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:45.523600 systemd[1]: sshd@12-10.0.0.111:22-10.0.0.1:50636.service: Deactivated successfully. Sep 16 04:38:45.525389 systemd[1]: session-13.scope: Deactivated successfully. Sep 16 04:38:45.526063 systemd-logind[1510]: Session 13 logged out. Waiting for processes to exit. Sep 16 04:38:45.528401 systemd[1]: Started sshd@13-10.0.0.111:22-10.0.0.1:50638.service - OpenSSH per-connection server daemon (10.0.0.1:50638). Sep 16 04:38:45.529221 systemd-logind[1510]: Removed session 13. Sep 16 04:38:45.583171 sshd[5334]: Accepted publickey for core from 10.0.0.1 port 50638 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:38:45.584551 sshd-session[5334]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:45.588789 systemd-logind[1510]: New session 14 of user core. Sep 16 04:38:45.598017 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 16 04:38:45.804517 sshd[5337]: Connection closed by 10.0.0.1 port 50638 Sep 16 04:38:45.804861 sshd-session[5334]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:45.814597 systemd[1]: sshd@13-10.0.0.111:22-10.0.0.1:50638.service: Deactivated successfully. Sep 16 04:38:45.816774 systemd[1]: session-14.scope: Deactivated successfully. Sep 16 04:38:45.817615 systemd-logind[1510]: Session 14 logged out. Waiting for processes to exit. Sep 16 04:38:45.820479 systemd[1]: Started sshd@14-10.0.0.111:22-10.0.0.1:50642.service - OpenSSH per-connection server daemon (10.0.0.1:50642). Sep 16 04:38:45.821756 systemd-logind[1510]: Removed session 14. 
Sep 16 04:38:45.885527 sshd[5349]: Accepted publickey for core from 10.0.0.1 port 50642 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:38:45.887007 sshd-session[5349]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:45.890891 systemd-logind[1510]: New session 15 of user core. Sep 16 04:38:45.903548 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 16 04:38:46.524429 sshd[5352]: Connection closed by 10.0.0.1 port 50642 Sep 16 04:38:46.526058 sshd-session[5349]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:46.536245 systemd[1]: sshd@14-10.0.0.111:22-10.0.0.1:50642.service: Deactivated successfully. Sep 16 04:38:46.539295 systemd[1]: session-15.scope: Deactivated successfully. Sep 16 04:38:46.540586 systemd-logind[1510]: Session 15 logged out. Waiting for processes to exit. Sep 16 04:38:46.549110 systemd[1]: Started sshd@15-10.0.0.111:22-10.0.0.1:50648.service - OpenSSH per-connection server daemon (10.0.0.1:50648). Sep 16 04:38:46.553218 systemd-logind[1510]: Removed session 15. Sep 16 04:38:46.601704 sshd[5372]: Accepted publickey for core from 10.0.0.1 port 50648 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:38:46.602754 sshd-session[5372]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:46.607084 systemd-logind[1510]: New session 16 of user core. Sep 16 04:38:46.615494 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 16 04:38:46.884919 sshd[5375]: Connection closed by 10.0.0.1 port 50648 Sep 16 04:38:46.886254 sshd-session[5372]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:46.894610 systemd[1]: sshd@15-10.0.0.111:22-10.0.0.1:50648.service: Deactivated successfully. Sep 16 04:38:46.896729 systemd[1]: session-16.scope: Deactivated successfully. Sep 16 04:38:46.901927 systemd-logind[1510]: Session 16 logged out. Waiting for processes to exit. Sep 16 04:38:46.905864 systemd[1]: Started sshd@16-10.0.0.111:22-10.0.0.1:50660.service - OpenSSH per-connection server daemon (10.0.0.1:50660). Sep 16 04:38:46.909230 systemd-logind[1510]: Removed session 16. Sep 16 04:38:46.966319 sshd[5386]: Accepted publickey for core from 10.0.0.1 port 50660 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:38:46.968069 sshd-session[5386]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:46.976695 systemd-logind[1510]: New session 17 of user core. Sep 16 04:38:46.980504 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 16 04:38:47.156851 sshd[5391]: Connection closed by 10.0.0.1 port 50660 Sep 16 04:38:47.157648 sshd-session[5386]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:47.163743 systemd[1]: sshd@16-10.0.0.111:22-10.0.0.1:50660.service: Deactivated successfully. Sep 16 04:38:47.166963 systemd[1]: session-17.scope: Deactivated successfully. Sep 16 04:38:47.170031 systemd-logind[1510]: Session 17 logged out. Waiting for processes to exit. Sep 16 04:38:47.174257 systemd-logind[1510]: Removed session 17. Sep 16 04:38:52.178192 systemd[1]: Started sshd@17-10.0.0.111:22-10.0.0.1:55488.service - OpenSSH per-connection server daemon (10.0.0.1:55488). 
Sep 16 04:38:52.242358 sshd[5407]: Accepted publickey for core from 10.0.0.1 port 55488 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:38:52.243854 sshd-session[5407]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:52.248259 systemd-logind[1510]: New session 18 of user core. Sep 16 04:38:52.256545 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 16 04:38:52.501434 sshd[5410]: Connection closed by 10.0.0.1 port 55488 Sep 16 04:38:52.501545 sshd-session[5407]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:52.505234 systemd[1]: sshd@17-10.0.0.111:22-10.0.0.1:55488.service: Deactivated successfully. Sep 16 04:38:52.507384 systemd[1]: session-18.scope: Deactivated successfully. Sep 16 04:38:52.508987 systemd-logind[1510]: Session 18 logged out. Waiting for processes to exit. Sep 16 04:38:52.510638 systemd-logind[1510]: Removed session 18. Sep 16 04:38:57.521555 systemd[1]: Started sshd@18-10.0.0.111:22-10.0.0.1:55646.service - OpenSSH per-connection server daemon (10.0.0.1:55646). Sep 16 04:38:57.581659 sshd[5429]: Accepted publickey for core from 10.0.0.1 port 55646 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:38:57.582709 sshd-session[5429]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:57.586128 systemd-logind[1510]: New session 19 of user core. Sep 16 04:38:57.597483 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 16 04:38:57.729053 sshd[5432]: Connection closed by 10.0.0.1 port 55646 Sep 16 04:38:57.729400 sshd-session[5429]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:57.733034 systemd[1]: sshd@18-10.0.0.111:22-10.0.0.1:55646.service: Deactivated successfully. Sep 16 04:38:57.734740 systemd[1]: session-19.scope: Deactivated successfully. Sep 16 04:38:57.735490 systemd-logind[1510]: Session 19 logged out. Waiting for processes to exit. Sep 16 04:38:57.736474 systemd-logind[1510]: Removed session 19. Sep 16 04:39:01.491423 containerd[1528]: time="2025-09-16T04:39:01.491245818Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee49afc90ba0420064cafd2f9751138e4672ef961c988712964cae1ae7faaab0\" id:\"1b10a4dd7365ede9d8d66643f595fc88b85ed9b91a2bcfdb0d019dd40c7b33ec\" pid:5458 exited_at:{seconds:1757997541 nanos:490765250}" Sep 16 04:39:02.741975 systemd[1]: Started sshd@19-10.0.0.111:22-10.0.0.1:40422.service - OpenSSH per-connection server daemon (10.0.0.1:40422). Sep 16 04:39:02.804509 sshd[5469]: Accepted publickey for core from 10.0.0.1 port 40422 ssh2: RSA SHA256:UjijsmXvpGlRsfqUQE5UeTvJUwF4O48LgTuQN4JDfoQ Sep 16 04:39:02.805786 sshd-session[5469]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:39:02.809526 systemd-logind[1510]: New session 20 of user core. Sep 16 04:39:02.817503 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 16 04:39:02.969604 sshd[5472]: Connection closed by 10.0.0.1 port 40422 Sep 16 04:39:02.970284 sshd-session[5469]: pam_unix(sshd:session): session closed for user core Sep 16 04:39:02.973520 systemd[1]: sshd@19-10.0.0.111:22-10.0.0.1:40422.service: Deactivated successfully. Sep 16 04:39:02.975360 systemd[1]: session-20.scope: Deactivated successfully. Sep 16 04:39:02.975947 systemd-logind[1510]: Session 20 logged out. Waiting for processes to exit. Sep 16 04:39:02.976860 systemd-logind[1510]: Removed session 20.