Sep 9 04:49:43.769314 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 9 04:49:43.769336 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 9 03:38:34 -00 2025
Sep 9 04:49:43.769346 kernel: KASLR enabled
Sep 9 04:49:43.769352 kernel: efi: EFI v2.7 by EDK II
Sep 9 04:49:43.769358 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb228018 ACPI 2.0=0xdb9b8018 RNG=0xdb9b8a18 MEMRESERVE=0xdb221f18
Sep 9 04:49:43.769363 kernel: random: crng init done
Sep 9 04:49:43.769370 kernel: Kernel is locked down from EFI Secure Boot; see man kernel_lockdown.7
Sep 9 04:49:43.769376 kernel: secureboot: Secure boot enabled
Sep 9 04:49:43.769382 kernel: ACPI: Early table checksum verification disabled
Sep 9 04:49:43.769389 kernel: ACPI: RSDP 0x00000000DB9B8018 000024 (v02 BOCHS )
Sep 9 04:49:43.769396 kernel: ACPI: XSDT 0x00000000DB9B8F18 000064 (v01 BOCHS BXPC 00000001 01000013)
Sep 9 04:49:43.769401 kernel: ACPI: FACP 0x00000000DB9B8B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:49:43.769407 kernel: ACPI: DSDT 0x00000000DB904018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:49:43.769414 kernel: ACPI: APIC 0x00000000DB9B8C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:49:43.769421 kernel: ACPI: PPTT 0x00000000DB9B8098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:49:43.769428 kernel: ACPI: GTDT 0x00000000DB9B8818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:49:43.769435 kernel: ACPI: MCFG 0x00000000DB9B8A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:49:43.769441 kernel: ACPI: SPCR 0x00000000DB9B8918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:49:43.769448 kernel: ACPI: DBG2 0x00000000DB9B8998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:49:43.769454 kernel: ACPI: IORT 0x00000000DB9B8198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 04:49:43.769460 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600
Sep 9 04:49:43.769467 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 9 04:49:43.769473 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 04:49:43.769480 kernel: NODE_DATA(0) allocated [mem 0xdc737a00-0xdc73efff]
Sep 9 04:49:43.769486 kernel: Zone ranges:
Sep 9 04:49:43.769494 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 04:49:43.769500 kernel: DMA32 empty
Sep 9 04:49:43.769507 kernel: Normal empty
Sep 9 04:49:43.769513 kernel: Device empty
Sep 9 04:49:43.769520 kernel: Movable zone start for each node
Sep 9 04:49:43.769526 kernel: Early memory node ranges
Sep 9 04:49:43.769532 kernel: node 0: [mem 0x0000000040000000-0x00000000dbb4ffff]
Sep 9 04:49:43.769539 kernel: node 0: [mem 0x00000000dbb50000-0x00000000dbe7ffff]
Sep 9 04:49:43.769553 kernel: node 0: [mem 0x00000000dbe80000-0x00000000dbe9ffff]
Sep 9 04:49:43.769560 kernel: node 0: [mem 0x00000000dbea0000-0x00000000dbedffff]
Sep 9 04:49:43.769566 kernel: node 0: [mem 0x00000000dbee0000-0x00000000dbf1ffff]
Sep 9 04:49:43.769572 kernel: node 0: [mem 0x00000000dbf20000-0x00000000dbf6ffff]
Sep 9 04:49:43.769582 kernel: node 0: [mem 0x00000000dbf70000-0x00000000dcbfffff]
Sep 9 04:49:43.769588 kernel: node 0: [mem 0x00000000dcc00000-0x00000000dcfdffff]
Sep 9 04:49:43.769595 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff]
Sep 9 04:49:43.769604 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff]
Sep 9 04:49:43.769611 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges
Sep 9 04:49:43.769617 kernel: cma: Reserved 16 MiB at 0x00000000d7a00000 on node -1
Sep 9 04:49:43.769624 kernel: psci: probing for conduit method from ACPI.
Sep 9 04:49:43.769632 kernel: psci: PSCIv1.1 detected in firmware.
Sep 9 04:49:43.769639 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 9 04:49:43.769646 kernel: psci: Trusted OS migration not required
Sep 9 04:49:43.769652 kernel: psci: SMC Calling Convention v1.1
Sep 9 04:49:43.769659 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 9 04:49:43.769666 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 9 04:49:43.769672 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 9 04:49:43.769679 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Sep 9 04:49:43.769686 kernel: Detected PIPT I-cache on CPU0
Sep 9 04:49:43.769694 kernel: CPU features: detected: GIC system register CPU interface
Sep 9 04:49:43.769700 kernel: CPU features: detected: Spectre-v4
Sep 9 04:49:43.769707 kernel: CPU features: detected: Spectre-BHB
Sep 9 04:49:43.769714 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 9 04:49:43.769721 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 9 04:49:43.769728 kernel: CPU features: detected: ARM erratum 1418040
Sep 9 04:49:43.769734 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 9 04:49:43.769741 kernel: alternatives: applying boot alternatives
Sep 9 04:49:43.769749 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90
Sep 9 04:49:43.769757 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 04:49:43.769764 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 9 04:49:43.769772 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 04:49:43.769779 kernel: Fallback order for Node 0: 0
Sep 9 04:49:43.769785 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072
Sep 9 04:49:43.769792 kernel: Policy zone: DMA
Sep 9 04:49:43.769799 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 04:49:43.769805 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB
Sep 9 04:49:43.769812 kernel: software IO TLB: area num 4.
Sep 9 04:49:43.769819 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB
Sep 9 04:49:43.769826 kernel: software IO TLB: mapped [mem 0x00000000db504000-0x00000000db904000] (4MB)
Sep 9 04:49:43.769832 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 9 04:49:43.769839 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 04:49:43.769847 kernel: rcu: RCU event tracing is enabled.
Sep 9 04:49:43.769855 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 9 04:49:43.769862 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 04:49:43.769869 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 04:49:43.769876 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 04:49:43.769882 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 9 04:49:43.769889 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 04:49:43.769896 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 9 04:49:43.769903 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 9 04:49:43.769910 kernel: GICv3: 256 SPIs implemented
Sep 9 04:49:43.769916 kernel: GICv3: 0 Extended SPIs implemented
Sep 9 04:49:43.769923 kernel: Root IRQ handler: gic_handle_irq
Sep 9 04:49:43.769931 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 9 04:49:43.769937 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Sep 9 04:49:43.769944 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 9 04:49:43.769951 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 9 04:49:43.769957 kernel: ITS@0x0000000008080000: allocated 8192 Devices @40110000 (indirect, esz 8, psz 64K, shr 1)
Sep 9 04:49:43.769964 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @40120000 (flat, esz 8, psz 64K, shr 1)
Sep 9 04:49:43.769971 kernel: GICv3: using LPI property table @0x0000000040130000
Sep 9 04:49:43.769978 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040140000
Sep 9 04:49:43.769984 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 04:49:43.769991 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:49:43.769998 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 9 04:49:43.770005 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 9 04:49:43.770013 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 9 04:49:43.770020 kernel: arm-pv: using stolen time PV
Sep 9 04:49:43.770027 kernel: Console: colour dummy device 80x25
Sep 9 04:49:43.770034 kernel: ACPI: Core revision 20240827
Sep 9 04:49:43.770041 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 9 04:49:43.770048 kernel: pid_max: default: 32768 minimum: 301
Sep 9 04:49:43.770055 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 04:49:43.770062 kernel: landlock: Up and running.
Sep 9 04:49:43.770069 kernel: SELinux: Initializing.
Sep 9 04:49:43.770078 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 04:49:43.770085 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 04:49:43.770091 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 04:49:43.770099 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 04:49:43.770106 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 04:49:43.770112 kernel: Remapping and enabling EFI services.
Sep 9 04:49:43.770119 kernel: smp: Bringing up secondary CPUs ...
Sep 9 04:49:43.770126 kernel: Detected PIPT I-cache on CPU1
Sep 9 04:49:43.770133 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 9 04:49:43.770142 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040150000
Sep 9 04:49:43.770154 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:49:43.770161 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 9 04:49:43.770170 kernel: Detected PIPT I-cache on CPU2
Sep 9 04:49:43.770178 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Sep 9 04:49:43.770186 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040160000
Sep 9 04:49:43.770193 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:49:43.770200 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Sep 9 04:49:43.770208 kernel: Detected PIPT I-cache on CPU3
Sep 9 04:49:43.770216 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Sep 9 04:49:43.770223 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040170000
Sep 9 04:49:43.770231 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 04:49:43.770238 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Sep 9 04:49:43.770378 kernel: smp: Brought up 1 node, 4 CPUs
Sep 9 04:49:43.770400 kernel: SMP: Total of 4 processors activated.
Sep 9 04:49:43.770408 kernel: CPU: All CPU(s) started at EL1
Sep 9 04:49:43.770416 kernel: CPU features: detected: 32-bit EL0 Support
Sep 9 04:49:43.770423 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 9 04:49:43.770434 kernel: CPU features: detected: Common not Private translations
Sep 9 04:49:43.770441 kernel: CPU features: detected: CRC32 instructions
Sep 9 04:49:43.770448 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 9 04:49:43.770456 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 9 04:49:43.770463 kernel: CPU features: detected: LSE atomic instructions
Sep 9 04:49:43.770471 kernel: CPU features: detected: Privileged Access Never
Sep 9 04:49:43.770478 kernel: CPU features: detected: RAS Extension Support
Sep 9 04:49:43.770485 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 9 04:49:43.770493 kernel: alternatives: applying system-wide alternatives
Sep 9 04:49:43.770502 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Sep 9 04:49:43.770510 kernel: Memory: 2422372K/2572288K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38976K init, 1038K bss, 127580K reserved, 16384K cma-reserved)
Sep 9 04:49:43.770518 kernel: devtmpfs: initialized
Sep 9 04:49:43.770525 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 04:49:43.770533 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 9 04:49:43.770540 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 9 04:49:43.770554 kernel: 0 pages in range for non-PLT usage
Sep 9 04:49:43.770562 kernel: 508560 pages in range for PLT usage
Sep 9 04:49:43.770569 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 04:49:43.770579 kernel: SMBIOS 3.0.0 present.
Sep 9 04:49:43.770587 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022
Sep 9 04:49:43.770594 kernel: DMI: Memory slots populated: 1/1
Sep 9 04:49:43.770602 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 04:49:43.770609 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 9 04:49:43.770617 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 9 04:49:43.770624 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 9 04:49:43.770632 kernel: audit: initializing netlink subsys (disabled)
Sep 9 04:49:43.770639 kernel: audit: type=2000 audit(0.023:1): state=initialized audit_enabled=0 res=1
Sep 9 04:49:43.770648 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 04:49:43.770655 kernel: cpuidle: using governor menu
Sep 9 04:49:43.770662 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 9 04:49:43.770670 kernel: ASID allocator initialised with 32768 entries
Sep 9 04:49:43.770677 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 04:49:43.770684 kernel: Serial: AMBA PL011 UART driver
Sep 9 04:49:43.770692 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 04:49:43.770699 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 04:49:43.770707 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 9 04:49:43.770715 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 9 04:49:43.770722 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 04:49:43.770730 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 04:49:43.770737 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 9 04:49:43.770744 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 9 04:49:43.770752 kernel: ACPI: Added _OSI(Module Device)
Sep 9 04:49:43.770759 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 04:49:43.770766 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 04:49:43.770773 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 04:49:43.770782 kernel: ACPI: Interpreter enabled
Sep 9 04:49:43.770789 kernel: ACPI: Using GIC for interrupt routing
Sep 9 04:49:43.770797 kernel: ACPI: MCFG table detected, 1 entries
Sep 9 04:49:43.770804 kernel: ACPI: CPU0 has been hot-added
Sep 9 04:49:43.770811 kernel: ACPI: CPU1 has been hot-added
Sep 9 04:49:43.770819 kernel: ACPI: CPU2 has been hot-added
Sep 9 04:49:43.770826 kernel: ACPI: CPU3 has been hot-added
Sep 9 04:49:43.770833 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 9 04:49:43.770840 kernel: printk: legacy console [ttyAMA0] enabled
Sep 9 04:49:43.770849 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 9 04:49:43.770996 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 04:49:43.771065 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 9 04:49:43.771126 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 9 04:49:43.771185 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 9 04:49:43.771242 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 9 04:49:43.771265 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 9 04:49:43.771277 kernel: PCI host bridge to bus 0000:00
Sep 9 04:49:43.771356 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 9 04:49:43.771414 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 9 04:49:43.771468 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 9 04:49:43.771522 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 9 04:49:43.771620 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Sep 9 04:49:43.771700 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 9 04:49:43.771768 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f]
Sep 9 04:49:43.771830 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]
Sep 9 04:49:43.771892 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 9 04:49:43.771955 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Sep 9 04:49:43.772019 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned
Sep 9 04:49:43.772084 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned
Sep 9 04:49:43.772140 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 9 04:49:43.772197 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 9 04:49:43.772272 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 9 04:49:43.772283 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 9 04:49:43.772291 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 9 04:49:43.772299 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 9 04:49:43.772312 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 9 04:49:43.772320 kernel: iommu: Default domain type: Translated
Sep 9 04:49:43.772327 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 9 04:49:43.772337 kernel: efivars: Registered efivars operations
Sep 9 04:49:43.772344 kernel: vgaarb: loaded
Sep 9 04:49:43.772352 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 9 04:49:43.772359 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 04:49:43.772367 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 04:49:43.772374 kernel: pnp: PnP ACPI init
Sep 9 04:49:43.772448 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Sep 9 04:49:43.772459 kernel: pnp: PnP ACPI: found 1 devices
Sep 9 04:49:43.772468 kernel: NET: Registered PF_INET protocol family
Sep 9 04:49:43.772475 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 9 04:49:43.772483 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 9 04:49:43.772490 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 04:49:43.772498 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 9 04:49:43.772506 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 9 04:49:43.772514 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 9 04:49:43.772521 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 04:49:43.772530 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 9 04:49:43.772539 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 04:49:43.772554 kernel: PCI: CLS 0 bytes, default 64
Sep 9 04:49:43.772563 kernel: kvm [1]: HYP mode not available
Sep 9 04:49:43.772571 kernel: Initialise system trusted keyrings
Sep 9 04:49:43.772579 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 9 04:49:43.772587 kernel: Key type asymmetric registered
Sep 9 04:49:43.772594 kernel: Asymmetric key parser 'x509' registered
Sep 9 04:49:43.772602 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 9 04:49:43.772609 kernel: io scheduler mq-deadline registered
Sep 9 04:49:43.772620 kernel: io scheduler kyber registered
Sep 9 04:49:43.772627 kernel: io scheduler bfq registered
Sep 9 04:49:43.772635 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Sep 9 04:49:43.772643 kernel: ACPI: button: Power Button [PWRB]
Sep 9 04:49:43.772651 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Sep 9 04:49:43.772723 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007)
Sep 9 04:49:43.772734 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 9 04:49:43.772742 kernel: thunder_xcv, ver 1.0
Sep 9 04:49:43.772750 kernel: thunder_bgx, ver 1.0
Sep 9 04:49:43.772759 kernel: nicpf, ver 1.0
Sep 9 04:49:43.772767 kernel: nicvf, ver 1.0
Sep 9 04:49:43.772862 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 9 04:49:43.772923 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-09T04:49:43 UTC (1757393383)
Sep 9 04:49:43.772933 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 9 04:49:43.772940 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 9 04:49:43.772948 kernel: watchdog: NMI not fully supported
Sep 9 04:49:43.772955 kernel: watchdog: Hard watchdog permanently disabled
Sep 9 04:49:43.772965 kernel: NET: Registered PF_INET6 protocol family
Sep 9 04:49:43.772973 kernel: Segment Routing with IPv6
Sep 9 04:49:43.772980 kernel: In-situ OAM (IOAM) with IPv6
Sep 9 04:49:43.772988 kernel: NET: Registered PF_PACKET protocol family
Sep 9 04:49:43.772996 kernel: Key type dns_resolver registered
Sep 9 04:49:43.773003 kernel: registered taskstats version 1
Sep 9 04:49:43.773011 kernel: Loading compiled-in X.509 certificates
Sep 9 04:49:43.773019 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 44d1e8b5c5ffbaa3cedd99c03d41580671fabec5'
Sep 9 04:49:43.773026 kernel: Demotion targets for Node 0: null
Sep 9 04:49:43.773035 kernel: Key type .fscrypt registered
Sep 9 04:49:43.773043 kernel: Key type fscrypt-provisioning registered
Sep 9 04:49:43.773050 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 9 04:49:43.773058 kernel: ima: Allocated hash algorithm: sha1
Sep 9 04:49:43.773065 kernel: ima: No architecture policies found
Sep 9 04:49:43.773073 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 9 04:49:43.773080 kernel: clk: Disabling unused clocks
Sep 9 04:49:43.773088 kernel: PM: genpd: Disabling unused power domains
Sep 9 04:49:43.773095 kernel: Warning: unable to open an initial console.
Sep 9 04:49:43.773104 kernel: Freeing unused kernel memory: 38976K
Sep 9 04:49:43.773111 kernel: Run /init as init process
Sep 9 04:49:43.773119 kernel: with arguments:
Sep 9 04:49:43.773126 kernel: /init
Sep 9 04:49:43.773133 kernel: with environment:
Sep 9 04:49:43.773140 kernel: HOME=/
Sep 9 04:49:43.773148 kernel: TERM=linux
Sep 9 04:49:43.773156 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 9 04:49:43.773164 systemd[1]: Successfully made /usr/ read-only.
Sep 9 04:49:43.773176 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 04:49:43.773185 systemd[1]: Detected virtualization kvm.
Sep 9 04:49:43.773193 systemd[1]: Detected architecture arm64.
Sep 9 04:49:43.773200 systemd[1]: Running in initrd.
Sep 9 04:49:43.773208 systemd[1]: No hostname configured, using default hostname.
Sep 9 04:49:43.773216 systemd[1]: Hostname set to .
Sep 9 04:49:43.773224 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 04:49:43.773233 systemd[1]: Queued start job for default target initrd.target.
Sep 9 04:49:43.773241 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:49:43.773284 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:49:43.773305 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 04:49:43.773315 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 04:49:43.773323 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 04:49:43.773332 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 04:49:43.773344 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 04:49:43.773352 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 04:49:43.773360 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:49:43.773368 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:49:43.773376 systemd[1]: Reached target paths.target - Path Units.
Sep 9 04:49:43.773385 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 04:49:43.773392 systemd[1]: Reached target swap.target - Swaps.
Sep 9 04:49:43.773400 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 04:49:43.773410 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 04:49:43.773418 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 04:49:43.773426 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 04:49:43.773434 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 04:49:43.773442 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:49:43.773450 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:49:43.773458 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:49:43.773466 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 04:49:43.773474 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 04:49:43.773483 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 04:49:43.773491 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 9 04:49:43.773499 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 9 04:49:43.773507 systemd[1]: Starting systemd-fsck-usr.service...
Sep 9 04:49:43.773515 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 04:49:43.773523 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 04:49:43.773531 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:49:43.773539 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 9 04:49:43.773558 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:49:43.773586 systemd-journald[243]: Collecting audit messages is disabled.
Sep 9 04:49:43.773609 systemd[1]: Finished systemd-fsck-usr.service.
Sep 9 04:49:43.773617 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 04:49:43.773626 systemd-journald[243]: Journal started
Sep 9 04:49:43.773644 systemd-journald[243]: Runtime Journal (/run/log/journal/8d826837dc81420abcea1d4b8dc8b11d) is 6M, max 48.5M, 42.4M free.
Sep 9 04:49:43.766910 systemd-modules-load[244]: Inserted module 'overlay'
Sep 9 04:49:43.775754 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 04:49:43.779260 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 9 04:49:43.780676 systemd-modules-load[244]: Inserted module 'br_netfilter'
Sep 9 04:49:43.781384 kernel: Bridge firewalling registered
Sep 9 04:49:43.781359 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:49:43.782964 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:49:43.786670 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 9 04:49:43.788601 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 04:49:43.792377 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 04:49:43.804390 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:49:43.808180 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 04:49:43.812364 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:49:43.813425 systemd-tmpfiles[266]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 9 04:49:43.815943 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:49:43.819442 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 04:49:43.821750 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:49:43.825578 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 04:49:43.830792 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 9 04:49:43.851467 dracut-cmdline[291]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=1e9320fd787e27d01e3b8a1acb67e0c640346112c469b7a652e9dcfc9271bf90
Sep 9 04:49:43.859959 systemd-resolved[284]: Positive Trust Anchors:
Sep 9 04:49:43.859978 systemd-resolved[284]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 04:49:43.860008 systemd-resolved[284]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 04:49:43.864895 systemd-resolved[284]: Defaulting to hostname 'linux'.
Sep 9 04:49:43.865889 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 04:49:43.868740 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:49:43.922285 kernel: SCSI subsystem initialized
Sep 9 04:49:43.927265 kernel: Loading iSCSI transport class v2.0-870.
Sep 9 04:49:43.935277 kernel: iscsi: registered transport (tcp)
Sep 9 04:49:43.947280 kernel: iscsi: registered transport (qla4xxx)
Sep 9 04:49:43.947320 kernel: QLogic iSCSI HBA Driver
Sep 9 04:49:43.963531 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 04:49:43.980575 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:49:43.982179 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 04:49:44.029149 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 9 04:49:44.031565 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 9 04:49:44.097283 kernel: raid6: neonx8 gen() 15777 MB/s
Sep 9 04:49:44.114269 kernel: raid6: neonx4 gen() 15738 MB/s
Sep 9 04:49:44.131273 kernel: raid6: neonx2 gen() 13141 MB/s
Sep 9 04:49:44.148276 kernel: raid6: neonx1 gen() 10378 MB/s
Sep 9 04:49:44.165266 kernel: raid6: int64x8 gen() 6874 MB/s
Sep 9 04:49:44.182262 kernel: raid6: int64x4 gen() 7312 MB/s
Sep 9 04:49:44.199267 kernel: raid6: int64x2 gen() 6073 MB/s
Sep 9 04:49:44.216268 kernel: raid6: int64x1 gen() 5022 MB/s
Sep 9 04:49:44.216285 kernel: raid6: using algorithm neonx8 gen() 15777 MB/s
Sep 9 04:49:44.233271 kernel: raid6: .... xor() 11940 MB/s, rmw enabled
Sep 9 04:49:44.233292 kernel: raid6: using neon recovery algorithm
Sep 9 04:49:44.238367 kernel: xor: measuring software checksum speed
Sep 9 04:49:44.238386 kernel: 8regs : 21020 MB/sec
Sep 9 04:49:44.239468 kernel: 32regs : 21687 MB/sec
Sep 9 04:49:44.239485 kernel: arm64_neon : 28225 MB/sec
Sep 9 04:49:44.239495 kernel: xor: using function: arm64_neon (28225 MB/sec)
Sep 9 04:49:44.291280 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 9 04:49:44.299285 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 04:49:44.301810 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:49:44.333420 systemd-udevd[500]: Using default interface naming scheme 'v255'.
Sep 9 04:49:44.337488 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:49:44.339936 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 9 04:49:44.362432 dracut-pre-trigger[511]: rd.md=0: removing MD RAID activation
Sep 9 04:49:44.386069 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 04:49:44.388590 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 04:49:44.436390 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:49:44.439969 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 9 04:49:44.484286 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues
Sep 9 04:49:44.484673 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Sep 9 04:49:44.496382 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 9 04:49:44.496426 kernel: GPT:9289727 != 19775487
Sep 9 04:49:44.496437 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 9 04:49:44.497752 kernel: GPT:9289727 != 19775487
Sep 9 04:49:44.497782 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 9 04:49:44.497797 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 04:49:44.500585 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 04:49:44.500707 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:49:44.510310 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:49:44.514192 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:49:44.532042 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Sep 9 04:49:44.539464 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Sep 9 04:49:44.540931 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:49:44.543598 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 9 04:49:44.559876 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Sep 9 04:49:44.561169 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Sep 9 04:49:44.569538 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 9 04:49:44.575070 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 04:49:44.576420 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 04:49:44.578460 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 04:49:44.581080 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 9 04:49:44.583019 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 9 04:49:44.599941 disk-uuid[593]: Primary Header is updated.
Sep 9 04:49:44.599941 disk-uuid[593]: Secondary Entries is updated.
Sep 9 04:49:44.599941 disk-uuid[593]: Secondary Header is updated.
Sep 9 04:49:44.604005 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 04:49:44.607479 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 04:49:45.616272 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Sep 9 04:49:45.617151 disk-uuid[596]: The operation has completed successfully.
Sep 9 04:49:45.665374 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 9 04:49:45.665472 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 9 04:49:45.692964 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 9 04:49:45.718610 sh[613]: Success
Sep 9 04:49:45.730609 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 9 04:49:45.730647 kernel: device-mapper: uevent: version 1.0.3
Sep 9 04:49:45.731593 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 9 04:49:45.741262 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Sep 9 04:49:45.773402 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 9 04:49:45.776194 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 9 04:49:45.794932 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 9 04:49:45.802414 kernel: BTRFS: device fsid 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (625)
Sep 9 04:49:45.802465 kernel: BTRFS info (device dm-0): first mount of filesystem 72a0ff35-b4e8-4772-9a8d-d0e90c3fb364
Sep 9 04:49:45.804494 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 9 04:49:45.812278 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 9 04:49:45.812324 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 9 04:49:45.813144 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 9 04:49:45.814660 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 04:49:45.815963 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 9 04:49:45.816769 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 9 04:49:45.818383 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 9 04:49:45.841419 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (654)
Sep 9 04:49:45.845385 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:49:45.845443 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 04:49:45.849295 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 04:49:45.849336 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 04:49:45.853274 kernel: BTRFS info (device vda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:49:45.854632 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 9 04:49:45.856556 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 9 04:49:45.928933 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 04:49:45.933413 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 04:49:45.982714 ignition[702]: Ignition 2.22.0
Sep 9 04:49:45.982729 ignition[702]: Stage: fetch-offline
Sep 9 04:49:45.982757 ignition[702]: no configs at "/usr/lib/ignition/base.d"
Sep 9 04:49:45.984348 systemd-networkd[803]: lo: Link UP
Sep 9 04:49:45.982765 ignition[702]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:49:45.984351 systemd-networkd[803]: lo: Gained carrier
Sep 9 04:49:45.982838 ignition[702]: parsed url from cmdline: ""
Sep 9 04:49:45.985034 systemd-networkd[803]: Enumeration completed
Sep 9 04:49:45.982841 ignition[702]: no config URL provided
Sep 9 04:49:45.985132 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 04:49:45.982846 ignition[702]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 04:49:45.985462 systemd-networkd[803]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:49:45.982852 ignition[702]: no config at "/usr/lib/ignition/user.ign"
Sep 9 04:49:45.985465 systemd-networkd[803]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 04:49:45.982871 ignition[702]: op(1): [started] loading QEMU firmware config module
Sep 9 04:49:45.986837 systemd[1]: Reached target network.target - Network.
Sep 9 04:49:45.982875 ignition[702]: op(1): executing: "modprobe" "qemu_fw_cfg"
Sep 9 04:49:45.986974 systemd-networkd[803]: eth0: Link UP
Sep 9 04:49:45.991306 ignition[702]: op(1): [finished] loading QEMU firmware config module
Sep 9 04:49:45.987439 systemd-networkd[803]: eth0: Gained carrier
Sep 9 04:49:45.987450 systemd-networkd[803]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:49:46.007333 systemd-networkd[803]: eth0: DHCPv4 address 10.0.0.33/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 9 04:49:46.046418 ignition[702]: parsing config with SHA512: 56e33a295ffefe7cc73f511eecb1dd8f1f78002ed2067bd513af6d96691b8c94c1ea4497c037d060c6e5e12453d2486912f28b98122e0e724be18f26d078b543
Sep 9 04:49:46.052461 unknown[702]: fetched base config from "system"
Sep 9 04:49:46.052473 unknown[702]: fetched user config from "qemu"
Sep 9 04:49:46.052846 ignition[702]: fetch-offline: fetch-offline passed
Sep 9 04:49:46.052899 ignition[702]: Ignition finished successfully
Sep 9 04:49:46.054766 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 04:49:46.056209 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Sep 9 04:49:46.057044 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 9 04:49:46.088169 ignition[811]: Ignition 2.22.0
Sep 9 04:49:46.088187 ignition[811]: Stage: kargs
Sep 9 04:49:46.088400 ignition[811]: no configs at "/usr/lib/ignition/base.d"
Sep 9 04:49:46.088410 ignition[811]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:49:46.089173 ignition[811]: kargs: kargs passed
Sep 9 04:49:46.089217 ignition[811]: Ignition finished successfully
Sep 9 04:49:46.092961 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 9 04:49:46.094977 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 9 04:49:46.121659 ignition[819]: Ignition 2.22.0
Sep 9 04:49:46.121675 ignition[819]: Stage: disks
Sep 9 04:49:46.121809 ignition[819]: no configs at "/usr/lib/ignition/base.d"
Sep 9 04:49:46.124835 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 9 04:49:46.121818 ignition[819]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:49:46.126016 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 9 04:49:46.122552 ignition[819]: disks: disks passed
Sep 9 04:49:46.127585 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 9 04:49:46.122596 ignition[819]: Ignition finished successfully
Sep 9 04:49:46.129421 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 04:49:46.131070 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 04:49:46.132551 systemd[1]: Reached target basic.target - Basic System.
Sep 9 04:49:46.134939 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 9 04:49:46.163597 systemd-resolved[284]: Detected conflict on linux IN A 10.0.0.33
Sep 9 04:49:46.163610 systemd-resolved[284]: Hostname conflict, changing published hostname from 'linux' to 'linux11'.
Sep 9 04:49:46.166026 systemd-fsck[829]: ROOT: clean, 15/553520 files, 52789/553472 blocks
Sep 9 04:49:46.168990 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 9 04:49:46.174383 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 9 04:49:46.236272 kernel: EXT4-fs (vda9): mounted filesystem 88574756-967d-44b3-be66-46689c8baf27 r/w with ordered data mode. Quota mode: none.
Sep 9 04:49:46.237082 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 9 04:49:46.238489 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 9 04:49:46.240716 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 04:49:46.242315 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 9 04:49:46.243307 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 9 04:49:46.243344 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 9 04:49:46.243365 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 04:49:46.255127 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 9 04:49:46.258145 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 9 04:49:46.262391 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (837)
Sep 9 04:49:46.262412 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:49:46.262423 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 04:49:46.264662 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 04:49:46.264695 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 04:49:46.266137 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 04:49:46.293677 initrd-setup-root[863]: cut: /sysroot/etc/passwd: No such file or directory
Sep 9 04:49:46.296765 initrd-setup-root[870]: cut: /sysroot/etc/group: No such file or directory
Sep 9 04:49:46.300775 initrd-setup-root[877]: cut: /sysroot/etc/shadow: No such file or directory
Sep 9 04:49:46.303978 initrd-setup-root[884]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 9 04:49:46.372002 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 9 04:49:46.374183 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 9 04:49:46.376609 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 9 04:49:46.400264 kernel: BTRFS info (device vda6): last unmount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:49:46.412312 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 9 04:49:46.423209 ignition[952]: INFO : Ignition 2.22.0
Sep 9 04:49:46.423209 ignition[952]: INFO : Stage: mount
Sep 9 04:49:46.425019 ignition[952]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 04:49:46.425019 ignition[952]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:49:46.428567 ignition[952]: INFO : mount: mount passed
Sep 9 04:49:46.428567 ignition[952]: INFO : Ignition finished successfully
Sep 9 04:49:46.429911 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 9 04:49:46.432315 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 9 04:49:46.801128 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 9 04:49:46.802611 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 04:49:46.818310 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (965)
Sep 9 04:49:46.818348 kernel: BTRFS info (device vda6): first mount of filesystem ea68277c-dabb-41e9-9258-b2fe475f0ae6
Sep 9 04:49:46.818359 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 04:49:46.821258 kernel: BTRFS info (device vda6): turning on async discard
Sep 9 04:49:46.821282 kernel: BTRFS info (device vda6): enabling free space tree
Sep 9 04:49:46.822804 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 04:49:46.862819 ignition[983]: INFO : Ignition 2.22.0
Sep 9 04:49:46.862819 ignition[983]: INFO : Stage: files
Sep 9 04:49:46.864654 ignition[983]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 04:49:46.864654 ignition[983]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:49:46.864654 ignition[983]: DEBUG : files: compiled without relabeling support, skipping
Sep 9 04:49:46.864654 ignition[983]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 9 04:49:46.864654 ignition[983]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 9 04:49:46.871025 ignition[983]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 9 04:49:46.871025 ignition[983]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 9 04:49:46.871025 ignition[983]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 9 04:49:46.871025 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Sep 9 04:49:46.871025 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Sep 9 04:49:46.866592 unknown[983]: wrote ssh authorized keys file for user: core
Sep 9 04:49:46.957526 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 9 04:49:47.316526 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Sep 9 04:49:47.318705 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 9 04:49:47.318705 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 9 04:49:47.318705 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 04:49:47.318705 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 04:49:47.318705 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 04:49:47.318705 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 04:49:47.318705 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 04:49:47.318705 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 04:49:47.333166 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 04:49:47.333166 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 04:49:47.333166 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 9 04:49:47.333166 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 9 04:49:47.333166 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 9 04:49:47.333166 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1
Sep 9 04:49:47.399415 systemd-networkd[803]: eth0: Gained IPv6LL
Sep 9 04:49:47.897893 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 9 04:49:48.288062 ignition[983]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 9 04:49:48.288062 ignition[983]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 9 04:49:48.292014 ignition[983]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 04:49:48.292014 ignition[983]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 04:49:48.292014 ignition[983]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 9 04:49:48.292014 ignition[983]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 9 04:49:48.292014 ignition[983]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 9 04:49:48.292014 ignition[983]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 9 04:49:48.292014 ignition[983]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 9 04:49:48.292014 ignition[983]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Sep 9 04:49:48.306875 ignition[983]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 9 04:49:48.310096 ignition[983]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 9 04:49:48.311731 ignition[983]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 9 04:49:48.311731 ignition[983]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Sep 9 04:49:48.311731 ignition[983]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Sep 9 04:49:48.311731 ignition[983]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 04:49:48.311731 ignition[983]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 04:49:48.311731 ignition[983]: INFO : files: files passed
Sep 9 04:49:48.311731 ignition[983]: INFO : Ignition finished successfully
Sep 9 04:49:48.313605 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 9 04:49:48.317377 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 9 04:49:48.321473 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 04:49:48.334506 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 9 04:49:48.335672 initrd-setup-root-after-ignition[1010]: grep: /sysroot/oem/oem-release: No such file or directory
Sep 9 04:49:48.336302 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 9 04:49:48.340185 initrd-setup-root-after-ignition[1013]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 04:49:48.340185 initrd-setup-root-after-ignition[1013]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 04:49:48.343740 initrd-setup-root-after-ignition[1017]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 04:49:48.343162 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 04:49:48.345155 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 9 04:49:48.349392 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 9 04:49:48.390194 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 9 04:49:48.391323 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 9 04:49:48.392778 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 9 04:49:48.394703 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 9 04:49:48.396583 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 9 04:49:48.397456 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 9 04:49:48.424011 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 04:49:48.427590 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 9 04:49:48.453237 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:49:48.454491 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 04:49:48.456587 systemd[1]: Stopped target timers.target - Timer Units.
Sep 9 04:49:48.458447 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 9 04:49:48.458580 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 04:49:48.461219 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 9 04:49:48.463391 systemd[1]: Stopped target basic.target - Basic System.
Sep 9 04:49:48.465154 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 9 04:49:48.467006 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 04:49:48.469064 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 9 04:49:48.471136 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 04:49:48.473312 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 9 04:49:48.475356 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 04:49:48.477486 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 9 04:49:48.479470 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 9 04:49:48.481346 systemd[1]: Stopped target swap.target - Swaps.
Sep 9 04:49:48.483120 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 9 04:49:48.483257 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 04:49:48.485705 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:49:48.487708 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:49:48.489695 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 04:49:48.490334 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:49:48.491605 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 04:49:48.491712 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 04:49:48.494446 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 04:49:48.494566 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 04:49:48.496317 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 04:49:48.497944 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 04:49:48.502299 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:49:48.503356 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 04:49:48.505540 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 04:49:48.507010 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 04:49:48.507098 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 04:49:48.508623 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 04:49:48.508697 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 04:49:48.510138 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 04:49:48.510267 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 04:49:48.512017 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 04:49:48.512115 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 04:49:48.514343 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 04:49:48.515692 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 04:49:48.516824 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 04:49:48.516931 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:49:48.518925 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 04:49:48.519032 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 04:49:48.523910 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 04:49:48.526411 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 04:49:48.534473 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 04:49:48.540805 ignition[1038]: INFO : Ignition 2.22.0
Sep 9 04:49:48.540805 ignition[1038]: INFO : Stage: umount
Sep 9 04:49:48.542516 ignition[1038]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 04:49:48.542516 ignition[1038]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 9 04:49:48.542516 ignition[1038]: INFO : umount: umount passed
Sep 9 04:49:48.542516 ignition[1038]: INFO : Ignition finished successfully
Sep 9 04:49:48.543865 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 04:49:48.543965 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 04:49:48.545878 systemd[1]: Stopped target network.target - Network.
Sep 9 04:49:48.547205 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 04:49:48.547283 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 04:49:48.549129 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 04:49:48.549174 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 04:49:48.551002 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 04:49:48.551049 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 04:49:48.552908 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 04:49:48.552947 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 04:49:48.554887 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 04:49:48.556589 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 04:49:48.564049 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 04:49:48.564188 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 04:49:48.567519 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 04:49:48.567731 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 04:49:48.567768 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:49:48.573955 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 04:49:48.574176 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 04:49:48.574289 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 04:49:48.578980 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 04:49:48.579371 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 04:49:48.581459 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 04:49:48.581499 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:49:48.584655 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 04:49:48.585676 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 04:49:48.585750 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 04:49:48.588201 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 04:49:48.588269 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:49:48.591498 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 04:49:48.591570 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:49:48.595066 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:49:48.599342 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 04:49:48.599639 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 04:49:48.601367 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 04:49:48.603774 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 04:49:48.603851 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 04:49:48.609811 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 04:49:48.609899 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 04:49:48.617845 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 04:49:48.618006 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:49:48.620337 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 04:49:48.620375 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:49:48.621456 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 04:49:48.621488 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:49:48.623535 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 04:49:48.623586 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 04:49:48.626345 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 04:49:48.626391 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 04:49:48.629196 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 04:49:48.629242 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 04:49:48.632051 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 04:49:48.633356 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 04:49:48.633412 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:49:48.636407 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 04:49:48.636448 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:49:48.639789 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 9 04:49:48.639831 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:49:48.643223 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 04:49:48.643277 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:49:48.645982 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 04:49:48.646033 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:49:48.649873 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 04:49:48.649950 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 04:49:48.652744 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 04:49:48.655172 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 04:49:48.671199 systemd[1]: Switching root.
Sep 9 04:49:48.697467 systemd-journald[243]: Journal stopped
Sep 9 04:49:49.447997 systemd-journald[243]: Received SIGTERM from PID 1 (systemd).
Sep 9 04:49:49.448042 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 04:49:49.448053 kernel: SELinux: policy capability open_perms=1
Sep 9 04:49:49.448064 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 04:49:49.448073 kernel: SELinux: policy capability always_check_network=0
Sep 9 04:49:49.448085 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 04:49:49.448096 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 04:49:49.448109 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 04:49:49.448122 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 04:49:49.448131 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 04:49:49.448141 kernel: audit: type=1403 audit(1757393388.876:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 04:49:49.448151 systemd[1]: Successfully loaded SELinux policy in 61.906ms.
Sep 9 04:49:49.448167 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.494ms.
Sep 9 04:49:49.448178 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 04:49:49.448189 systemd[1]: Detected virtualization kvm.
Sep 9 04:49:49.448199 systemd[1]: Detected architecture arm64.
Sep 9 04:49:49.448209 systemd[1]: Detected first boot.
Sep 9 04:49:49.448220 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 04:49:49.448230 zram_generator::config[1084]: No configuration found.
Sep 9 04:49:49.448243 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 04:49:49.448282 systemd[1]: Populated /etc with preset unit settings.
Sep 9 04:49:49.448294 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 04:49:49.448304 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 04:49:49.448314 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 04:49:49.448337 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 04:49:49.448350 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 04:49:49.448361 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 04:49:49.448371 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 04:49:49.448381 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 04:49:49.448391 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 04:49:49.448401 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 04:49:49.448414 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 04:49:49.448424 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 04:49:49.448434 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 04:49:49.448446 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 04:49:49.448457 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 04:49:49.448467 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 04:49:49.448478 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 04:49:49.448488 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 04:49:49.448498 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 9 04:49:49.448508 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 04:49:49.448520 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 04:49:49.448539 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 04:49:49.448549 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 04:49:49.448559 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 04:49:49.448568 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 04:49:49.448578 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 04:49:49.448588 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 04:49:49.448598 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 04:49:49.448607 systemd[1]: Reached target swap.target - Swaps.
Sep 9 04:49:49.448617 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 04:49:49.448628 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 04:49:49.448641 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 04:49:49.448652 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 04:49:49.448661 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 04:49:49.448671 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 04:49:49.448681 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 04:49:49.448694 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 04:49:49.448704 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 04:49:49.448713 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 04:49:49.448724 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 04:49:49.448735 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 04:49:49.448745 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 04:49:49.448755 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 04:49:49.448765 systemd[1]: Reached target machines.target - Containers.
Sep 9 04:49:49.448774 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 04:49:49.448784 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:49:49.448794 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 04:49:49.448805 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 04:49:49.448815 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:49:49.448825 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 04:49:49.448834 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:49:49.448844 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 04:49:49.448853 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:49:49.448863 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 04:49:49.448873 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 04:49:49.448884 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 04:49:49.448894 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 04:49:49.448904 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 04:49:49.448915 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:49:49.448925 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 04:49:49.448935 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 04:49:49.448945 kernel: loop: module loaded
Sep 9 04:49:49.448954 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 04:49:49.448964 kernel: fuse: init (API version 7.41)
Sep 9 04:49:49.448975 kernel: ACPI: bus type drm_connector registered
Sep 9 04:49:49.448986 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 04:49:49.448996 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 04:49:49.449007 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 04:49:49.449017 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 04:49:49.449026 systemd[1]: Stopped verity-setup.service.
Sep 9 04:49:49.449039 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 04:49:49.449049 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 04:49:49.449064 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 04:49:49.449074 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 04:49:49.449084 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 04:49:49.449112 systemd-journald[1152]: Collecting audit messages is disabled.
Sep 9 04:49:49.449134 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 04:49:49.449146 systemd-journald[1152]: Journal started
Sep 9 04:49:49.449168 systemd-journald[1152]: Runtime Journal (/run/log/journal/8d826837dc81420abcea1d4b8dc8b11d) is 6M, max 48.5M, 42.4M free.
Sep 9 04:49:49.237465 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 04:49:49.259087 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 9 04:49:49.259464 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 04:49:49.451048 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 04:49:49.451933 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 04:49:49.453503 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 04:49:49.455028 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 04:49:49.455183 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 04:49:49.456655 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:49:49.456808 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:49:49.458290 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 04:49:49.458451 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 04:49:49.459874 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:49:49.460079 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:49:49.461601 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 04:49:49.461762 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 04:49:49.463099 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:49:49.463297 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:49:49.464673 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 04:49:49.466170 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 04:49:49.467858 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 04:49:49.469581 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 04:49:49.481622 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 04:49:49.484010 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 04:49:49.485756 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 04:49:49.486777 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 04:49:49.486811 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 04:49:49.488420 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 04:49:49.494397 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 04:49:49.495320 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:49:49.496513 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 04:49:49.498494 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 04:49:49.499562 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 04:49:49.500675 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 04:49:49.503322 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 04:49:49.504195 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 04:49:49.507433 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 04:49:49.511207 systemd-journald[1152]: Time spent on flushing to /var/log/journal/8d826837dc81420abcea1d4b8dc8b11d is 18.183ms for 885 entries.
Sep 9 04:49:49.511207 systemd-journald[1152]: System Journal (/var/log/journal/8d826837dc81420abcea1d4b8dc8b11d) is 8M, max 195.6M, 187.6M free.
Sep 9 04:49:49.535603 systemd-journald[1152]: Received client request to flush runtime journal.
Sep 9 04:49:49.535636 kernel: loop0: detected capacity change from 0 to 119368
Sep 9 04:49:49.535648 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 04:49:49.512762 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 04:49:49.515674 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 04:49:49.519168 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 04:49:49.520786 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 04:49:49.530677 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 04:49:49.534134 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 04:49:49.540160 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 04:49:49.541885 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 04:49:49.547090 systemd-tmpfiles[1202]: ACLs are not supported, ignoring.
Sep 9 04:49:49.547345 systemd-tmpfiles[1202]: ACLs are not supported, ignoring.
Sep 9 04:49:49.548488 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 04:49:49.550274 kernel: loop1: detected capacity change from 0 to 100632
Sep 9 04:49:49.553346 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 04:49:49.560503 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 04:49:49.573413 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 04:49:49.581300 kernel: loop2: detected capacity change from 0 to 211168
Sep 9 04:49:49.592030 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 04:49:49.594593 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 04:49:49.604704 kernel: loop3: detected capacity change from 0 to 119368
Sep 9 04:49:49.615266 kernel: loop4: detected capacity change from 0 to 100632
Sep 9 04:49:49.616703 systemd-tmpfiles[1223]: ACLs are not supported, ignoring.
Sep 9 04:49:49.616722 systemd-tmpfiles[1223]: ACLs are not supported, ignoring.
Sep 9 04:49:49.619832 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 04:49:49.620395 kernel: loop5: detected capacity change from 0 to 211168
Sep 9 04:49:49.625285 (sd-merge)[1224]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Sep 9 04:49:49.625664 (sd-merge)[1224]: Merged extensions into '/usr'.
Sep 9 04:49:49.628901 systemd[1]: Reload requested from client PID 1201 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 04:49:49.628920 systemd[1]: Reloading...
Sep 9 04:49:49.693799 zram_generator::config[1248]: No configuration found.
Sep 9 04:49:49.753701 ldconfig[1196]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 9 04:49:49.842894 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 04:49:49.843185 systemd[1]: Reloading finished in 213 ms.
Sep 9 04:49:49.866081 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 9 04:49:49.867678 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 04:49:49.879516 systemd[1]: Starting ensure-sysext.service...
Sep 9 04:49:49.881201 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 04:49:49.890743 systemd[1]: Reload requested from client PID 1287 ('systemctl') (unit ensure-sysext.service)...
Sep 9 04:49:49.890761 systemd[1]: Reloading...
Sep 9 04:49:49.900276 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 04:49:49.900307 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 04:49:49.900533 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 04:49:49.900726 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 04:49:49.901751 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 04:49:49.902040 systemd-tmpfiles[1288]: ACLs are not supported, ignoring.
Sep 9 04:49:49.902141 systemd-tmpfiles[1288]: ACLs are not supported, ignoring.
Sep 9 04:49:49.904904 systemd-tmpfiles[1288]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 04:49:49.905005 systemd-tmpfiles[1288]: Skipping /boot
Sep 9 04:49:49.910803 systemd-tmpfiles[1288]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 04:49:49.910892 systemd-tmpfiles[1288]: Skipping /boot
Sep 9 04:49:49.944344 zram_generator::config[1315]: No configuration found.
Sep 9 04:49:50.072931 systemd[1]: Reloading finished in 181 ms.
Sep 9 04:49:50.092860 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 04:49:50.099649 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 04:49:50.105480 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 04:49:50.107699 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 04:49:50.109607 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 04:49:50.111902 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 04:49:50.117678 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 04:49:50.121363 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 04:49:50.133184 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 04:49:50.136234 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:49:50.138391 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:49:50.141570 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:49:50.146830 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:49:50.149817 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:49:50.149934 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:49:50.157375 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 9 04:49:50.161455 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:49:50.161694 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:49:50.163628 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:49:50.163787 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:49:50.165274 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:49:50.165420 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:49:50.171748 systemd-udevd[1359]: Using default interface naming scheme 'v255'.
Sep 9 04:49:50.174064 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 9 04:49:50.175766 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:49:50.179638 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:49:50.181100 augenrules[1388]: No rules
Sep 9 04:49:50.188488 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:49:50.193489 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:49:50.194537 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:49:50.194708 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:49:50.196172 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 9 04:49:50.200043 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 04:49:50.201807 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 04:49:50.202040 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 04:49:50.205102 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 9 04:49:50.208001 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:49:50.208253 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:49:50.211164 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:49:50.211467 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:49:50.218535 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 9 04:49:50.233074 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:49:50.234324 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:49:50.237279 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 9 04:49:50.241916 systemd-resolved[1354]: Positive Trust Anchors:
Sep 9 04:49:50.242176 systemd-resolved[1354]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 04:49:50.242275 systemd-resolved[1354]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 04:49:50.248994 systemd-resolved[1354]: Defaulting to hostname 'linux'.
Sep 9 04:49:50.250710 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 04:49:50.256403 systemd[1]: Finished ensure-sysext.service.
Sep 9 04:49:50.260409 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 04:49:50.265406 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 04:49:50.266369 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 04:49:50.268040 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 04:49:50.281077 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 04:49:50.283469 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 04:49:50.286377 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 04:49:50.287365 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 04:49:50.287441 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 04:49:50.290434 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 04:49:50.295462 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 9 04:49:50.296576 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 9 04:49:50.299163 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 04:49:50.299360 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 04:49:50.300877 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 04:49:50.301281 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 04:49:50.303745 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 04:49:50.303888 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 04:49:50.310628 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 9 04:49:50.313613 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 04:49:50.323525 augenrules[1433]: /sbin/augenrules: No change
Sep 9 04:49:50.326846 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 04:49:50.327025 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 04:49:50.328750 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 04:49:50.340471 augenrules[1468]: No rules
Sep 9 04:49:50.342650 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 04:49:50.343605 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 04:49:50.350953 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 9 04:49:50.354457 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 9 04:49:50.371598 systemd-networkd[1444]: lo: Link UP
Sep 9 04:49:50.371607 systemd-networkd[1444]: lo: Gained carrier
Sep 9 04:49:50.373632 systemd-networkd[1444]: Enumeration completed
Sep 9 04:49:50.373733 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 04:49:50.374055 systemd-networkd[1444]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:49:50.374063 systemd-networkd[1444]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 04:49:50.374658 systemd-networkd[1444]: eth0: Link UP
Sep 9 04:49:50.374766 systemd-networkd[1444]: eth0: Gained carrier
Sep 9 04:49:50.374786 systemd-networkd[1444]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 04:49:50.375102 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 9 04:49:50.376912 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 9 04:49:50.379283 systemd[1]: Reached target network.target - Network.
Sep 9 04:49:50.380441 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 04:49:50.382215 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 9 04:49:50.383860 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 9 04:49:50.385191 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 9 04:49:50.386508 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 9 04:49:50.386550 systemd[1]: Reached target paths.target - Path Units.
Sep 9 04:49:50.387478 systemd-networkd[1444]: eth0: DHCPv4 address 10.0.0.33/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 9 04:49:50.387872 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 04:49:50.388240 systemd-timesyncd[1445]: Network configuration changed, trying to establish connection.
Sep 9 04:49:50.389151 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 9 04:49:50.390353 systemd-timesyncd[1445]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Sep 9 04:49:50.390402 systemd-timesyncd[1445]: Initial clock synchronization to Tue 2025-09-09 04:49:50.276614 UTC.
Sep 9 04:49:50.390465 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 9 04:49:50.391911 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 04:49:50.393696 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 9 04:49:50.395997 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 9 04:49:50.398611 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 9 04:49:50.399770 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 9 04:49:50.401417 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 9 04:49:50.404471 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 9 04:49:50.405724 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 9 04:49:50.408535 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 9 04:49:50.413378 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 9 04:49:50.414804 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 9 04:49:50.415824 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 04:49:50.416810 systemd[1]: Reached target basic.target - Basic System.
Sep 9 04:49:50.417809 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 9 04:49:50.417841 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 9 04:49:50.418810 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 9 04:49:50.422263 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 9 04:49:50.424869 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 9 04:49:50.428134 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 9 04:49:50.430493 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 9 04:49:50.431271 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 9 04:49:50.432478 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 9 04:49:50.435826 jq[1499]: false
Sep 9 04:49:50.436828 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 9 04:49:50.440427 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 9 04:49:50.443781 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 9 04:49:50.447647 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 9 04:49:50.451539 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 9 04:49:50.454403 extend-filesystems[1500]: Found /dev/vda6
Sep 9 04:49:50.458399 extend-filesystems[1500]: Found /dev/vda9
Sep 9 04:49:50.459419 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 9 04:49:50.462814 systemd[1]: Starting update-engine.service - Update Engine...
Sep 9 04:49:50.465717 extend-filesystems[1500]: Checking size of /dev/vda9
Sep 9 04:49:50.466365 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 9 04:49:50.468298 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 9 04:49:50.472921 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 9 04:49:50.475678 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 9 04:49:50.475852 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 9 04:49:50.476146 systemd[1]: motdgen.service: Deactivated successfully.
Sep 9 04:49:50.476395 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 9 04:49:50.479318 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 9 04:49:50.479484 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 9 04:49:50.493219 update_engine[1516]: I20250909 04:49:50.492991 1516 main.cc:92] Flatcar Update Engine starting
Sep 9 04:49:50.504898 jq[1519]: true
Sep 9 04:49:50.505439 extend-filesystems[1500]: Resized partition /dev/vda9
Sep 9 04:49:50.509680 extend-filesystems[1538]: resize2fs 1.47.3 (8-Jul-2025)
Sep 9 04:49:50.512106 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 04:49:50.514540 tar[1524]: linux-arm64/LICENSE
Sep 9 04:49:50.518044 tar[1524]: linux-arm64/helm
Sep 9 04:49:50.519112 (ntainerd)[1540]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 9 04:49:50.521302 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Sep 9 04:49:50.529947 jq[1539]: true
Sep 9 04:49:50.535592 dbus-daemon[1497]: [system] SELinux support is enabled
Sep 9 04:49:50.535747 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 9 04:49:50.543275 update_engine[1516]: I20250909 04:49:50.540925 1516 update_check_scheduler.cc:74] Next update check in 4m58s
Sep 9 04:49:50.550857 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Sep 9 04:49:50.544328 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 9 04:49:50.544361 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 9 04:49:50.546590 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 9 04:49:50.546606 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 9 04:49:50.548109 systemd[1]: Started update-engine.service - Update Engine.
Sep 9 04:49:50.561758 extend-filesystems[1538]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Sep 9 04:49:50.561758 extend-filesystems[1538]: old_desc_blocks = 1, new_desc_blocks = 1
Sep 9 04:49:50.561758 extend-filesystems[1538]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Sep 9 04:49:50.565097 extend-filesystems[1500]: Resized filesystem in /dev/vda9
Sep 9 04:49:50.567104 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 9 04:49:50.568766 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 9 04:49:50.569078 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 9 04:49:50.614352 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 04:49:50.620106 bash[1565]: Updated "/home/core/.ssh/authorized_keys"
Sep 9 04:49:50.621769 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 9 04:49:50.631115 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 9 04:49:50.643860 systemd-logind[1507]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 9 04:49:50.644864 systemd-logind[1507]: New seat seat0.
Sep 9 04:49:50.646173 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 9 04:49:50.686003 locksmithd[1546]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 9 04:49:50.723206 containerd[1540]: time="2025-09-09T04:49:50Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 9 04:49:50.725845 containerd[1540]: time="2025-09-09T04:49:50.725754680Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 9 04:49:50.734602 containerd[1540]: time="2025-09-09T04:49:50.734564080Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.2µs"
Sep 9 04:49:50.734686 containerd[1540]: time="2025-09-09T04:49:50.734670040Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 9 04:49:50.734740 containerd[1540]: time="2025-09-09T04:49:50.734726400Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 9 04:49:50.734920 containerd[1540]: time="2025-09-09T04:49:50.734901720Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 9 04:49:50.734979 containerd[1540]: time="2025-09-09T04:49:50.734965640Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 9 04:49:50.735056 containerd[1540]: time="2025-09-09T04:49:50.735043400Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 04:49:50.735159 containerd[1540]: time="2025-09-09T04:49:50.735140800Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 04:49:50.735211 containerd[1540]: time="2025-09-09T04:49:50.735198600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 04:49:50.735495 containerd[1540]: time="2025-09-09T04:49:50.735471760Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 04:49:50.735572 containerd[1540]: time="2025-09-09T04:49:50.735556840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 04:49:50.735622 containerd[1540]: time="2025-09-09T04:49:50.735609880Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 04:49:50.736224 containerd[1540]: time="2025-09-09T04:49:50.735668920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 9 04:49:50.736224 containerd[1540]: time="2025-09-09T04:49:50.735750280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 9 04:49:50.736224 containerd[1540]: time="2025-09-09T04:49:50.735973840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 04:49:50.736224 containerd[1540]: time="2025-09-09T04:49:50.736001360Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 04:49:50.736224 containerd[1540]: time="2025-09-09T04:49:50.736010880Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 9 04:49:50.736224 containerd[1540]: time="2025-09-09T04:49:50.736043720Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 9 04:49:50.736472 containerd[1540]: time="2025-09-09T04:49:50.736266280Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 9 04:49:50.736472 containerd[1540]: time="2025-09-09T04:49:50.736324360Z" level=info msg="metadata content store policy set" policy=shared
Sep 9 04:49:50.741103 containerd[1540]: time="2025-09-09T04:49:50.741064520Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 9 04:49:50.741163 containerd[1540]: time="2025-09-09T04:49:50.741119640Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 9 04:49:50.741163 containerd[1540]: time="2025-09-09T04:49:50.741143320Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 9 04:49:50.741163 containerd[1540]: time="2025-09-09T04:49:50.741155320Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 9 04:49:50.741213 containerd[1540]: time="2025-09-09T04:49:50.741167640Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 9 04:49:50.741213 containerd[1540]: time="2025-09-09T04:49:50.741178920Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 9 04:49:50.741213 containerd[1540]: time="2025-09-09T04:49:50.741192640Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 9 04:49:50.741213 containerd[1540]: time="2025-09-09T04:49:50.741207960Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 9 04:49:50.741295 containerd[1540]: time="2025-09-09T04:49:50.741218760Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 9 04:49:50.741295 containerd[1540]: time="2025-09-09T04:49:50.741229400Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 9 04:49:50.741295 containerd[1540]: time="2025-09-09T04:49:50.741238200Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 9 04:49:50.741295 containerd[1540]: time="2025-09-09T04:49:50.741264600Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 9 04:49:50.741393 containerd[1540]: time="2025-09-09T04:49:50.741368920Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 9 04:49:50.741419 containerd[1540]: time="2025-09-09T04:49:50.741394240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 9 04:49:50.741419 containerd[1540]: time="2025-09-09T04:49:50.741410400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 9 04:49:50.741497 containerd[1540]: time="2025-09-09T04:49:50.741425760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 9 04:49:50.741497 containerd[1540]: time="2025-09-09T04:49:50.741436240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 9 04:49:50.741497 containerd[1540]: time="2025-09-09T04:49:50.741447760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 9 04:49:50.741497 containerd[1540]: time="2025-09-09T04:49:50.741459040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 9 04:49:50.741497 containerd[1540]: time="2025-09-09T04:49:50.741469440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 9 04:49:50.741497 containerd[1540]: time="2025-09-09T04:49:50.741479720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 9 04:49:50.741497 containerd[1540]: time="2025-09-09T04:49:50.741489960Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 9 04:49:50.741497 containerd[1540]: time="2025-09-09T04:49:50.741500160Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 9 04:49:50.741716 containerd[1540]: time="2025-09-09T04:49:50.741684040Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 9 04:49:50.741716 containerd[1540]: time="2025-09-09T04:49:50.741704520Z" level=info msg="Start snapshots syncer"
Sep 9 04:49:50.742238 containerd[1540]: time="2025-09-09T04:49:50.741730080Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 9 04:49:50.742238 containerd[1540]: time="2025-09-09T04:49:50.741949080Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 9 04:49:50.742373 containerd[1540]: time="2025-09-09T04:49:50.741997280Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 9 04:49:50.742373 containerd[1540]: time="2025-09-09T04:49:50.742132160Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 9 04:49:50.742580 containerd[1540]: time="2025-09-09T04:49:50.742551600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 9 04:49:50.742659 containerd[1540]: time="2025-09-09T04:49:50.742645400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 9 04:49:50.742710 containerd[1540]: time="2025-09-09T04:49:50.742698000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 9 04:49:50.742763 containerd[1540]: time="2025-09-09T04:49:50.742749440Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 9 04:49:50.742829 containerd[1540]: time="2025-09-09T04:49:50.742814720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 9 04:49:50.742877 containerd[1540]: time="2025-09-09T04:49:50.742865600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 9 04:49:50.742926 containerd[1540]: time="2025-09-09T04:49:50.742913520Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 9 04:49:50.743001 containerd[1540]: time="2025-09-09T04:49:50.742986960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 9 04:49:50.743054 containerd[1540]: time="2025-09-09T04:49:50.743041800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 9 04:49:50.743104 containerd[1540]: time="2025-09-09T04:49:50.743091600Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 9 04:49:50.743190 containerd[1540]: time="2025-09-09T04:49:50.743176720Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 04:49:50.743344 containerd[1540]: time="2025-09-09T04:49:50.743323560Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 04:49:50.743401 containerd[1540]: time="2025-09-09T04:49:50.743387640Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 04:49:50.743451 containerd[1540]: time="2025-09-09T04:49:50.743438840Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 04:49:50.743498 containerd[1540]: time="2025-09-09T04:49:50.743485960Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 9 04:49:50.743560 containerd[1540]: time="2025-09-09T04:49:50.743547000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 9 04:49:50.743614 containerd[1540]: time="2025-09-09T04:49:50.743601400Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 9 04:49:50.743762 containerd[1540]: time="2025-09-09T04:49:50.743748080Z" level=info msg="runtime interface created"
Sep 9 04:49:50.743806 containerd[1540]: time="2025-09-09T04:49:50.743794960Z" level=info msg="created NRI interface"
Sep 9 04:49:50.743854 containerd[1540]: time="2025-09-09T04:49:50.743842280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 9 04:49:50.743903 containerd[1540]: time="2025-09-09T04:49:50.743892160Z" level=info msg="Connect containerd service"
Sep 9 04:49:50.743975 containerd[1540]: time="2025-09-09T04:49:50.743960520Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 9 04:49:50.745972 containerd[1540]: time="2025-09-09T04:49:50.745936440Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 9 04:49:50.812258 containerd[1540]: time="2025-09-09T04:49:50.812146400Z" level=info msg="Start subscribing containerd event"
Sep 9 04:49:50.812258 containerd[1540]: time="2025-09-09T04:49:50.812237440Z" level=info msg="Start recovering state"
Sep 9 04:49:50.812396 containerd[1540]: time="2025-09-09T04:49:50.812344160Z" level=info msg="Start event monitor"
Sep 9 04:49:50.812396 containerd[1540]: time="2025-09-09T04:49:50.812358680Z" level=info msg="Start cni network conf syncer for default"
Sep 9 04:49:50.812396 containerd[1540]: time="2025-09-09T04:49:50.812368040Z" level=info msg="Start streaming server"
Sep 9 04:49:50.812396 containerd[1540]: time="2025-09-09T04:49:50.812376320Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 9 04:49:50.812396 containerd[1540]: time="2025-09-09T04:49:50.812383640Z" level=info msg="runtime interface starting up..."
Sep 9 04:49:50.812396 containerd[1540]: time="2025-09-09T04:49:50.812389200Z" level=info msg="starting plugins..."
Sep 9 04:49:50.812492 containerd[1540]: time="2025-09-09T04:49:50.812402640Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 9 04:49:50.812887 containerd[1540]: time="2025-09-09T04:49:50.812867600Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 9 04:49:50.813298 containerd[1540]: time="2025-09-09T04:49:50.813277680Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 9 04:49:50.813464 systemd[1]: Started containerd.service - containerd container runtime.
Sep 9 04:49:50.816360 containerd[1540]: time="2025-09-09T04:49:50.814550840Z" level=info msg="containerd successfully booted in 0.090479s"
Sep 9 04:49:50.859786 tar[1524]: linux-arm64/README.md
Sep 9 04:49:50.877218 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 9 04:49:51.346021 sshd_keygen[1522]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 9 04:49:51.366291 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 9 04:49:51.369508 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 9 04:49:51.396708 systemd[1]: issuegen.service: Deactivated successfully.
Sep 9 04:49:51.396912 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 9 04:49:51.399575 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 9 04:49:51.427709 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 9 04:49:51.431563 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 9 04:49:51.433716 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Sep 9 04:49:51.434963 systemd[1]: Reached target getty.target - Login Prompts.
Sep 9 04:49:52.327427 systemd-networkd[1444]: eth0: Gained IPv6LL
Sep 9 04:49:52.329654 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 9 04:49:52.332597 systemd[1]: Reached target network-online.target - Network is Online.
Sep 9 04:49:52.335155 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
Sep 9 04:49:52.337332 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:49:52.339131 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 9 04:49:52.365025 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 9 04:49:52.367020 systemd[1]: coreos-metadata.service: Deactivated successfully.
Sep 9 04:49:52.368300 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
Sep 9 04:49:52.370580 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 9 04:49:52.919452 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:49:52.921140 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 9 04:49:52.923191 (kubelet)[1638]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:49:52.927392 systemd[1]: Startup finished in 2.008s (kernel) + 5.261s (initrd) + 4.113s (userspace) = 11.383s.
Sep 9 04:49:53.274446 kubelet[1638]: E0909 04:49:53.274310 1638 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:49:53.276744 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:49:53.276878 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:49:53.277202 systemd[1]: kubelet.service: Consumed 754ms CPU time, 258.1M memory peak.
Sep 9 04:49:56.903527 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 9 04:49:56.908355 systemd[1]: Started sshd@0-10.0.0.33:22-10.0.0.1:51822.service - OpenSSH per-connection server daemon (10.0.0.1:51822).
Sep 9 04:49:57.022313 sshd[1651]: Accepted publickey for core from 10.0.0.1 port 51822 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:49:57.025495 sshd-session[1651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:49:57.038094 systemd-logind[1507]: New session 1 of user core.
Sep 9 04:49:57.038860 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 9 04:49:57.040200 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 9 04:49:57.084408 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 9 04:49:57.088549 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 9 04:49:57.106093 (systemd)[1656]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 9 04:49:57.109753 systemd-logind[1507]: New session c1 of user core.
Sep 9 04:49:57.236518 systemd[1656]: Queued start job for default target default.target.
Sep 9 04:49:57.247142 systemd[1656]: Created slice app.slice - User Application Slice.
Sep 9 04:49:57.247170 systemd[1656]: Reached target paths.target - Paths.
Sep 9 04:49:57.247211 systemd[1656]: Reached target timers.target - Timers.
Sep 9 04:49:57.248426 systemd[1656]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 9 04:49:57.258303 systemd[1656]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 9 04:49:57.258450 systemd[1656]: Reached target sockets.target - Sockets.
Sep 9 04:49:57.258497 systemd[1656]: Reached target basic.target - Basic System.
Sep 9 04:49:57.258531 systemd[1656]: Reached target default.target - Main User Target.
Sep 9 04:49:57.258556 systemd[1656]: Startup finished in 142ms.
Sep 9 04:49:57.258649 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 9 04:49:57.259932 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 9 04:49:57.323792 systemd[1]: Started sshd@1-10.0.0.33:22-10.0.0.1:51826.service - OpenSSH per-connection server daemon (10.0.0.1:51826).
Sep 9 04:49:57.402333 sshd[1667]: Accepted publickey for core from 10.0.0.1 port 51826 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:49:57.404054 sshd-session[1667]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:49:57.411328 systemd-logind[1507]: New session 2 of user core.
Sep 9 04:49:57.421698 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 9 04:49:57.474875 sshd[1670]: Connection closed by 10.0.0.1 port 51826
Sep 9 04:49:57.475349 sshd-session[1667]: pam_unix(sshd:session): session closed for user core
Sep 9 04:49:57.496076 systemd[1]: sshd@1-10.0.0.33:22-10.0.0.1:51826.service: Deactivated successfully.
Sep 9 04:49:57.498951 systemd[1]: session-2.scope: Deactivated successfully.
Sep 9 04:49:57.499653 systemd-logind[1507]: Session 2 logged out. Waiting for processes to exit.
Sep 9 04:49:57.502499 systemd[1]: Started sshd@2-10.0.0.33:22-10.0.0.1:51840.service - OpenSSH per-connection server daemon (10.0.0.1:51840).
Sep 9 04:49:57.503459 systemd-logind[1507]: Removed session 2.
Sep 9 04:49:57.559841 sshd[1676]: Accepted publickey for core from 10.0.0.1 port 51840 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:49:57.561086 sshd-session[1676]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:49:57.565435 systemd-logind[1507]: New session 3 of user core.
Sep 9 04:49:57.572411 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 9 04:49:57.622476 sshd[1679]: Connection closed by 10.0.0.1 port 51840
Sep 9 04:49:57.622921 sshd-session[1676]: pam_unix(sshd:session): session closed for user core
Sep 9 04:49:57.631096 systemd[1]: sshd@2-10.0.0.33:22-10.0.0.1:51840.service: Deactivated successfully.
Sep 9 04:49:57.633572 systemd[1]: session-3.scope: Deactivated successfully.
Sep 9 04:49:57.634967 systemd-logind[1507]: Session 3 logged out. Waiting for processes to exit.
Sep 9 04:49:57.636286 systemd[1]: Started sshd@3-10.0.0.33:22-10.0.0.1:51856.service - OpenSSH per-connection server daemon (10.0.0.1:51856).
Sep 9 04:49:57.637333 systemd-logind[1507]: Removed session 3.
Sep 9 04:49:57.683382 sshd[1685]: Accepted publickey for core from 10.0.0.1 port 51856 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:49:57.684592 sshd-session[1685]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:49:57.690058 systemd-logind[1507]: New session 4 of user core.
Sep 9 04:49:57.701461 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 9 04:49:57.753344 sshd[1688]: Connection closed by 10.0.0.1 port 51856
Sep 9 04:49:57.753644 sshd-session[1685]: pam_unix(sshd:session): session closed for user core
Sep 9 04:49:57.769319 systemd[1]: sshd@3-10.0.0.33:22-10.0.0.1:51856.service: Deactivated successfully.
Sep 9 04:49:57.771639 systemd[1]: session-4.scope: Deactivated successfully.
Sep 9 04:49:57.773850 systemd-logind[1507]: Session 4 logged out. Waiting for processes to exit.
Sep 9 04:49:57.776041 systemd[1]: Started sshd@4-10.0.0.33:22-10.0.0.1:51864.service - OpenSSH per-connection server daemon (10.0.0.1:51864).
Sep 9 04:49:57.776550 systemd-logind[1507]: Removed session 4.
Sep 9 04:49:57.821775 sshd[1694]: Accepted publickey for core from 10.0.0.1 port 51864 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:49:57.823295 sshd-session[1694]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:49:57.830839 systemd-logind[1507]: New session 5 of user core.
Sep 9 04:49:57.837428 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 9 04:49:57.901777 sudo[1698]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 9 04:49:57.902056 sudo[1698]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:49:57.920161 sudo[1698]: pam_unix(sudo:session): session closed for user root
Sep 9 04:49:57.921782 sshd[1697]: Connection closed by 10.0.0.1 port 51864
Sep 9 04:49:57.922381 sshd-session[1694]: pam_unix(sshd:session): session closed for user core
Sep 9 04:49:57.934496 systemd[1]: sshd@4-10.0.0.33:22-10.0.0.1:51864.service: Deactivated successfully.
Sep 9 04:49:57.936215 systemd[1]: session-5.scope: Deactivated successfully.
Sep 9 04:49:57.937769 systemd-logind[1507]: Session 5 logged out. Waiting for processes to exit.
Sep 9 04:49:57.939913 systemd[1]: Started sshd@5-10.0.0.33:22-10.0.0.1:51876.service - OpenSSH per-connection server daemon (10.0.0.1:51876).
Sep 9 04:49:57.941152 systemd-logind[1507]: Removed session 5.
Sep 9 04:49:58.024182 sshd[1704]: Accepted publickey for core from 10.0.0.1 port 51876 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:49:58.026032 sshd-session[1704]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:49:58.030090 systemd-logind[1507]: New session 6 of user core.
Sep 9 04:49:58.049417 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 9 04:49:58.101926 sudo[1709]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 9 04:49:58.102205 sudo[1709]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:49:58.216269 sudo[1709]: pam_unix(sudo:session): session closed for user root
Sep 9 04:49:58.221194 sudo[1708]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 9 04:49:58.221485 sudo[1708]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:49:58.230232 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 04:49:58.286062 augenrules[1731]: No rules
Sep 9 04:49:58.290023 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 04:49:58.290224 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 04:49:58.292116 sudo[1708]: pam_unix(sudo:session): session closed for user root
Sep 9 04:49:58.293784 sshd[1707]: Connection closed by 10.0.0.1 port 51876
Sep 9 04:49:58.293649 sshd-session[1704]: pam_unix(sshd:session): session closed for user core
Sep 9 04:49:58.309354 systemd[1]: sshd@5-10.0.0.33:22-10.0.0.1:51876.service: Deactivated successfully.
Sep 9 04:49:58.311225 systemd[1]: session-6.scope: Deactivated successfully.
Sep 9 04:49:58.312317 systemd-logind[1507]: Session 6 logged out. Waiting for processes to exit.
Sep 9 04:49:58.314877 systemd[1]: Started sshd@6-10.0.0.33:22-10.0.0.1:51884.service - OpenSSH per-connection server daemon (10.0.0.1:51884).
Sep 9 04:49:58.315620 systemd-logind[1507]: Removed session 6.
Sep 9 04:49:58.379540 sshd[1740]: Accepted publickey for core from 10.0.0.1 port 51884 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:49:58.380921 sshd-session[1740]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:49:58.385208 systemd-logind[1507]: New session 7 of user core.
Sep 9 04:49:58.400464 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 9 04:49:58.451185 sudo[1744]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 9 04:49:58.451454 sudo[1744]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 04:49:58.725858 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 9 04:49:58.750602 (dockerd)[1764]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 9 04:49:58.946089 dockerd[1764]: time="2025-09-09T04:49:58.946029368Z" level=info msg="Starting up"
Sep 9 04:49:58.946838 dockerd[1764]: time="2025-09-09T04:49:58.946819097Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 9 04:49:58.956394 dockerd[1764]: time="2025-09-09T04:49:58.956361398Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 9 04:49:58.988231 dockerd[1764]: time="2025-09-09T04:49:58.987943956Z" level=info msg="Loading containers: start."
Sep 9 04:49:58.997264 kernel: Initializing XFRM netlink socket
Sep 9 04:49:59.178498 systemd-networkd[1444]: docker0: Link UP
Sep 9 04:49:59.181871 dockerd[1764]: time="2025-09-09T04:49:59.181833566Z" level=info msg="Loading containers: done."
Sep 9 04:49:59.196101 dockerd[1764]: time="2025-09-09T04:49:59.195808366Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 9 04:49:59.196101 dockerd[1764]: time="2025-09-09T04:49:59.195879616Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 9 04:49:59.196101 dockerd[1764]: time="2025-09-09T04:49:59.195949593Z" level=info msg="Initializing buildkit"
Sep 9 04:49:59.222180 dockerd[1764]: time="2025-09-09T04:49:59.222134889Z" level=info msg="Completed buildkit initialization"
Sep 9 04:49:59.226870 dockerd[1764]: time="2025-09-09T04:49:59.226832204Z" level=info msg="Daemon has completed initialization"
Sep 9 04:49:59.226980 dockerd[1764]: time="2025-09-09T04:49:59.226869859Z" level=info msg="API listen on /run/docker.sock"
Sep 9 04:49:59.227058 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 9 04:49:59.993376 containerd[1540]: time="2025-09-09T04:49:59.993263480Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\""
Sep 9 04:50:00.610733 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3423184534.mount: Deactivated successfully.
Sep 9 04:50:02.098541 containerd[1540]: time="2025-09-09T04:50:02.097840731Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:02.099070 containerd[1540]: time="2025-09-09T04:50:02.098596770Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.4: active requests=0, bytes read=27352615"
Sep 9 04:50:02.099508 containerd[1540]: time="2025-09-09T04:50:02.099477002Z" level=info msg="ImageCreate event name:\"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:02.103861 containerd[1540]: time="2025-09-09T04:50:02.103603302Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:02.104749 containerd[1540]: time="2025-09-09T04:50:02.104554860Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.4\" with image id \"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\", size \"27349413\" in 2.111218759s"
Sep 9 04:50:02.104749 containerd[1540]: time="2025-09-09T04:50:02.104628099Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\" returns image reference \"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\""
Sep 9 04:50:02.106205 containerd[1540]: time="2025-09-09T04:50:02.105856788Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\""
Sep 9 04:50:03.387646 containerd[1540]: time="2025-09-09T04:50:03.387598422Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:03.388500 containerd[1540]: time="2025-09-09T04:50:03.388089812Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.4: active requests=0, bytes read=23536979"
Sep 9 04:50:03.389263 containerd[1540]: time="2025-09-09T04:50:03.389222721Z" level=info msg="ImageCreate event name:\"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:03.391926 containerd[1540]: time="2025-09-09T04:50:03.391894453Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:03.393695 containerd[1540]: time="2025-09-09T04:50:03.393665809Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.4\" with image id \"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\", size \"25093155\" in 1.28777788s"
Sep 9 04:50:03.393856 containerd[1540]: time="2025-09-09T04:50:03.393759979Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\" returns image reference \"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\""
Sep 9 04:50:03.394143 containerd[1540]: time="2025-09-09T04:50:03.394125729Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\""
Sep 9 04:50:03.527342 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 9 04:50:03.528668 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:50:03.665693 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:50:03.669375 (kubelet)[2050]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:50:03.704866 kubelet[2050]: E0909 04:50:03.704802 2050 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:50:03.708059 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:50:03.708187 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:50:03.709381 systemd[1]: kubelet.service: Consumed 141ms CPU time, 108.5M memory peak.
Sep 9 04:50:04.794758 containerd[1540]: time="2025-09-09T04:50:04.794708714Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:04.797910 containerd[1540]: time="2025-09-09T04:50:04.797861999Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.4: active requests=0, bytes read=18292016"
Sep 9 04:50:04.798980 containerd[1540]: time="2025-09-09T04:50:04.798932751Z" level=info msg="ImageCreate event name:\"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:04.801270 containerd[1540]: time="2025-09-09T04:50:04.801210473Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:04.802707 containerd[1540]: time="2025-09-09T04:50:04.802662667Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.4\" with image id \"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\", size \"19848210\" in 1.408460235s"
Sep 9 04:50:04.802707 containerd[1540]: time="2025-09-09T04:50:04.802701370Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\" returns image reference \"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\""
Sep 9 04:50:04.803382 containerd[1540]: time="2025-09-09T04:50:04.803355009Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\""
Sep 9 04:50:05.875809 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount76890628.mount: Deactivated successfully.
Sep 9 04:50:06.129300 containerd[1540]: time="2025-09-09T04:50:06.128989310Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:06.129722 containerd[1540]: time="2025-09-09T04:50:06.129693677Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.4: active requests=0, bytes read=28199961"
Sep 9 04:50:06.130775 containerd[1540]: time="2025-09-09T04:50:06.130716672Z" level=info msg="ImageCreate event name:\"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:06.132690 containerd[1540]: time="2025-09-09T04:50:06.132656706Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:06.133162 containerd[1540]: time="2025-09-09T04:50:06.133130236Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.4\" with image id \"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\", repo tag \"registry.k8s.io/kube-proxy:v1.33.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\", size \"28198978\" in 1.329746055s"
Sep 9 04:50:06.133199 containerd[1540]: time="2025-09-09T04:50:06.133161816Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\" returns image reference \"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\""
Sep 9 04:50:06.133853 containerd[1540]: time="2025-09-09T04:50:06.133665129Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Sep 9 04:50:06.633870 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4006662257.mount: Deactivated successfully.
Sep 9 04:50:07.749209 containerd[1540]: time="2025-09-09T04:50:07.749151670Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:07.749705 containerd[1540]: time="2025-09-09T04:50:07.749668841Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152119"
Sep 9 04:50:07.750717 containerd[1540]: time="2025-09-09T04:50:07.750687689Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:07.754736 containerd[1540]: time="2025-09-09T04:50:07.754686649Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:07.755821 containerd[1540]: time="2025-09-09T04:50:07.755710808Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.622016813s"
Sep 9 04:50:07.755821 containerd[1540]: time="2025-09-09T04:50:07.755741517Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\""
Sep 9 04:50:07.756217 containerd[1540]: time="2025-09-09T04:50:07.756115289Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 9 04:50:08.188385 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1443387457.mount: Deactivated successfully.
Sep 9 04:50:08.195738 containerd[1540]: time="2025-09-09T04:50:08.195667987Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 04:50:08.197106 containerd[1540]: time="2025-09-09T04:50:08.197047320Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705"
Sep 9 04:50:08.197962 containerd[1540]: time="2025-09-09T04:50:08.197915044Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 04:50:08.203862 containerd[1540]: time="2025-09-09T04:50:08.203824757Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 04:50:08.206656 containerd[1540]: time="2025-09-09T04:50:08.205668287Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 449.524165ms"
Sep 9 04:50:08.206656 containerd[1540]: time="2025-09-09T04:50:08.205700120Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 9 04:50:08.207399 containerd[1540]: time="2025-09-09T04:50:08.207351533Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
Sep 9 04:50:08.647837 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2807351839.mount: Deactivated successfully.
Sep 9 04:50:11.388398 containerd[1540]: time="2025-09-09T04:50:11.388346310Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:11.389280 containerd[1540]: time="2025-09-09T04:50:11.389061366Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69465297"
Sep 9 04:50:11.390036 containerd[1540]: time="2025-09-09T04:50:11.389997444Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:11.392912 containerd[1540]: time="2025-09-09T04:50:11.392880287Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:11.394937 containerd[1540]: time="2025-09-09T04:50:11.394817460Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 3.187395786s"
Sep 9 04:50:11.394937 containerd[1540]: time="2025-09-09T04:50:11.394850228Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\""
Sep 9 04:50:13.958586 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 9 04:50:13.959982 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:50:14.133397 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:50:14.139504 (kubelet)[2214]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 04:50:14.174402 kubelet[2214]: E0909 04:50:14.174352 2214 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 04:50:14.177028 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 04:50:14.177152 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 04:50:14.177646 systemd[1]: kubelet.service: Consumed 133ms CPU time, 107.3M memory peak.
Sep 9 04:50:16.671173 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:50:16.671815 systemd[1]: kubelet.service: Consumed 133ms CPU time, 107.3M memory peak.
Sep 9 04:50:16.675317 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:50:16.693991 systemd[1]: Reload requested from client PID 2228 ('systemctl') (unit session-7.scope)...
Sep 9 04:50:16.694007 systemd[1]: Reloading...
Sep 9 04:50:16.777347 zram_generator::config[2272]: No configuration found.
Sep 9 04:50:16.983388 systemd[1]: Reloading finished in 289 ms.
Sep 9 04:50:17.041717 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 9 04:50:17.041796 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 9 04:50:17.042081 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:50:17.042129 systemd[1]: kubelet.service: Consumed 88ms CPU time, 95M memory peak.
Sep 9 04:50:17.043656 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:50:17.154267 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:50:17.157836 (kubelet)[2317]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 04:50:17.188848 kubelet[2317]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:50:17.188848 kubelet[2317]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 9 04:50:17.188848 kubelet[2317]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:50:17.189206 kubelet[2317]: I0909 04:50:17.188932 2317 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 04:50:17.792709 kubelet[2317]: I0909 04:50:17.792668 2317 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Sep 9 04:50:17.792709 kubelet[2317]: I0909 04:50:17.792696 2317 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 04:50:17.792965 kubelet[2317]: I0909 04:50:17.792938 2317 server.go:956] "Client rotation is on, will bootstrap in background"
Sep 9 04:50:17.815308 kubelet[2317]: E0909 04:50:17.815239 2317 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.33:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.33:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Sep 9 04:50:17.815448 kubelet[2317]: I0909 04:50:17.815334 2317 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 04:50:17.824299 kubelet[2317]: I0909 04:50:17.824266 2317 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 04:50:17.827117 kubelet[2317]: I0909 04:50:17.827075 2317 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 04:50:17.828645 kubelet[2317]: I0909 04:50:17.828140 2317 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 04:50:17.828645 kubelet[2317]: I0909 04:50:17.828177 2317 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 04:50:17.828645 kubelet[2317]: I0909 04:50:17.828419 2317 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 04:50:17.828645 kubelet[2317]: I0909 04:50:17.828427 2317 container_manager_linux.go:303] "Creating device plugin manager"
Sep 9 04:50:17.829274 kubelet[2317]: I0909 04:50:17.829241 2317 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:50:17.831760 kubelet[2317]: I0909 04:50:17.831743 2317 kubelet.go:480] "Attempting to sync node with API server"
Sep 9 04:50:17.831921 kubelet[2317]: I0909 04:50:17.831898 2317 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 04:50:17.831958 kubelet[2317]: I0909 04:50:17.831947 2317 kubelet.go:386] "Adding apiserver pod source"
Sep 9 04:50:17.833418 kubelet[2317]: I0909 04:50:17.833401 2317 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 04:50:17.834329 kubelet[2317]: I0909 04:50:17.834311 2317 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 04:50:17.834978 kubelet[2317]: I0909 04:50:17.834948 2317 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Sep 9 04:50:17.835087 kubelet[2317]: W0909 04:50:17.835067 2317 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 04:50:17.837512 kubelet[2317]: I0909 04:50:17.837489 2317 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 9 04:50:17.837579 kubelet[2317]: I0909 04:50:17.837525 2317 server.go:1289] "Started kubelet"
Sep 9 04:50:17.838286 kubelet[2317]: E0909 04:50:17.837959 2317 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.33:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.33:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Sep 9 04:50:17.838420 kubelet[2317]: E0909 04:50:17.838380 2317 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.33:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.33:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Sep 9 04:50:17.838582 kubelet[2317]: I0909 04:50:17.838543 2317 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 04:50:17.839573 kubelet[2317]: I0909 04:50:17.839175 2317 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 04:50:17.839573 kubelet[2317]: I0909 04:50:17.839539 2317 server.go:317] "Adding debug handlers to kubelet server"
Sep 9 04:50:17.840291 kubelet[2317]: I0909 04:50:17.840264 2317 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 04:50:17.842487 kubelet[2317]: I0909 04:50:17.842455 2317 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 9 04:50:17.842707 kubelet[2317]: E0909 04:50:17.842680 2317 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 04:50:17.843416 kubelet[2317]: E0909 04:50:17.842368 2317 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.33:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.33:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18638400a0d06b0b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-09 04:50:17.837505291 +0000 UTC m=+0.676547182,LastTimestamp:2025-09-09 04:50:17.837505291 +0000 UTC m=+0.676547182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 9 04:50:17.843627 kubelet[2317]: I0909 04:50:17.843579 2317 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 04:50:17.843872 kubelet[2317]: I0909 04:50:17.843854 2317 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 04:50:17.844098 kubelet[2317]: E0909 04:50:17.844072 2317 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.33:6443: connect: connection refused" interval="200ms"
Sep 9 04:50:17.845445 kubelet[2317]: E0909 04:50:17.845418 2317 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.33:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.33:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Sep 9 04:50:17.845647 kubelet[2317]: I0909 04:50:17.845617 2317 factory.go:223] Registration of the containerd container factory successfully
Sep 9 04:50:17.845647 kubelet[2317]: I0909 04:50:17.845639 2317 factory.go:223] Registration of the systemd container factory successfully
Sep 9 04:50:17.845745 kubelet[2317]: I0909 04:50:17.845721 2317 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 04:50:17.847079 kubelet[2317]: I0909 04:50:17.847055 2317 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 9 04:50:17.847141 kubelet[2317]: I0909 04:50:17.847109 2317 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 04:50:17.847381 kubelet[2317]: E0909 04:50:17.845614 2317 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 04:50:17.854758 kubelet[2317]: I0909 04:50:17.854732 2317 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 9 04:50:17.854758 kubelet[2317]: I0909 04:50:17.854749 2317 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 9 04:50:17.856039 kubelet[2317]: I0909 04:50:17.854773 2317 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:50:17.857903 kubelet[2317]: I0909 04:50:17.857779 2317 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Sep 9 04:50:17.858741 kubelet[2317]: I0909 04:50:17.858722 2317 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Sep 9 04:50:17.858803 kubelet[2317]: I0909 04:50:17.858748 2317 status_manager.go:230] "Starting to sync pod status with apiserver"
Sep 9 04:50:17.858803 kubelet[2317]: I0909 04:50:17.858768 2317 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 9 04:50:17.858803 kubelet[2317]: I0909 04:50:17.858775 2317 kubelet.go:2436] "Starting kubelet main sync loop"
Sep 9 04:50:17.858873 kubelet[2317]: E0909 04:50:17.858815 2317 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 04:50:17.860034 kubelet[2317]: E0909 04:50:17.859531 2317 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.33:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.33:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Sep 9 04:50:17.935927 kubelet[2317]: I0909 04:50:17.935877 2317 policy_none.go:49] "None policy: Start"
Sep 9 04:50:17.935927 kubelet[2317]: I0909 04:50:17.935913 2317 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 9 04:50:17.935927 kubelet[2317]: I0909 04:50:17.935925 2317 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 04:50:17.940585 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 9 04:50:17.942951 kubelet[2317]: E0909 04:50:17.942931 2317 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 04:50:17.954481 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 9 04:50:17.957154 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 9 04:50:17.958885 kubelet[2317]: E0909 04:50:17.958865 2317 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 9 04:50:17.974986 kubelet[2317]: E0909 04:50:17.974968 2317 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Sep 9 04:50:17.975227 kubelet[2317]: I0909 04:50:17.975155 2317 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 04:50:17.975227 kubelet[2317]: I0909 04:50:17.975173 2317 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 04:50:17.975448 kubelet[2317]: I0909 04:50:17.975431 2317 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 04:50:17.976373 kubelet[2317]: E0909 04:50:17.976351 2317 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 9 04:50:17.976462 kubelet[2317]: E0909 04:50:17.976400 2317 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Sep 9 04:50:18.045439 kubelet[2317]: E0909 04:50:18.045352 2317 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.33:6443: connect: connection refused" interval="400ms"
Sep 9 04:50:18.076864 kubelet[2317]: I0909 04:50:18.076829 2317 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 9 04:50:18.077513 kubelet[2317]: E0909 04:50:18.077469 2317 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.33:6443/api/v1/nodes\": dial tcp 10.0.0.33:6443: connect: connection refused" node="localhost"
Sep 9 04:50:18.169200 systemd[1]: Created slice kubepods-burstable-pod30b8b5e537aaaf735520c39c9ad06cef.slice - libcontainer container kubepods-burstable-pod30b8b5e537aaaf735520c39c9ad06cef.slice.
Sep 9 04:50:18.195629 kubelet[2317]: E0909 04:50:18.195527 2317 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:50:18.198756 systemd[1]: Created slice kubepods-burstable-pod8de7187202bee21b84740a213836f615.slice - libcontainer container kubepods-burstable-pod8de7187202bee21b84740a213836f615.slice.
Sep 9 04:50:18.200601 kubelet[2317]: E0909 04:50:18.200563 2317 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:50:18.201777 systemd[1]: Created slice kubepods-burstable-podd75e6f6978d9f275ea19380916c9cccd.slice - libcontainer container kubepods-burstable-podd75e6f6978d9f275ea19380916c9cccd.slice.
Sep 9 04:50:18.203192 kubelet[2317]: E0909 04:50:18.203154 2317 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:50:18.248613 kubelet[2317]: I0909 04:50:18.248582 2317 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/30b8b5e537aaaf735520c39c9ad06cef-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"30b8b5e537aaaf735520c39c9ad06cef\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:50:18.248691 kubelet[2317]: I0909 04:50:18.248619 2317 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:50:18.248691 kubelet[2317]: I0909 04:50:18.248640 2317 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:50:18.248691 kubelet[2317]: I0909 04:50:18.248675 2317 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:50:18.248755 kubelet[2317]: I0909 04:50:18.248707 2317 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:50:18.248755 kubelet[2317]: I0909 04:50:18.248737 2317 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d75e6f6978d9f275ea19380916c9cccd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d75e6f6978d9f275ea19380916c9cccd\") " pod="kube-system/kube-scheduler-localhost"
Sep 9 04:50:18.248895 kubelet[2317]: I0909 04:50:18.248796 2317 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/30b8b5e537aaaf735520c39c9ad06cef-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"30b8b5e537aaaf735520c39c9ad06cef\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:50:18.248895 kubelet[2317]: I0909 04:50:18.248847 2317 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/30b8b5e537aaaf735520c39c9ad06cef-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"30b8b5e537aaaf735520c39c9ad06cef\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:50:18.248895 kubelet[2317]: I0909 04:50:18.248866 2317 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:50:18.278839 kubelet[2317]: I0909 04:50:18.278805 2317 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 9 04:50:18.279147 kubelet[2317]: E0909 04:50:18.279103 2317 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.33:6443/api/v1/nodes\": dial tcp 10.0.0.33:6443: connect: connection refused" node="localhost"
Sep 9 04:50:18.445902 kubelet[2317]: E0909 04:50:18.445782 2317 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.33:6443: connect: connection refused" interval="800ms"
Sep 9 04:50:18.497056 containerd[1540]: time="2025-09-09T04:50:18.497015796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:30b8b5e537aaaf735520c39c9ad06cef,Namespace:kube-system,Attempt:0,}"
Sep 9 04:50:18.501697 containerd[1540]: time="2025-09-09T04:50:18.501587430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8de7187202bee21b84740a213836f615,Namespace:kube-system,Attempt:0,}"
Sep 9 04:50:18.504140 containerd[1540]: time="2025-09-09T04:50:18.504111335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d75e6f6978d9f275ea19380916c9cccd,Namespace:kube-system,Attempt:0,}"
Sep 9 04:50:18.529960 containerd[1540]: time="2025-09-09T04:50:18.529922764Z" level=info msg="connecting to shim d53807f937c134bba47a4ed7e46b0dc12ac91638081313031eb1452a91101217" address="unix:///run/containerd/s/2c9c67b7e5c4229ef3dae5c2a5951c2ab8cdeff4b65a3e0964a0ed6d0547e451" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:50:18.530789 containerd[1540]: time="2025-09-09T04:50:18.530755083Z" level=info msg="connecting to shim c0060fa8712f73ea4d67a99ce969a856bf01b89e7687da707c5dc1f0c2ce0d22" address="unix:///run/containerd/s/85f1cb32b04620f792637f72c36dd04f9b0e5e9b23f8d1156d1e2c9aaa1570fe" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:50:18.543985 containerd[1540]: time="2025-09-09T04:50:18.543942029Z" level=info msg="connecting to shim c5f0235b59cd9eb5164127a6e4e318cff26587ff9fa6825220f08c49f7014dc9" address="unix:///run/containerd/s/b0c2c28e8de069d9194e59a0887fc74cef5f6988df38c657e7e117c35810e9c3" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:50:18.561409 systemd[1]: Started cri-containerd-c0060fa8712f73ea4d67a99ce969a856bf01b89e7687da707c5dc1f0c2ce0d22.scope - libcontainer container c0060fa8712f73ea4d67a99ce969a856bf01b89e7687da707c5dc1f0c2ce0d22.
Sep 9 04:50:18.562465 systemd[1]: Started cri-containerd-d53807f937c134bba47a4ed7e46b0dc12ac91638081313031eb1452a91101217.scope - libcontainer container d53807f937c134bba47a4ed7e46b0dc12ac91638081313031eb1452a91101217.
Sep 9 04:50:18.584459 systemd[1]: Started cri-containerd-c5f0235b59cd9eb5164127a6e4e318cff26587ff9fa6825220f08c49f7014dc9.scope - libcontainer container c5f0235b59cd9eb5164127a6e4e318cff26587ff9fa6825220f08c49f7014dc9.
Sep 9 04:50:18.617305 containerd[1540]: time="2025-09-09T04:50:18.616834431Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:30b8b5e537aaaf735520c39c9ad06cef,Namespace:kube-system,Attempt:0,} returns sandbox id \"c0060fa8712f73ea4d67a99ce969a856bf01b89e7687da707c5dc1f0c2ce0d22\""
Sep 9 04:50:18.622880 containerd[1540]: time="2025-09-09T04:50:18.622840670Z" level=info msg="CreateContainer within sandbox \"c0060fa8712f73ea4d67a99ce969a856bf01b89e7687da707c5dc1f0c2ce0d22\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Sep 9 04:50:18.623724 containerd[1540]: time="2025-09-09T04:50:18.623698779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:8de7187202bee21b84740a213836f615,Namespace:kube-system,Attempt:0,} returns sandbox id \"d53807f937c134bba47a4ed7e46b0dc12ac91638081313031eb1452a91101217\""
Sep 9 04:50:18.628579 containerd[1540]: time="2025-09-09T04:50:18.628542868Z" level=info msg="CreateContainer within sandbox \"d53807f937c134bba47a4ed7e46b0dc12ac91638081313031eb1452a91101217\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Sep 9 04:50:18.632783 containerd[1540]: time="2025-09-09T04:50:18.632702741Z" level=info msg="Container 093ea01ea63b70e8cdae70e147a623f5fef9ea5ef93f08bfe4a0f16661c3fe5b: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:50:18.636825 containerd[1540]: time="2025-09-09T04:50:18.636321743Z" level=info msg="Container b61815473a9fbb0b254b51144f3e23c7bee3049c3d62286cf63d4ac6fc440c3b: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:50:18.637903 containerd[1540]: time="2025-09-09T04:50:18.637863107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d75e6f6978d9f275ea19380916c9cccd,Namespace:kube-system,Attempt:0,} returns sandbox id \"c5f0235b59cd9eb5164127a6e4e318cff26587ff9fa6825220f08c49f7014dc9\""
Sep 9 04:50:18.643292 containerd[1540]: time="2025-09-09T04:50:18.642519309Z" level=info msg="CreateContainer within sandbox \"c5f0235b59cd9eb5164127a6e4e318cff26587ff9fa6825220f08c49f7014dc9\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Sep 9 04:50:18.646142 containerd[1540]: time="2025-09-09T04:50:18.646059661Z" level=info msg="CreateContainer within sandbox \"c0060fa8712f73ea4d67a99ce969a856bf01b89e7687da707c5dc1f0c2ce0d22\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"093ea01ea63b70e8cdae70e147a623f5fef9ea5ef93f08bfe4a0f16661c3fe5b\""
Sep 9 04:50:18.646779 containerd[1540]: time="2025-09-09T04:50:18.646661948Z" level=info msg="StartContainer for \"093ea01ea63b70e8cdae70e147a623f5fef9ea5ef93f08bfe4a0f16661c3fe5b\""
Sep 9 04:50:18.648177 containerd[1540]: time="2025-09-09T04:50:18.648144816Z" level=info msg="connecting to shim 093ea01ea63b70e8cdae70e147a623f5fef9ea5ef93f08bfe4a0f16661c3fe5b" address="unix:///run/containerd/s/85f1cb32b04620f792637f72c36dd04f9b0e5e9b23f8d1156d1e2c9aaa1570fe" protocol=ttrpc version=3
Sep 9 04:50:18.648320 containerd[1540]: time="2025-09-09T04:50:18.648283202Z" level=info msg="CreateContainer within sandbox \"d53807f937c134bba47a4ed7e46b0dc12ac91638081313031eb1452a91101217\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b61815473a9fbb0b254b51144f3e23c7bee3049c3d62286cf63d4ac6fc440c3b\""
Sep 9 04:50:18.648786 containerd[1540]: time="2025-09-09T04:50:18.648755620Z" level=info msg="StartContainer for \"b61815473a9fbb0b254b51144f3e23c7bee3049c3d62286cf63d4ac6fc440c3b\""
Sep 9 04:50:18.649743 containerd[1540]: time="2025-09-09T04:50:18.649717928Z" level=info msg="connecting to shim b61815473a9fbb0b254b51144f3e23c7bee3049c3d62286cf63d4ac6fc440c3b" address="unix:///run/containerd/s/2c9c67b7e5c4229ef3dae5c2a5951c2ab8cdeff4b65a3e0964a0ed6d0547e451" protocol=ttrpc version=3
Sep 9 04:50:18.651962 containerd[1540]: time="2025-09-09T04:50:18.651902724Z" level=info msg="Container 3dd8210888d2a575cbd3dbfbaacf434fbd513d8d7c8678d5935ae0c6827fa70a: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:50:18.660117 containerd[1540]: time="2025-09-09T04:50:18.660064571Z" level=info msg="CreateContainer within sandbox \"c5f0235b59cd9eb5164127a6e4e318cff26587ff9fa6825220f08c49f7014dc9\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3dd8210888d2a575cbd3dbfbaacf434fbd513d8d7c8678d5935ae0c6827fa70a\""
Sep 9 04:50:18.660750 containerd[1540]: time="2025-09-09T04:50:18.660724076Z" level=info msg="StartContainer for \"3dd8210888d2a575cbd3dbfbaacf434fbd513d8d7c8678d5935ae0c6827fa70a\""
Sep 9 04:50:18.662217 containerd[1540]: time="2025-09-09T04:50:18.661854879Z" level=info msg="connecting to shim 3dd8210888d2a575cbd3dbfbaacf434fbd513d8d7c8678d5935ae0c6827fa70a" address="unix:///run/containerd/s/b0c2c28e8de069d9194e59a0887fc74cef5f6988df38c657e7e117c35810e9c3" protocol=ttrpc version=3
Sep 9 04:50:18.671402 systemd[1]: Started cri-containerd-093ea01ea63b70e8cdae70e147a623f5fef9ea5ef93f08bfe4a0f16661c3fe5b.scope - libcontainer container 093ea01ea63b70e8cdae70e147a623f5fef9ea5ef93f08bfe4a0f16661c3fe5b.
Sep 9 04:50:18.674867 systemd[1]: Started cri-containerd-b61815473a9fbb0b254b51144f3e23c7bee3049c3d62286cf63d4ac6fc440c3b.scope - libcontainer container b61815473a9fbb0b254b51144f3e23c7bee3049c3d62286cf63d4ac6fc440c3b.
Sep 9 04:50:18.681136 kubelet[2317]: I0909 04:50:18.681110 2317 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 9 04:50:18.681625 kubelet[2317]: E0909 04:50:18.681597 2317 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.33:6443/api/v1/nodes\": dial tcp 10.0.0.33:6443: connect: connection refused" node="localhost"
Sep 9 04:50:18.687420 systemd[1]: Started cri-containerd-3dd8210888d2a575cbd3dbfbaacf434fbd513d8d7c8678d5935ae0c6827fa70a.scope - libcontainer container 3dd8210888d2a575cbd3dbfbaacf434fbd513d8d7c8678d5935ae0c6827fa70a.
Sep 9 04:50:18.728287 containerd[1540]: time="2025-09-09T04:50:18.728087414Z" level=info msg="StartContainer for \"093ea01ea63b70e8cdae70e147a623f5fef9ea5ef93f08bfe4a0f16661c3fe5b\" returns successfully"
Sep 9 04:50:18.738169 containerd[1540]: time="2025-09-09T04:50:18.738130974Z" level=info msg="StartContainer for \"b61815473a9fbb0b254b51144f3e23c7bee3049c3d62286cf63d4ac6fc440c3b\" returns successfully"
Sep 9 04:50:18.746622 containerd[1540]: time="2025-09-09T04:50:18.746584669Z" level=info msg="StartContainer for \"3dd8210888d2a575cbd3dbfbaacf434fbd513d8d7c8678d5935ae0c6827fa70a\" returns successfully"
Sep 9 04:50:18.868289 kubelet[2317]: E0909 04:50:18.868259 2317 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:50:18.868885 kubelet[2317]: E0909 04:50:18.868868 2317 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:50:18.869915 kubelet[2317]: E0909 04:50:18.869891 2317 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:50:19.482641 kubelet[2317]: I0909 04:50:19.482600 2317 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 9 04:50:19.871664 kubelet[2317]: E0909 04:50:19.871576 2317 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:50:19.872172 kubelet[2317]: E0909 04:50:19.872152 2317 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Sep 9 04:50:21.287610 kubelet[2317]: E0909 04:50:21.287569 2317 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Sep 9 04:50:21.385324 kubelet[2317]: I0909 04:50:21.385280 2317 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
Sep 9 04:50:21.443724 kubelet[2317]: I0909 04:50:21.443611 2317 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Sep 9 04:50:21.450534 kubelet[2317]: E0909 04:50:21.450496 2317 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost"
Sep 9 04:50:21.450534 kubelet[2317]: I0909 04:50:21.450527 2317 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:50:21.454673 kubelet[2317]: E0909 04:50:21.454395 2317 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:50:21.454673 kubelet[2317]: I0909 04:50:21.454555 2317 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Sep 9 04:50:21.457142 kubelet[2317]: E0909 04:50:21.457114 2317 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost"
Sep 9 04:50:21.836734 kubelet[2317]: I0909 04:50:21.836700 2317 apiserver.go:52] "Watching apiserver"
Sep 9 04:50:21.847434 kubelet[2317]: I0909 04:50:21.847400 2317 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 9 04:50:23.096545 systemd[1]: Reload requested from client PID 2609 ('systemctl') (unit session-7.scope)...
Sep 9 04:50:23.096560 systemd[1]: Reloading...
Sep 9 04:50:23.165356 zram_generator::config[2655]: No configuration found.
Sep 9 04:50:23.320603 systemd[1]: Reloading finished in 223 ms.
Sep 9 04:50:23.351476 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:50:23.368543 systemd[1]: kubelet.service: Deactivated successfully.
Sep 9 04:50:23.368862 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:50:23.368930 systemd[1]: kubelet.service: Consumed 1.032s CPU time, 127.7M memory peak.
Sep 9 04:50:23.370480 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 04:50:23.497116 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 04:50:23.501883 (kubelet)[2694]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 04:50:23.548340 kubelet[2694]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:50:23.548340 kubelet[2694]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 9 04:50:23.548340 kubelet[2694]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 04:50:23.548665 kubelet[2694]: I0909 04:50:23.548382 2694 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 04:50:23.554570 kubelet[2694]: I0909 04:50:23.554537 2694 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Sep 9 04:50:23.554570 kubelet[2694]: I0909 04:50:23.554563 2694 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 04:50:23.554778 kubelet[2694]: I0909 04:50:23.554759 2694 server.go:956] "Client rotation is on, will bootstrap in background"
Sep 9 04:50:23.556009 kubelet[2694]: I0909 04:50:23.555980 2694 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Sep 9 04:50:23.558253 kubelet[2694]: I0909 04:50:23.558214 2694 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 04:50:23.563087 kubelet[2694]: I0909 04:50:23.563061 2694 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 04:50:23.565884 kubelet[2694]: I0909 04:50:23.565832 2694 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 04:50:23.566069 kubelet[2694]: I0909 04:50:23.566043 2694 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 04:50:23.566211 kubelet[2694]: I0909 04:50:23.566067 2694 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 04:50:23.566319 kubelet[2694]: I0909 04:50:23.566225 2694 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 04:50:23.566319 kubelet[2694]: I0909 04:50:23.566243 2694 container_manager_linux.go:303] "Creating device plugin manager"
Sep 9 04:50:23.566319 kubelet[2694]: I0909 04:50:23.566312 2694 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:50:23.566990 kubelet[2694]: I0909 04:50:23.566459 2694 kubelet.go:480] "Attempting to sync node with API server"
Sep 9 04:50:23.566990 kubelet[2694]: I0909 04:50:23.566478 2694 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 04:50:23.566990 kubelet[2694]: I0909 04:50:23.566502 2694 kubelet.go:386] "Adding apiserver pod source"
Sep 9 04:50:23.566990 kubelet[2694]: I0909 04:50:23.566516 2694 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 04:50:23.567509 kubelet[2694]: I0909 04:50:23.567491 2694 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 04:50:23.571981 kubelet[2694]: I0909 04:50:23.571952 2694 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Sep 9 04:50:23.574427 kubelet[2694]: I0909 04:50:23.574395 2694 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 9 04:50:23.574491 kubelet[2694]: I0909 04:50:23.574434 2694 server.go:1289] "Started kubelet"
Sep 9 04:50:23.574743 kubelet[2694]: I0909 04:50:23.574697 2694 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 04:50:23.576510 kubelet[2694]: I0909 04:50:23.576490 2694 server.go:317] "Adding debug handlers to kubelet server"
Sep 9 04:50:23.577344 kubelet[2694]: I0909 04:50:23.574702 2694 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 04:50:23.580208 kubelet[2694]: I0909 04:50:23.580187 2694 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 04:50:23.580332 kubelet[2694]: E0909 04:50:23.579641 2694 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 04:50:23.580888 kubelet[2694]: I0909 04:50:23.580574 2694 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 04:50:23.581546 kubelet[2694]: I0909 04:50:23.581519 2694 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 04:50:23.587055 kubelet[2694]: E0909 04:50:23.587026 2694 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 9 04:50:23.587295 kubelet[2694]: I0909 04:50:23.587191 2694 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 9 04:50:23.587581 kubelet[2694]: I0909 04:50:23.587558 2694 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 9 04:50:23.587926 kubelet[2694]: I0909 04:50:23.587910 2694 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 04:50:23.590027 kubelet[2694]: I0909 04:50:23.589544 2694 factory.go:223] Registration of the systemd container factory successfully
Sep 9 04:50:23.590027 kubelet[2694]: I0909 04:50:23.589640 2694 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 04:50:23.591898 kubelet[2694]: I0909 04:50:23.591835 2694 factory.go:223] Registration of the containerd container factory successfully
Sep 9 04:50:23.599784 kubelet[2694]: I0909 04:50:23.599753 2694 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Sep 9 04:50:23.602845 kubelet[2694]: I0909 04:50:23.602160 2694 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Sep 9 04:50:23.603083 kubelet[2694]: I0909 04:50:23.602963 2694 status_manager.go:230] "Starting to sync pod status with apiserver"
Sep 9 04:50:23.603083 kubelet[2694]: I0909 04:50:23.602996 2694 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 9 04:50:23.603083 kubelet[2694]: I0909 04:50:23.603004 2694 kubelet.go:2436] "Starting kubelet main sync loop"
Sep 9 04:50:23.603083 kubelet[2694]: E0909 04:50:23.603057 2694 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 04:50:23.628470 kubelet[2694]: I0909 04:50:23.628449 2694 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 9 04:50:23.628470 kubelet[2694]: I0909 04:50:23.628463 2694 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 9 04:50:23.628583 kubelet[2694]: I0909 04:50:23.628482 2694 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 04:50:23.628612 kubelet[2694]: I0909 04:50:23.628594 2694 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 9 04:50:23.628612 kubelet[2694]: I0909 04:50:23.628603 2694 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 9 04:50:23.628652 kubelet[2694]: I0909 04:50:23.628619 2694 policy_none.go:49] "None policy: Start"
Sep 9 04:50:23.628652 kubelet[2694]: I0909 04:50:23.628627 2694 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 9 04:50:23.628652 kubelet[2694]: I0909 04:50:23.628635 2694 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 04:50:23.628737 kubelet[2694]: I0909 04:50:23.628710 2694 state_mem.go:75] "Updated machine memory state"
Sep 9 04:50:23.632294 kubelet[2694]: E0909 04:50:23.632265 2694 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Sep 9 04:50:23.632429 kubelet[2694]: I0909 04:50:23.632412 2694 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 04:50:23.632458 kubelet[2694]: I0909 04:50:23.632429 2694 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 04:50:23.632614 kubelet[2694]: I0909 04:50:23.632595 2694 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 04:50:23.633334 kubelet[2694]: E0909 04:50:23.633308 2694 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 9 04:50:23.704657 kubelet[2694]: I0909 04:50:23.704613 2694 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Sep 9 04:50:23.704758 kubelet[2694]: I0909 04:50:23.704669 2694 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:50:23.704800 kubelet[2694]: I0909 04:50:23.704777 2694 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Sep 9 04:50:23.734488 kubelet[2694]: I0909 04:50:23.734460 2694 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 9 04:50:23.741101 kubelet[2694]: I0909 04:50:23.741077 2694 kubelet_node_status.go:124] "Node was previously registered" node="localhost"
Sep 9 04:50:23.741183 kubelet[2694]: I0909 04:50:23.741147 2694 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
Sep 9 04:50:23.788937 kubelet[2694]: I0909 04:50:23.788885 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:50:23.788937 kubelet[2694]: I0909 04:50:23.788926 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:50:23.788937 kubelet[2694]: I0909 04:50:23.788956 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:50:23.789131 kubelet[2694]: I0909 04:50:23.789006 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost"
Sep 9 04:50:23.789131 kubelet[2694]: I0909 04:50:23.789043 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/30b8b5e537aaaf735520c39c9ad06cef-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"30b8b5e537aaaf735520c39c9ad06cef\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:50:23.789131 kubelet[2694]: I0909 04:50:23.789071 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/30b8b5e537aaaf735520c39c9ad06cef-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"30b8b5e537aaaf735520c39c9ad06cef\") " pod="kube-system/kube-apiserver-localhost"
Sep 9 04:50:23.789131 kubelet[2694]: I0909 04:50:23.789085 2694 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/30b8b5e537aaaf735520c39c9ad06cef-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"30b8b5e537aaaf735520c39c9ad06cef\") " pod="kube-system/kube-apiserver-localhost" Sep 9 04:50:23.789131 kubelet[2694]: I0909 04:50:23.789101 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8de7187202bee21b84740a213836f615-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"8de7187202bee21b84740a213836f615\") " pod="kube-system/kube-controller-manager-localhost" Sep 9 04:50:23.789230 kubelet[2694]: I0909 04:50:23.789118 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d75e6f6978d9f275ea19380916c9cccd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d75e6f6978d9f275ea19380916c9cccd\") " pod="kube-system/kube-scheduler-localhost" Sep 9 04:50:24.567821 kubelet[2694]: I0909 04:50:24.567762 2694 apiserver.go:52] "Watching apiserver" Sep 9 04:50:24.588879 kubelet[2694]: I0909 04:50:24.588450 2694 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 04:50:24.618779 kubelet[2694]: I0909 04:50:24.618743 2694 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 9 04:50:24.623457 kubelet[2694]: E0909 04:50:24.623422 2694 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 9 04:50:24.643555 kubelet[2694]: I0909 04:50:24.643494 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.6434786620000001 
podStartE2EDuration="1.643478662s" podCreationTimestamp="2025-09-09 04:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:50:24.635945149 +0000 UTC m=+1.128221270" watchObservedRunningTime="2025-09-09 04:50:24.643478662 +0000 UTC m=+1.135754783" Sep 9 04:50:24.655263 kubelet[2694]: I0909 04:50:24.655194 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.655178452 podStartE2EDuration="1.655178452s" podCreationTimestamp="2025-09-09 04:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:50:24.644388101 +0000 UTC m=+1.136664222" watchObservedRunningTime="2025-09-09 04:50:24.655178452 +0000 UTC m=+1.147454573" Sep 9 04:50:24.655395 kubelet[2694]: I0909 04:50:24.655322 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.655316772 podStartE2EDuration="1.655316772s" podCreationTimestamp="2025-09-09 04:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:50:24.653742854 +0000 UTC m=+1.146018975" watchObservedRunningTime="2025-09-09 04:50:24.655316772 +0000 UTC m=+1.147592893" Sep 9 04:50:30.726986 kubelet[2694]: I0909 04:50:30.726952 2694 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 04:50:30.727355 containerd[1540]: time="2025-09-09T04:50:30.727300409Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Sep 9 04:50:30.727547 kubelet[2694]: I0909 04:50:30.727525 2694 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 9 04:50:31.844133 systemd[1]: Created slice kubepods-besteffort-pod7f1e049e_d9b0_4f9a_9441_7ee17754e157.slice - libcontainer container kubepods-besteffort-pod7f1e049e_d9b0_4f9a_9441_7ee17754e157.slice.
Sep 9 04:50:31.940907 kubelet[2694]: I0909 04:50:31.940850 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f1e049e-d9b0-4f9a-9441-7ee17754e157-lib-modules\") pod \"kube-proxy-ftgfj\" (UID: \"7f1e049e-d9b0-4f9a-9441-7ee17754e157\") " pod="kube-system/kube-proxy-ftgfj"
Sep 9 04:50:31.941359 kubelet[2694]: I0909 04:50:31.940967 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnrrh\" (UniqueName: \"kubernetes.io/projected/7f1e049e-d9b0-4f9a-9441-7ee17754e157-kube-api-access-lnrrh\") pod \"kube-proxy-ftgfj\" (UID: \"7f1e049e-d9b0-4f9a-9441-7ee17754e157\") " pod="kube-system/kube-proxy-ftgfj"
Sep 9 04:50:31.941359 kubelet[2694]: I0909 04:50:31.941005 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/7f1e049e-d9b0-4f9a-9441-7ee17754e157-kube-proxy\") pod \"kube-proxy-ftgfj\" (UID: \"7f1e049e-d9b0-4f9a-9441-7ee17754e157\") " pod="kube-system/kube-proxy-ftgfj"
Sep 9 04:50:31.941359 kubelet[2694]: I0909 04:50:31.941023 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7f1e049e-d9b0-4f9a-9441-7ee17754e157-xtables-lock\") pod \"kube-proxy-ftgfj\" (UID: \"7f1e049e-d9b0-4f9a-9441-7ee17754e157\") " pod="kube-system/kube-proxy-ftgfj"
Sep 9 04:50:31.955616 systemd[1]: Created slice kubepods-besteffort-pod993de5aa_065f_4447_9290_007f731d4419.slice - libcontainer container kubepods-besteffort-pod993de5aa_065f_4447_9290_007f731d4419.slice.
Sep 9 04:50:32.042024 kubelet[2694]: I0909 04:50:32.041396 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/993de5aa-065f-4447-9290-007f731d4419-var-lib-calico\") pod \"tigera-operator-755d956888-cmdtl\" (UID: \"993de5aa-065f-4447-9290-007f731d4419\") " pod="tigera-operator/tigera-operator-755d956888-cmdtl"
Sep 9 04:50:32.042024 kubelet[2694]: I0909 04:50:32.041446 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mk9s\" (UniqueName: \"kubernetes.io/projected/993de5aa-065f-4447-9290-007f731d4419-kube-api-access-8mk9s\") pod \"tigera-operator-755d956888-cmdtl\" (UID: \"993de5aa-065f-4447-9290-007f731d4419\") " pod="tigera-operator/tigera-operator-755d956888-cmdtl"
Sep 9 04:50:32.160152 containerd[1540]: time="2025-09-09T04:50:32.159971308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ftgfj,Uid:7f1e049e-d9b0-4f9a-9441-7ee17754e157,Namespace:kube-system,Attempt:0,}"
Sep 9 04:50:32.174707 containerd[1540]: time="2025-09-09T04:50:32.174667900Z" level=info msg="connecting to shim 3c9ebc48490e5f976f6f64b5826c57b830760ed271d5f1c1f285bb6e0b825676" address="unix:///run/containerd/s/12080203fc6bda0f1de5a0ded4d9d02e23b5d4e81be671a4281e3212820e33ca" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:50:32.203436 systemd[1]: Started cri-containerd-3c9ebc48490e5f976f6f64b5826c57b830760ed271d5f1c1f285bb6e0b825676.scope - libcontainer container 3c9ebc48490e5f976f6f64b5826c57b830760ed271d5f1c1f285bb6e0b825676.
Sep 9 04:50:32.224020 containerd[1540]: time="2025-09-09T04:50:32.223972594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ftgfj,Uid:7f1e049e-d9b0-4f9a-9441-7ee17754e157,Namespace:kube-system,Attempt:0,} returns sandbox id \"3c9ebc48490e5f976f6f64b5826c57b830760ed271d5f1c1f285bb6e0b825676\""
Sep 9 04:50:32.230896 containerd[1540]: time="2025-09-09T04:50:32.230094510Z" level=info msg="CreateContainer within sandbox \"3c9ebc48490e5f976f6f64b5826c57b830760ed271d5f1c1f285bb6e0b825676\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 9 04:50:32.241958 containerd[1540]: time="2025-09-09T04:50:32.241376784Z" level=info msg="Container a7c316527708b68c528d54b0a63e030554d710f22b0f9f585fc9f7bf8001ac6e: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:50:32.248675 containerd[1540]: time="2025-09-09T04:50:32.248634700Z" level=info msg="CreateContainer within sandbox \"3c9ebc48490e5f976f6f64b5826c57b830760ed271d5f1c1f285bb6e0b825676\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"a7c316527708b68c528d54b0a63e030554d710f22b0f9f585fc9f7bf8001ac6e\""
Sep 9 04:50:32.249532 containerd[1540]: time="2025-09-09T04:50:32.249469780Z" level=info msg="StartContainer for \"a7c316527708b68c528d54b0a63e030554d710f22b0f9f585fc9f7bf8001ac6e\""
Sep 9 04:50:32.251098 containerd[1540]: time="2025-09-09T04:50:32.251035059Z" level=info msg="connecting to shim a7c316527708b68c528d54b0a63e030554d710f22b0f9f585fc9f7bf8001ac6e" address="unix:///run/containerd/s/12080203fc6bda0f1de5a0ded4d9d02e23b5d4e81be671a4281e3212820e33ca" protocol=ttrpc version=3
Sep 9 04:50:32.258969 containerd[1540]: time="2025-09-09T04:50:32.258911975Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-cmdtl,Uid:993de5aa-065f-4447-9290-007f731d4419,Namespace:tigera-operator,Attempt:0,}"
Sep 9 04:50:32.268399 systemd[1]: Started cri-containerd-a7c316527708b68c528d54b0a63e030554d710f22b0f9f585fc9f7bf8001ac6e.scope - libcontainer container a7c316527708b68c528d54b0a63e030554d710f22b0f9f585fc9f7bf8001ac6e.
Sep 9 04:50:32.277795 containerd[1540]: time="2025-09-09T04:50:32.277748405Z" level=info msg="connecting to shim 4b935654cf10b90c2de93c99a9d770d69ceda0b5555db2f695879acb35e0eb84" address="unix:///run/containerd/s/e5cdc68d73257de396a2223e8fba028a421cd3824245793023d453911d1836f3" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:50:32.300444 systemd[1]: Started cri-containerd-4b935654cf10b90c2de93c99a9d770d69ceda0b5555db2f695879acb35e0eb84.scope - libcontainer container 4b935654cf10b90c2de93c99a9d770d69ceda0b5555db2f695879acb35e0eb84.
Sep 9 04:50:32.308454 containerd[1540]: time="2025-09-09T04:50:32.308393188Z" level=info msg="StartContainer for \"a7c316527708b68c528d54b0a63e030554d710f22b0f9f585fc9f7bf8001ac6e\" returns successfully"
Sep 9 04:50:32.334991 containerd[1540]: time="2025-09-09T04:50:32.334948094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-cmdtl,Uid:993de5aa-065f-4447-9290-007f731d4419,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4b935654cf10b90c2de93c99a9d770d69ceda0b5555db2f695879acb35e0eb84\""
Sep 9 04:50:32.336898 containerd[1540]: time="2025-09-09T04:50:32.336861133Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 9 04:50:33.583428 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount42511745.mount: Deactivated successfully.
Sep 9 04:50:34.190501 containerd[1540]: time="2025-09-09T04:50:34.190452768Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:34.191170 containerd[1540]: time="2025-09-09T04:50:34.191144328Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365"
Sep 9 04:50:34.191917 containerd[1540]: time="2025-09-09T04:50:34.191894928Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:34.194182 containerd[1540]: time="2025-09-09T04:50:34.194129247Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:34.194833 containerd[1540]: time="2025-09-09T04:50:34.194797926Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 1.857900193s"
Sep 9 04:50:34.194890 containerd[1540]: time="2025-09-09T04:50:34.194830846Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\""
Sep 9 04:50:34.199750 containerd[1540]: time="2025-09-09T04:50:34.199721244Z" level=info msg="CreateContainer within sandbox \"4b935654cf10b90c2de93c99a9d770d69ceda0b5555db2f695879acb35e0eb84\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 9 04:50:34.205658 containerd[1540]: time="2025-09-09T04:50:34.205631801Z" level=info msg="Container 34378fdfce10c1d5c2889e0a02a80417a7be193972d2db0d0e3bea14cbf397c1: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:50:34.213067 containerd[1540]: time="2025-09-09T04:50:34.213010957Z" level=info msg="CreateContainer within sandbox \"4b935654cf10b90c2de93c99a9d770d69ceda0b5555db2f695879acb35e0eb84\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"34378fdfce10c1d5c2889e0a02a80417a7be193972d2db0d0e3bea14cbf397c1\""
Sep 9 04:50:34.213424 containerd[1540]: time="2025-09-09T04:50:34.213395997Z" level=info msg="StartContainer for \"34378fdfce10c1d5c2889e0a02a80417a7be193972d2db0d0e3bea14cbf397c1\""
Sep 9 04:50:34.214193 containerd[1540]: time="2025-09-09T04:50:34.214164237Z" level=info msg="connecting to shim 34378fdfce10c1d5c2889e0a02a80417a7be193972d2db0d0e3bea14cbf397c1" address="unix:///run/containerd/s/e5cdc68d73257de396a2223e8fba028a421cd3824245793023d453911d1836f3" protocol=ttrpc version=3
Sep 9 04:50:34.234421 systemd[1]: Started cri-containerd-34378fdfce10c1d5c2889e0a02a80417a7be193972d2db0d0e3bea14cbf397c1.scope - libcontainer container 34378fdfce10c1d5c2889e0a02a80417a7be193972d2db0d0e3bea14cbf397c1.
Sep 9 04:50:34.260147 containerd[1540]: time="2025-09-09T04:50:34.260113454Z" level=info msg="StartContainer for \"34378fdfce10c1d5c2889e0a02a80417a7be193972d2db0d0e3bea14cbf397c1\" returns successfully"
Sep 9 04:50:34.520039 kubelet[2694]: I0909 04:50:34.519757 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-ftgfj" podStartSLOduration=3.519742768 podStartE2EDuration="3.519742768s" podCreationTimestamp="2025-09-09 04:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:50:32.643297807 +0000 UTC m=+9.135573968" watchObservedRunningTime="2025-09-09 04:50:34.519742768 +0000 UTC m=+11.012018849"
Sep 9 04:50:34.650785 kubelet[2694]: I0909 04:50:34.650728 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-cmdtl" podStartSLOduration=1.790915151 podStartE2EDuration="3.650712504s" podCreationTimestamp="2025-09-09 04:50:31 +0000 UTC" firstStartedPulling="2025-09-09 04:50:32.336092213 +0000 UTC m=+8.828368334" lastFinishedPulling="2025-09-09 04:50:34.195889566 +0000 UTC m=+10.688165687" observedRunningTime="2025-09-09 04:50:34.650468384 +0000 UTC m=+11.142744505" watchObservedRunningTime="2025-09-09 04:50:34.650712504 +0000 UTC m=+11.142988625"
Sep 9 04:50:35.546270 update_engine[1516]: I20250909 04:50:35.545301 1516 update_attempter.cc:509] Updating boot flags...
Sep 9 04:50:39.415394 sudo[1744]: pam_unix(sudo:session): session closed for user root
Sep 9 04:50:39.416518 sshd[1743]: Connection closed by 10.0.0.1 port 51884
Sep 9 04:50:39.416998 sshd-session[1740]: pam_unix(sshd:session): session closed for user core
Sep 9 04:50:39.420892 systemd[1]: sshd@6-10.0.0.33:22-10.0.0.1:51884.service: Deactivated successfully.
Sep 9 04:50:39.423332 systemd[1]: session-7.scope: Deactivated successfully.
Sep 9 04:50:39.423576 systemd[1]: session-7.scope: Consumed 6.920s CPU time, 230.5M memory peak.
Sep 9 04:50:39.425630 systemd-logind[1507]: Session 7 logged out. Waiting for processes to exit.
Sep 9 04:50:39.426899 systemd-logind[1507]: Removed session 7.
Sep 9 04:50:44.195329 systemd[1]: Created slice kubepods-besteffort-podce665adb_b757_491f_92e8_8abf3235f5e4.slice - libcontainer container kubepods-besteffort-podce665adb_b757_491f_92e8_8abf3235f5e4.slice.
Sep 9 04:50:44.232960 kubelet[2694]: I0909 04:50:44.232624 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce665adb-b757-491f-92e8-8abf3235f5e4-tigera-ca-bundle\") pod \"calico-typha-7d9584ccfc-mhnrh\" (UID: \"ce665adb-b757-491f-92e8-8abf3235f5e4\") " pod="calico-system/calico-typha-7d9584ccfc-mhnrh"
Sep 9 04:50:44.232960 kubelet[2694]: I0909 04:50:44.232877 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/ce665adb-b757-491f-92e8-8abf3235f5e4-typha-certs\") pod \"calico-typha-7d9584ccfc-mhnrh\" (UID: \"ce665adb-b757-491f-92e8-8abf3235f5e4\") " pod="calico-system/calico-typha-7d9584ccfc-mhnrh"
Sep 9 04:50:44.232960 kubelet[2694]: I0909 04:50:44.232899 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqp5c\" (UniqueName: \"kubernetes.io/projected/ce665adb-b757-491f-92e8-8abf3235f5e4-kube-api-access-nqp5c\") pod \"calico-typha-7d9584ccfc-mhnrh\" (UID: \"ce665adb-b757-491f-92e8-8abf3235f5e4\") " pod="calico-system/calico-typha-7d9584ccfc-mhnrh"
Sep 9 04:50:44.398283 systemd[1]: Created slice kubepods-besteffort-pod0b17e656_0b72_41c1_890c_69c293ed9961.slice - libcontainer container kubepods-besteffort-pod0b17e656_0b72_41c1_890c_69c293ed9961.slice.
Sep 9 04:50:44.434363 kubelet[2694]: I0909 04:50:44.434313 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/0b17e656-0b72-41c1-890c-69c293ed9961-var-run-calico\") pod \"calico-node-jkbsz\" (UID: \"0b17e656-0b72-41c1-890c-69c293ed9961\") " pod="calico-system/calico-node-jkbsz"
Sep 9 04:50:44.434363 kubelet[2694]: I0909 04:50:44.434365 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/0b17e656-0b72-41c1-890c-69c293ed9961-cni-net-dir\") pod \"calico-node-jkbsz\" (UID: \"0b17e656-0b72-41c1-890c-69c293ed9961\") " pod="calico-system/calico-node-jkbsz"
Sep 9 04:50:44.434537 kubelet[2694]: I0909 04:50:44.434385 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/0b17e656-0b72-41c1-890c-69c293ed9961-flexvol-driver-host\") pod \"calico-node-jkbsz\" (UID: \"0b17e656-0b72-41c1-890c-69c293ed9961\") " pod="calico-system/calico-node-jkbsz"
Sep 9 04:50:44.434537 kubelet[2694]: I0909 04:50:44.434401 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0b17e656-0b72-41c1-890c-69c293ed9961-xtables-lock\") pod \"calico-node-jkbsz\" (UID: \"0b17e656-0b72-41c1-890c-69c293ed9961\") " pod="calico-system/calico-node-jkbsz"
Sep 9 04:50:44.434537 kubelet[2694]: I0909 04:50:44.434417 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msb6v\" (UniqueName: \"kubernetes.io/projected/0b17e656-0b72-41c1-890c-69c293ed9961-kube-api-access-msb6v\") pod \"calico-node-jkbsz\" (UID: \"0b17e656-0b72-41c1-890c-69c293ed9961\") " pod="calico-system/calico-node-jkbsz"
Sep 9 04:50:44.434537 kubelet[2694]: I0909 04:50:44.434435 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b17e656-0b72-41c1-890c-69c293ed9961-lib-modules\") pod \"calico-node-jkbsz\" (UID: \"0b17e656-0b72-41c1-890c-69c293ed9961\") " pod="calico-system/calico-node-jkbsz"
Sep 9 04:50:44.434537 kubelet[2694]: I0909 04:50:44.434452 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/0b17e656-0b72-41c1-890c-69c293ed9961-node-certs\") pod \"calico-node-jkbsz\" (UID: \"0b17e656-0b72-41c1-890c-69c293ed9961\") " pod="calico-system/calico-node-jkbsz"
Sep 9 04:50:44.434647 kubelet[2694]: I0909 04:50:44.434466 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/0b17e656-0b72-41c1-890c-69c293ed9961-policysync\") pod \"calico-node-jkbsz\" (UID: \"0b17e656-0b72-41c1-890c-69c293ed9961\") " pod="calico-system/calico-node-jkbsz"
Sep 9 04:50:44.434647 kubelet[2694]: I0909 04:50:44.434480 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b17e656-0b72-41c1-890c-69c293ed9961-tigera-ca-bundle\") pod \"calico-node-jkbsz\" (UID: \"0b17e656-0b72-41c1-890c-69c293ed9961\") " pod="calico-system/calico-node-jkbsz"
Sep 9 04:50:44.434647 kubelet[2694]: I0909 04:50:44.434495 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0b17e656-0b72-41c1-890c-69c293ed9961-var-lib-calico\") pod \"calico-node-jkbsz\" (UID: \"0b17e656-0b72-41c1-890c-69c293ed9961\") " pod="calico-system/calico-node-jkbsz"
Sep 9 04:50:44.434647 kubelet[2694]: I0909 04:50:44.434516 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/0b17e656-0b72-41c1-890c-69c293ed9961-cni-log-dir\") pod \"calico-node-jkbsz\" (UID: \"0b17e656-0b72-41c1-890c-69c293ed9961\") " pod="calico-system/calico-node-jkbsz"
Sep 9 04:50:44.434647 kubelet[2694]: I0909 04:50:44.434531 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/0b17e656-0b72-41c1-890c-69c293ed9961-cni-bin-dir\") pod \"calico-node-jkbsz\" (UID: \"0b17e656-0b72-41c1-890c-69c293ed9961\") " pod="calico-system/calico-node-jkbsz"
Sep 9 04:50:44.499681 containerd[1540]: time="2025-09-09T04:50:44.499556488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7d9584ccfc-mhnrh,Uid:ce665adb-b757-491f-92e8-8abf3235f5e4,Namespace:calico-system,Attempt:0,}"
Sep 9 04:50:44.537189 kubelet[2694]: E0909 04:50:44.537138 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.537189 kubelet[2694]: W0909 04:50:44.537173 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.545358 containerd[1540]: time="2025-09-09T04:50:44.545265474Z" level=info msg="connecting to shim fcd9edb219c9566ed226ad3f42458e59267277b52644c9581673e1cb3cb5288c" address="unix:///run/containerd/s/629f6ba7234cc590721aba9cd7d08596cd0b41676bb2759c9229dd686bd7fdd7" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:50:44.556177 kubelet[2694]: E0909 04:50:44.555965 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.556566 kubelet[2694]: E0909 04:50:44.556269 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.556566 kubelet[2694]: W0909 04:50:44.556283 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.556566 kubelet[2694]: E0909 04:50:44.556300 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.556566 kubelet[2694]: E0909 04:50:44.556441 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.556566 kubelet[2694]: W0909 04:50:44.556447 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.556566 kubelet[2694]: E0909 04:50:44.556455 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.557625 kubelet[2694]: E0909 04:50:44.557292 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.557625 kubelet[2694]: W0909 04:50:44.557309 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.557625 kubelet[2694]: E0909 04:50:44.557321 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.557625 kubelet[2694]: E0909 04:50:44.557537 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.557625 kubelet[2694]: W0909 04:50:44.557545 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.557625 kubelet[2694]: E0909 04:50:44.557554 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.557837 kubelet[2694]: E0909 04:50:44.557701 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.557837 kubelet[2694]: W0909 04:50:44.557709 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.557837 kubelet[2694]: E0909 04:50:44.557717 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.557837 kubelet[2694]: E0909 04:50:44.557833 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.559377 kubelet[2694]: W0909 04:50:44.557841 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.559377 kubelet[2694]: E0909 04:50:44.557848 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.559377 kubelet[2694]: E0909 04:50:44.558029 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.559377 kubelet[2694]: W0909 04:50:44.558038 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.559377 kubelet[2694]: E0909 04:50:44.558046 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.559377 kubelet[2694]: E0909 04:50:44.558238 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.559377 kubelet[2694]: W0909 04:50:44.558260 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.559377 kubelet[2694]: E0909 04:50:44.558268 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.559377 kubelet[2694]: E0909 04:50:44.558420 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.559377 kubelet[2694]: W0909 04:50:44.558428 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.559563 kubelet[2694]: E0909 04:50:44.558435 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.559563 kubelet[2694]: E0909 04:50:44.558579 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.559563 kubelet[2694]: W0909 04:50:44.558585 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.559563 kubelet[2694]: E0909 04:50:44.558594 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.559563 kubelet[2694]: E0909 04:50:44.558828 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.559563 kubelet[2694]: W0909 04:50:44.558837 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.559563 kubelet[2694]: E0909 04:50:44.558845 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.559563 kubelet[2694]: E0909 04:50:44.559006 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.559563 kubelet[2694]: W0909 04:50:44.559013 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.559563 kubelet[2694]: E0909 04:50:44.559021 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 9 04:50:44.559747 kubelet[2694]: E0909 04:50:44.559170 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.559747 kubelet[2694]: W0909 04:50:44.559179 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.559747 kubelet[2694]: E0909 04:50:44.559187 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.560923 kubelet[2694]: E0909 04:50:44.559849 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.560923 kubelet[2694]: W0909 04:50:44.559876 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.560923 kubelet[2694]: E0909 04:50:44.559886 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.560923 kubelet[2694]: E0909 04:50:44.560050 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.560923 kubelet[2694]: W0909 04:50:44.560058 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.560923 kubelet[2694]: E0909 04:50:44.560065 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.560923 kubelet[2694]: E0909 04:50:44.560882 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.560923 kubelet[2694]: W0909 04:50:44.560892 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.560923 kubelet[2694]: E0909 04:50:44.560903 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.561212 kubelet[2694]: E0909 04:50:44.561156 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.561212 kubelet[2694]: W0909 04:50:44.561165 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.561212 kubelet[2694]: E0909 04:50:44.561173 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.561866 kubelet[2694]: E0909 04:50:44.561332 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.561866 kubelet[2694]: W0909 04:50:44.561344 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.561866 kubelet[2694]: E0909 04:50:44.561354 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.581260 kubelet[2694]: E0909 04:50:44.580280 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.581260 kubelet[2694]: W0909 04:50:44.580303 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.581260 kubelet[2694]: E0909 04:50:44.580321 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.604893 kubelet[2694]: E0909 04:50:44.604461 2694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-drkg7" podUID="48d0235b-b892-4f7a-ad2b-f4f107a0f105" Sep 9 04:50:44.606443 systemd[1]: Started cri-containerd-fcd9edb219c9566ed226ad3f42458e59267277b52644c9581673e1cb3cb5288c.scope - libcontainer container fcd9edb219c9566ed226ad3f42458e59267277b52644c9581673e1cb3cb5288c. Sep 9 04:50:44.607468 kubelet[2694]: E0909 04:50:44.607408 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.607468 kubelet[2694]: W0909 04:50:44.607429 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.607468 kubelet[2694]: E0909 04:50:44.607450 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.607835 kubelet[2694]: E0909 04:50:44.607631 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.616466 kubelet[2694]: W0909 04:50:44.607650 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.616466 kubelet[2694]: E0909 04:50:44.616465 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.616875 kubelet[2694]: E0909 04:50:44.616837 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.616875 kubelet[2694]: W0909 04:50:44.616849 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.616875 kubelet[2694]: E0909 04:50:44.616862 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.617582 kubelet[2694]: E0909 04:50:44.617562 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.617582 kubelet[2694]: W0909 04:50:44.617579 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.617640 kubelet[2694]: E0909 04:50:44.617591 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.618152 kubelet[2694]: E0909 04:50:44.618107 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.618152 kubelet[2694]: W0909 04:50:44.618149 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.618228 kubelet[2694]: E0909 04:50:44.618163 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.618411 kubelet[2694]: E0909 04:50:44.618381 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.618411 kubelet[2694]: W0909 04:50:44.618392 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.618411 kubelet[2694]: E0909 04:50:44.618402 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.618741 kubelet[2694]: E0909 04:50:44.618713 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.618741 kubelet[2694]: W0909 04:50:44.618724 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.618741 kubelet[2694]: E0909 04:50:44.618733 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.619049 kubelet[2694]: E0909 04:50:44.618868 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.619049 kubelet[2694]: W0909 04:50:44.618875 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.619049 kubelet[2694]: E0909 04:50:44.618883 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.619049 kubelet[2694]: E0909 04:50:44.619009 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.619049 kubelet[2694]: W0909 04:50:44.619015 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.619049 kubelet[2694]: E0909 04:50:44.619025 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.619049 kubelet[2694]: E0909 04:50:44.619131 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.619049 kubelet[2694]: W0909 04:50:44.619137 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.619049 kubelet[2694]: E0909 04:50:44.619152 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.620424 kubelet[2694]: E0909 04:50:44.619289 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.620424 kubelet[2694]: W0909 04:50:44.619296 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.620424 kubelet[2694]: E0909 04:50:44.619303 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.620424 kubelet[2694]: E0909 04:50:44.619517 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.620424 kubelet[2694]: W0909 04:50:44.619523 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.620424 kubelet[2694]: E0909 04:50:44.619530 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.620424 kubelet[2694]: E0909 04:50:44.619666 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.620424 kubelet[2694]: W0909 04:50:44.619674 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.620424 kubelet[2694]: E0909 04:50:44.619681 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.620424 kubelet[2694]: E0909 04:50:44.619802 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.621491 kubelet[2694]: W0909 04:50:44.619808 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.621491 kubelet[2694]: E0909 04:50:44.619815 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.621491 kubelet[2694]: E0909 04:50:44.620911 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.621491 kubelet[2694]: W0909 04:50:44.620924 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.621491 kubelet[2694]: E0909 04:50:44.620935 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.621491 kubelet[2694]: E0909 04:50:44.621571 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.621491 kubelet[2694]: W0909 04:50:44.621582 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.621491 kubelet[2694]: E0909 04:50:44.621634 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.622202 kubelet[2694]: E0909 04:50:44.622037 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.622202 kubelet[2694]: W0909 04:50:44.622051 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.622202 kubelet[2694]: E0909 04:50:44.622063 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.623440 kubelet[2694]: E0909 04:50:44.622490 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.623440 kubelet[2694]: W0909 04:50:44.622501 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.623440 kubelet[2694]: E0909 04:50:44.622511 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.623440 kubelet[2694]: E0909 04:50:44.622686 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.623440 kubelet[2694]: W0909 04:50:44.622695 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.623440 kubelet[2694]: E0909 04:50:44.622703 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.623440 kubelet[2694]: E0909 04:50:44.622884 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.623440 kubelet[2694]: W0909 04:50:44.622892 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.623440 kubelet[2694]: E0909 04:50:44.622903 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.637087 kubelet[2694]: E0909 04:50:44.636993 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.637087 kubelet[2694]: W0909 04:50:44.637042 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.637087 kubelet[2694]: E0909 04:50:44.637061 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.637374 kubelet[2694]: I0909 04:50:44.637089 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfksm\" (UniqueName: \"kubernetes.io/projected/48d0235b-b892-4f7a-ad2b-f4f107a0f105-kube-api-access-kfksm\") pod \"csi-node-driver-drkg7\" (UID: \"48d0235b-b892-4f7a-ad2b-f4f107a0f105\") " pod="calico-system/csi-node-driver-drkg7" Sep 9 04:50:44.638114 kubelet[2694]: E0909 04:50:44.637538 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.638114 kubelet[2694]: W0909 04:50:44.637557 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.638114 kubelet[2694]: E0909 04:50:44.637569 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.638114 kubelet[2694]: I0909 04:50:44.637588 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/48d0235b-b892-4f7a-ad2b-f4f107a0f105-registration-dir\") pod \"csi-node-driver-drkg7\" (UID: \"48d0235b-b892-4f7a-ad2b-f4f107a0f105\") " pod="calico-system/csi-node-driver-drkg7" Sep 9 04:50:44.638114 kubelet[2694]: E0909 04:50:44.637944 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.638114 kubelet[2694]: W0909 04:50:44.637955 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.638114 kubelet[2694]: E0909 04:50:44.637965 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.638114 kubelet[2694]: I0909 04:50:44.637992 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/48d0235b-b892-4f7a-ad2b-f4f107a0f105-varrun\") pod \"csi-node-driver-drkg7\" (UID: \"48d0235b-b892-4f7a-ad2b-f4f107a0f105\") " pod="calico-system/csi-node-driver-drkg7" Sep 9 04:50:44.638763 kubelet[2694]: E0909 04:50:44.638520 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.638763 kubelet[2694]: W0909 04:50:44.638632 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.638763 kubelet[2694]: E0909 04:50:44.638648 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.638763 kubelet[2694]: I0909 04:50:44.638672 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48d0235b-b892-4f7a-ad2b-f4f107a0f105-kubelet-dir\") pod \"csi-node-driver-drkg7\" (UID: \"48d0235b-b892-4f7a-ad2b-f4f107a0f105\") " pod="calico-system/csi-node-driver-drkg7" Sep 9 04:50:44.639909 kubelet[2694]: E0909 04:50:44.639885 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.639909 kubelet[2694]: W0909 04:50:44.639901 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.639909 kubelet[2694]: E0909 04:50:44.639913 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.640080 kubelet[2694]: I0909 04:50:44.639934 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/48d0235b-b892-4f7a-ad2b-f4f107a0f105-socket-dir\") pod \"csi-node-driver-drkg7\" (UID: \"48d0235b-b892-4f7a-ad2b-f4f107a0f105\") " pod="calico-system/csi-node-driver-drkg7" Sep 9 04:50:44.640137 kubelet[2694]: E0909 04:50:44.640129 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.640196 kubelet[2694]: W0909 04:50:44.640138 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.640196 kubelet[2694]: E0909 04:50:44.640158 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.640586 kubelet[2694]: E0909 04:50:44.640559 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.640586 kubelet[2694]: W0909 04:50:44.640574 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.640586 kubelet[2694]: E0909 04:50:44.640586 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:44.640850 kubelet[2694]: E0909 04:50:44.640836 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.640850 kubelet[2694]: W0909 04:50:44.640848 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.640945 kubelet[2694]: E0909 04:50:44.640858 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:44.641030 kubelet[2694]: E0909 04:50:44.641020 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:44.641074 kubelet[2694]: W0909 04:50:44.641050 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:44.641074 kubelet[2694]: E0909 04:50:44.641062 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 04:50:44.641357 kubelet[2694]: E0909 04:50:44.641342 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.641357 kubelet[2694]: W0909 04:50:44.641354 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.641881 kubelet[2694]: E0909 04:50:44.641363 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.641881 kubelet[2694]: E0909 04:50:44.641534 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.641881 kubelet[2694]: W0909 04:50:44.641542 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.641881 kubelet[2694]: E0909 04:50:44.641550 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.642114 kubelet[2694]: E0909 04:50:44.641996 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.642114 kubelet[2694]: W0909 04:50:44.642008 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.642114 kubelet[2694]: E0909 04:50:44.642020 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.642822 kubelet[2694]: E0909 04:50:44.642798 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.642822 kubelet[2694]: W0909 04:50:44.642815 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.642918 kubelet[2694]: E0909 04:50:44.642830 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.643014 kubelet[2694]: E0909 04:50:44.643003 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.643014 kubelet[2694]: W0909 04:50:44.643012 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.643077 kubelet[2694]: E0909 04:50:44.643023 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.643347 kubelet[2694]: E0909 04:50:44.643330 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.643347 kubelet[2694]: W0909 04:50:44.643345 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.643735 kubelet[2694]: E0909 04:50:44.643358 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.649552 containerd[1540]: time="2025-09-09T04:50:44.649511043Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7d9584ccfc-mhnrh,Uid:ce665adb-b757-491f-92e8-8abf3235f5e4,Namespace:calico-system,Attempt:0,} returns sandbox id \"fcd9edb219c9566ed226ad3f42458e59267277b52644c9581673e1cb3cb5288c\""
Sep 9 04:50:44.653214 containerd[1540]: time="2025-09-09T04:50:44.653183722Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 9 04:50:44.701890 containerd[1540]: time="2025-09-09T04:50:44.701706387Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jkbsz,Uid:0b17e656-0b72-41c1-890c-69c293ed9961,Namespace:calico-system,Attempt:0,}"
Sep 9 04:50:44.743573 kubelet[2694]: E0909 04:50:44.743383 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.743573 kubelet[2694]: W0909 04:50:44.743407 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.743573 kubelet[2694]: E0909 04:50:44.743443 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.744822 kubelet[2694]: E0909 04:50:44.744098 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.744822 kubelet[2694]: W0909 04:50:44.744121 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.744822 kubelet[2694]: E0909 04:50:44.744135 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.745619 kubelet[2694]: E0909 04:50:44.745395 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.746034 kubelet[2694]: W0909 04:50:44.745953 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.746187 kubelet[2694]: E0909 04:50:44.746111 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.746679 kubelet[2694]: E0909 04:50:44.746658 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.746679 kubelet[2694]: W0909 04:50:44.746674 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.746879 kubelet[2694]: E0909 04:50:44.746687 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.747957 kubelet[2694]: E0909 04:50:44.747369 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.747957 kubelet[2694]: W0909 04:50:44.747383 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.747957 kubelet[2694]: E0909 04:50:44.747437 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.747957 kubelet[2694]: E0909 04:50:44.747867 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.747957 kubelet[2694]: W0909 04:50:44.747877 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.747957 kubelet[2694]: E0909 04:50:44.747888 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.749086 containerd[1540]: time="2025-09-09T04:50:44.748914293Z" level=info msg="connecting to shim 119af681a26c2b456f2ae25e252322b68d6705efc6bb950d382b505f46f018ee" address="unix:///run/containerd/s/a0004ec757c5328a43617b2cc3b298663f09f714452b133218380a2765d8d46a" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:50:44.749229 kubelet[2694]: E0909 04:50:44.748983 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.749229 kubelet[2694]: W0909 04:50:44.748993 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.749229 kubelet[2694]: E0909 04:50:44.749058 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.749407 kubelet[2694]: E0909 04:50:44.749388 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.749407 kubelet[2694]: W0909 04:50:44.749398 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.749576 kubelet[2694]: E0909 04:50:44.749408 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.750445 kubelet[2694]: E0909 04:50:44.750300 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.750445 kubelet[2694]: W0909 04:50:44.750317 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.750445 kubelet[2694]: E0909 04:50:44.750330 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.750918 kubelet[2694]: E0909 04:50:44.750809 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.750918 kubelet[2694]: W0909 04:50:44.750825 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.751423 kubelet[2694]: E0909 04:50:44.751185 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.752785 kubelet[2694]: E0909 04:50:44.752663 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.752785 kubelet[2694]: W0909 04:50:44.752685 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.752785 kubelet[2694]: E0909 04:50:44.752698 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.755413 kubelet[2694]: E0909 04:50:44.755375 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.755413 kubelet[2694]: W0909 04:50:44.755396 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.755575 kubelet[2694]: E0909 04:50:44.755453 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.757261 kubelet[2694]: E0909 04:50:44.757204 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.757261 kubelet[2694]: W0909 04:50:44.757224 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.757567 kubelet[2694]: E0909 04:50:44.757242 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.759186 kubelet[2694]: E0909 04:50:44.759139 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.759186 kubelet[2694]: W0909 04:50:44.759166 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.759186 kubelet[2694]: E0909 04:50:44.759180 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.761965 kubelet[2694]: E0909 04:50:44.761916 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.761965 kubelet[2694]: W0909 04:50:44.761930 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.761965 kubelet[2694]: E0909 04:50:44.761942 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.765592 kubelet[2694]: E0909 04:50:44.765567 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.765592 kubelet[2694]: W0909 04:50:44.765587 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.765859 kubelet[2694]: E0909 04:50:44.765602 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.766232 kubelet[2694]: E0909 04:50:44.766194 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.766232 kubelet[2694]: W0909 04:50:44.766212 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.766232 kubelet[2694]: E0909 04:50:44.766223 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.766460 kubelet[2694]: E0909 04:50:44.766445 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.766460 kubelet[2694]: W0909 04:50:44.766458 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.766583 kubelet[2694]: E0909 04:50:44.766467 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.766899 kubelet[2694]: E0909 04:50:44.766883 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.766899 kubelet[2694]: W0909 04:50:44.766897 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.767003 kubelet[2694]: E0909 04:50:44.766908 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.767113 kubelet[2694]: E0909 04:50:44.767095 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.767113 kubelet[2694]: W0909 04:50:44.767109 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.767202 kubelet[2694]: E0909 04:50:44.767119 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.767378 kubelet[2694]: E0909 04:50:44.767349 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.767378 kubelet[2694]: W0909 04:50:44.767366 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.767378 kubelet[2694]: E0909 04:50:44.767376 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.767950 kubelet[2694]: E0909 04:50:44.767934 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.768047 kubelet[2694]: W0909 04:50:44.768033 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.768117 kubelet[2694]: E0909 04:50:44.768104 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.768675 kubelet[2694]: E0909 04:50:44.768590 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.768771 kubelet[2694]: W0909 04:50:44.768756 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.768918 kubelet[2694]: E0909 04:50:44.768847 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.769811 kubelet[2694]: E0909 04:50:44.769778 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.769947 kubelet[2694]: W0909 04:50:44.769866 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.769947 kubelet[2694]: E0909 04:50:44.769890 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.772023 kubelet[2694]: E0909 04:50:44.771999 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.772119 kubelet[2694]: W0909 04:50:44.772105 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.772185 kubelet[2694]: E0909 04:50:44.772174 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.787294 kubelet[2694]: E0909 04:50:44.786847 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:44.787294 kubelet[2694]: W0909 04:50:44.787228 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:44.787294 kubelet[2694]: E0909 04:50:44.787265 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:44.799385 systemd[1]: Started cri-containerd-119af681a26c2b456f2ae25e252322b68d6705efc6bb950d382b505f46f018ee.scope - libcontainer container 119af681a26c2b456f2ae25e252322b68d6705efc6bb950d382b505f46f018ee.
Sep 9 04:50:44.836346 containerd[1540]: time="2025-09-09T04:50:44.836280426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jkbsz,Uid:0b17e656-0b72-41c1-890c-69c293ed9961,Namespace:calico-system,Attempt:0,} returns sandbox id \"119af681a26c2b456f2ae25e252322b68d6705efc6bb950d382b505f46f018ee\""
Sep 9 04:50:46.173866 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4195400067.mount: Deactivated successfully.
Sep 9 04:50:46.603809 kubelet[2694]: E0909 04:50:46.603756 2694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-drkg7" podUID="48d0235b-b892-4f7a-ad2b-f4f107a0f105"
Sep 9 04:50:46.710552 containerd[1540]: time="2025-09-09T04:50:46.710274530Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:46.711011 containerd[1540]: time="2025-09-09T04:50:46.710974210Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775"
Sep 9 04:50:46.711976 containerd[1540]: time="2025-09-09T04:50:46.711941570Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:46.713879 containerd[1540]: time="2025-09-09T04:50:46.713852689Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:50:46.714615 containerd[1540]: time="2025-09-09T04:50:46.714576329Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.061353807s"
Sep 9 04:50:46.714615 containerd[1540]: time="2025-09-09T04:50:46.714610489Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\""
Sep 9 04:50:46.715545 containerd[1540]: time="2025-09-09T04:50:46.715503249Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 9 04:50:46.728730 containerd[1540]: time="2025-09-09T04:50:46.728689925Z" level=info msg="CreateContainer within sandbox \"fcd9edb219c9566ed226ad3f42458e59267277b52644c9581673e1cb3cb5288c\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 9 04:50:46.735127 containerd[1540]: time="2025-09-09T04:50:46.735088923Z" level=info msg="Container 3495282c0cb338f97ea9c6ee5df36ee8ad295a983c7be214e8733d242efd360b: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:50:46.742138 containerd[1540]: time="2025-09-09T04:50:46.742081441Z" level=info msg="CreateContainer within sandbox \"fcd9edb219c9566ed226ad3f42458e59267277b52644c9581673e1cb3cb5288c\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"3495282c0cb338f97ea9c6ee5df36ee8ad295a983c7be214e8733d242efd360b\""
Sep 9 04:50:46.742625 containerd[1540]: time="2025-09-09T04:50:46.742600561Z" level=info msg="StartContainer for \"3495282c0cb338f97ea9c6ee5df36ee8ad295a983c7be214e8733d242efd360b\""
Sep 9 04:50:46.746017 containerd[1540]: time="2025-09-09T04:50:46.745588600Z" level=info msg="connecting to shim 3495282c0cb338f97ea9c6ee5df36ee8ad295a983c7be214e8733d242efd360b" address="unix:///run/containerd/s/629f6ba7234cc590721aba9cd7d08596cd0b41676bb2759c9229dd686bd7fdd7" protocol=ttrpc version=3
Sep 9 04:50:46.775462 systemd[1]: Started cri-containerd-3495282c0cb338f97ea9c6ee5df36ee8ad295a983c7be214e8733d242efd360b.scope - libcontainer container 3495282c0cb338f97ea9c6ee5df36ee8ad295a983c7be214e8733d242efd360b.
Sep 9 04:50:46.843288 containerd[1540]: time="2025-09-09T04:50:46.843236173Z" level=info msg="StartContainer for \"3495282c0cb338f97ea9c6ee5df36ee8ad295a983c7be214e8733d242efd360b\" returns successfully"
Sep 9 04:50:47.745395 kubelet[2694]: E0909 04:50:47.745360 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.745395 kubelet[2694]: W0909 04:50:47.745386 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.745395 kubelet[2694]: E0909 04:50:47.745406 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.746150 kubelet[2694]: E0909 04:50:47.746011 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.746150 kubelet[2694]: W0909 04:50:47.746029 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.746150 kubelet[2694]: E0909 04:50:47.746098 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.746515 kubelet[2694]: E0909 04:50:47.746481 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.746515 kubelet[2694]: W0909 04:50:47.746496 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.746662 kubelet[2694]: E0909 04:50:47.746611 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.746948 kubelet[2694]: E0909 04:50:47.746879 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.746948 kubelet[2694]: W0909 04:50:47.746891 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.746948 kubelet[2694]: E0909 04:50:47.746901 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.747268 kubelet[2694]: E0909 04:50:47.747231 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.747392 kubelet[2694]: W0909 04:50:47.747335 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.747392 kubelet[2694]: E0909 04:50:47.747352 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.747711 kubelet[2694]: E0909 04:50:47.747642 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.747711 kubelet[2694]: W0909 04:50:47.747657 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.747711 kubelet[2694]: E0909 04:50:47.747667 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.748016 kubelet[2694]: E0909 04:50:47.748002 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.748150 kubelet[2694]: W0909 04:50:47.748078 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.748150 kubelet[2694]: E0909 04:50:47.748093 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.748463 kubelet[2694]: E0909 04:50:47.748442 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.748649 kubelet[2694]: W0909 04:50:47.748535 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.748649 kubelet[2694]: E0909 04:50:47.748552 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.748807 kubelet[2694]: E0909 04:50:47.748795 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.748864 kubelet[2694]: W0909 04:50:47.748853 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.748926 kubelet[2694]: E0909 04:50:47.748915 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.749268 kubelet[2694]: E0909 04:50:47.749197 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.749268 kubelet[2694]: W0909 04:50:47.749211 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.749268 kubelet[2694]: E0909 04:50:47.749223 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.749631 kubelet[2694]: E0909 04:50:47.749619 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.749701 kubelet[2694]: W0909 04:50:47.749689 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.749754 kubelet[2694]: E0909 04:50:47.749744 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.750086 kubelet[2694]: E0909 04:50:47.749994 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.750086 kubelet[2694]: W0909 04:50:47.750006 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.750086 kubelet[2694]: E0909 04:50:47.750041 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.750387 kubelet[2694]: E0909 04:50:47.750373 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.750595 kubelet[2694]: W0909 04:50:47.750490 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.750595 kubelet[2694]: E0909 04:50:47.750507 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.750830 kubelet[2694]: E0909 04:50:47.750816 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.750954 kubelet[2694]: W0909 04:50:47.750891 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.750954 kubelet[2694]: E0909 04:50:47.750908 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.751307 kubelet[2694]: E0909 04:50:47.751293 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.751441 kubelet[2694]: W0909 04:50:47.751363 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.751441 kubelet[2694]: E0909 04:50:47.751380 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 04:50:47.773957 kubelet[2694]: E0909 04:50:47.773887 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 04:50:47.773957 kubelet[2694]: W0909 04:50:47.773911 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 04:50:47.774195 kubelet[2694]: E0909 04:50:47.773929 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:47.774483 kubelet[2694]: E0909 04:50:47.774471 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.774563 kubelet[2694]: W0909 04:50:47.774551 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.774623 kubelet[2694]: E0909 04:50:47.774612 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:47.774969 kubelet[2694]: E0909 04:50:47.774911 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.774969 kubelet[2694]: W0909 04:50:47.774923 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.774969 kubelet[2694]: E0909 04:50:47.774936 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:47.775242 kubelet[2694]: E0909 04:50:47.775220 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.775306 kubelet[2694]: W0909 04:50:47.775268 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.775306 kubelet[2694]: E0909 04:50:47.775286 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:47.775530 kubelet[2694]: E0909 04:50:47.775517 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.775530 kubelet[2694]: W0909 04:50:47.775529 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.775620 kubelet[2694]: E0909 04:50:47.775539 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:47.775754 kubelet[2694]: E0909 04:50:47.775737 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.775754 kubelet[2694]: W0909 04:50:47.775748 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.775809 kubelet[2694]: E0909 04:50:47.775757 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:47.776015 kubelet[2694]: E0909 04:50:47.776002 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.776015 kubelet[2694]: W0909 04:50:47.776014 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.776077 kubelet[2694]: E0909 04:50:47.776023 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:47.776300 kubelet[2694]: E0909 04:50:47.776267 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.776300 kubelet[2694]: W0909 04:50:47.776276 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.776366 kubelet[2694]: E0909 04:50:47.776302 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:47.776585 kubelet[2694]: E0909 04:50:47.776558 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.776585 kubelet[2694]: W0909 04:50:47.776571 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.776585 kubelet[2694]: E0909 04:50:47.776579 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:47.776762 kubelet[2694]: E0909 04:50:47.776746 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.776762 kubelet[2694]: W0909 04:50:47.776759 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.776824 kubelet[2694]: E0909 04:50:47.776769 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:47.777079 kubelet[2694]: E0909 04:50:47.777065 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.777079 kubelet[2694]: W0909 04:50:47.777078 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.777161 kubelet[2694]: E0909 04:50:47.777087 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:47.777366 kubelet[2694]: E0909 04:50:47.777352 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.777366 kubelet[2694]: W0909 04:50:47.777365 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.777438 kubelet[2694]: E0909 04:50:47.777374 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:47.777811 kubelet[2694]: E0909 04:50:47.777744 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.777811 kubelet[2694]: W0909 04:50:47.777759 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.777811 kubelet[2694]: E0909 04:50:47.777770 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:47.778123 kubelet[2694]: E0909 04:50:47.778039 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.778123 kubelet[2694]: W0909 04:50:47.778051 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.778123 kubelet[2694]: E0909 04:50:47.778061 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:47.778468 kubelet[2694]: E0909 04:50:47.778455 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.778601 kubelet[2694]: W0909 04:50:47.778530 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.778601 kubelet[2694]: E0909 04:50:47.778547 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:47.778820 kubelet[2694]: E0909 04:50:47.778808 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.778882 kubelet[2694]: W0909 04:50:47.778870 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.778931 kubelet[2694]: E0909 04:50:47.778921 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:47.779212 kubelet[2694]: E0909 04:50:47.779149 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.779212 kubelet[2694]: W0909 04:50:47.779161 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.779212 kubelet[2694]: E0909 04:50:47.779170 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 04:50:47.779536 kubelet[2694]: E0909 04:50:47.779486 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 04:50:47.779536 kubelet[2694]: W0909 04:50:47.779499 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 04:50:47.779536 kubelet[2694]: E0909 04:50:47.779510 2694 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 04:50:48.065307 containerd[1540]: time="2025-09-09T04:50:48.065206166Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:50:48.066282 containerd[1540]: time="2025-09-09T04:50:48.066171606Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 9 04:50:48.067739 containerd[1540]: time="2025-09-09T04:50:48.067603886Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:50:48.071289 containerd[1540]: time="2025-09-09T04:50:48.070840925Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:50:48.071480 containerd[1540]: time="2025-09-09T04:50:48.071455005Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.355923196s" Sep 9 04:50:48.071545 containerd[1540]: time="2025-09-09T04:50:48.071481125Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 9 04:50:48.075608 containerd[1540]: time="2025-09-09T04:50:48.075568404Z" level=info msg="CreateContainer within sandbox \"119af681a26c2b456f2ae25e252322b68d6705efc6bb950d382b505f46f018ee\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 04:50:48.097455 containerd[1540]: time="2025-09-09T04:50:48.097406918Z" level=info msg="Container f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:50:48.107883 containerd[1540]: time="2025-09-09T04:50:48.107832995Z" level=info msg="CreateContainer within sandbox \"119af681a26c2b456f2ae25e252322b68d6705efc6bb950d382b505f46f018ee\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6\"" Sep 9 04:50:48.108300 containerd[1540]: time="2025-09-09T04:50:48.108274475Z" level=info msg="StartContainer for \"f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6\"" Sep 9 04:50:48.109699 containerd[1540]: time="2025-09-09T04:50:48.109662475Z" level=info msg="connecting to shim f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6" address="unix:///run/containerd/s/a0004ec757c5328a43617b2cc3b298663f09f714452b133218380a2765d8d46a" protocol=ttrpc version=3 Sep 9 04:50:48.142432 systemd[1]: Started cri-containerd-f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6.scope - libcontainer container f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6. 
Sep 9 04:50:48.177532 containerd[1540]: time="2025-09-09T04:50:48.177492578Z" level=info msg="StartContainer for \"f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6\" returns successfully" Sep 9 04:50:48.188984 systemd[1]: cri-containerd-f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6.scope: Deactivated successfully. Sep 9 04:50:48.189260 systemd[1]: cri-containerd-f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6.scope: Consumed 28ms CPU time, 6.3M memory peak, 4.5M written to disk. Sep 9 04:50:48.230636 containerd[1540]: time="2025-09-09T04:50:48.230581484Z" level=info msg="received exit event container_id:\"f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6\" id:\"f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6\" pid:3420 exited_at:{seconds:1757393448 nanos:218897327}" Sep 9 04:50:48.230903 containerd[1540]: time="2025-09-09T04:50:48.230872284Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6\" id:\"f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6\" pid:3420 exited_at:{seconds:1757393448 nanos:218897327}" Sep 9 04:50:48.275172 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f0b3d9b1b3e4311b18f725ae7abaf6b9314b828326ea0d9e88793b3863f6d6a6-rootfs.mount: Deactivated successfully. 
Sep 9 04:50:48.603619 kubelet[2694]: E0909 04:50:48.603565 2694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-drkg7" podUID="48d0235b-b892-4f7a-ad2b-f4f107a0f105" Sep 9 04:50:48.686828 kubelet[2694]: I0909 04:50:48.686797 2694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:50:48.688095 containerd[1540]: time="2025-09-09T04:50:48.687759927Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 04:50:48.702445 kubelet[2694]: I0909 04:50:48.702382 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7d9584ccfc-mhnrh" podStartSLOduration=2.6398953560000002 podStartE2EDuration="4.702364683s" podCreationTimestamp="2025-09-09 04:50:44 +0000 UTC" firstStartedPulling="2025-09-09 04:50:44.652915642 +0000 UTC m=+21.145191763" lastFinishedPulling="2025-09-09 04:50:46.715385009 +0000 UTC m=+23.207661090" observedRunningTime="2025-09-09 04:50:47.694901744 +0000 UTC m=+24.187177865" watchObservedRunningTime="2025-09-09 04:50:48.702364683 +0000 UTC m=+25.194640764" Sep 9 04:50:50.604310 kubelet[2694]: E0909 04:50:50.604155 2694 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-drkg7" podUID="48d0235b-b892-4f7a-ad2b-f4f107a0f105" Sep 9 04:50:50.666571 kubelet[2694]: I0909 04:50:50.666522 2694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 04:50:51.416202 containerd[1540]: time="2025-09-09T04:50:51.416145749Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:50:51.417278 containerd[1540]: time="2025-09-09T04:50:51.417222308Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 9 04:50:51.418133 containerd[1540]: time="2025-09-09T04:50:51.418093668Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:50:51.429299 containerd[1540]: time="2025-09-09T04:50:51.428713706Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:50:51.429575 containerd[1540]: time="2025-09-09T04:50:51.429538866Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.741746979s" Sep 9 04:50:51.429575 containerd[1540]: time="2025-09-09T04:50:51.429569266Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 9 04:50:51.437030 containerd[1540]: time="2025-09-09T04:50:51.436993224Z" level=info msg="CreateContainer within sandbox \"119af681a26c2b456f2ae25e252322b68d6705efc6bb950d382b505f46f018ee\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 04:50:51.445268 containerd[1540]: time="2025-09-09T04:50:51.444398262Z" level=info msg="Container 8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:50:51.451579 containerd[1540]: 
time="2025-09-09T04:50:51.451543381Z" level=info msg="CreateContainer within sandbox \"119af681a26c2b456f2ae25e252322b68d6705efc6bb950d382b505f46f018ee\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7\"" Sep 9 04:50:51.451991 containerd[1540]: time="2025-09-09T04:50:51.451969661Z" level=info msg="StartContainer for \"8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7\"" Sep 9 04:50:51.454884 containerd[1540]: time="2025-09-09T04:50:51.454834940Z" level=info msg="connecting to shim 8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7" address="unix:///run/containerd/s/a0004ec757c5328a43617b2cc3b298663f09f714452b133218380a2765d8d46a" protocol=ttrpc version=3 Sep 9 04:50:51.480423 systemd[1]: Started cri-containerd-8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7.scope - libcontainer container 8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7. Sep 9 04:50:51.552104 containerd[1540]: time="2025-09-09T04:50:51.552067718Z" level=info msg="StartContainer for \"8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7\" returns successfully" Sep 9 04:50:52.078978 systemd[1]: cri-containerd-8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7.scope: Deactivated successfully. Sep 9 04:50:52.079302 systemd[1]: cri-containerd-8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7.scope: Consumed 479ms CPU time, 177.5M memory peak, 2.7M read from disk, 165.8M written to disk. 
Sep 9 04:50:52.089717 containerd[1540]: time="2025-09-09T04:50:52.089670436Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7\" id:\"8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7\" pid:3483 exited_at:{seconds:1757393452 nanos:89362796}" Sep 9 04:50:52.089844 containerd[1540]: time="2025-09-09T04:50:52.089746356Z" level=info msg="received exit event container_id:\"8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7\" id:\"8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7\" pid:3483 exited_at:{seconds:1757393452 nanos:89362796}" Sep 9 04:50:52.100980 kubelet[2694]: I0909 04:50:52.100940 2694 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 9 04:50:52.112670 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8422da581c215535666782892b421f067689ecbb807c1c877765aac1af1a2cf7-rootfs.mount: Deactivated successfully. Sep 9 04:50:52.259479 systemd[1]: Created slice kubepods-burstable-pod0c5aa6fa_4636_44d9_982d_83b18c1811e9.slice - libcontainer container kubepods-burstable-pod0c5aa6fa_4636_44d9_982d_83b18c1811e9.slice. Sep 9 04:50:52.268640 systemd[1]: Created slice kubepods-burstable-pod6aa13cc2_beb8_4ea2_9856_e67d80fcabd1.slice - libcontainer container kubepods-burstable-pod6aa13cc2_beb8_4ea2_9856_e67d80fcabd1.slice. Sep 9 04:50:52.276489 systemd[1]: Created slice kubepods-besteffort-pod08804fed_046c_4958_a0b9_3918de926a18.slice - libcontainer container kubepods-besteffort-pod08804fed_046c_4958_a0b9_3918de926a18.slice. Sep 9 04:50:52.284988 systemd[1]: Created slice kubepods-besteffort-pod9f302fd2_6080_487a_8cb7_150862e1d68d.slice - libcontainer container kubepods-besteffort-pod9f302fd2_6080_487a_8cb7_150862e1d68d.slice. 
Sep 9 04:50:52.291527 systemd[1]: Created slice kubepods-besteffort-podb7a374d6_dee0_4466_b5a4_000f07fc11a9.slice - libcontainer container kubepods-besteffort-podb7a374d6_dee0_4466_b5a4_000f07fc11a9.slice. Sep 9 04:50:52.298722 systemd[1]: Created slice kubepods-besteffort-pod7f852350_12c8_443c_9eee_567b69f5268d.slice - libcontainer container kubepods-besteffort-pod7f852350_12c8_443c_9eee_567b69f5268d.slice. Sep 9 04:50:52.305135 systemd[1]: Created slice kubepods-besteffort-pod3a948436_27af_4a6b_a20a_c855f1c51f9d.slice - libcontainer container kubepods-besteffort-pod3a948436_27af_4a6b_a20a_c855f1c51f9d.slice. Sep 9 04:50:52.311173 kubelet[2694]: I0909 04:50:52.310741 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68qs7\" (UniqueName: \"kubernetes.io/projected/3a948436-27af-4a6b-a20a-c855f1c51f9d-kube-api-access-68qs7\") pod \"calico-kube-controllers-55885cd57-8mg6f\" (UID: \"3a948436-27af-4a6b-a20a-c855f1c51f9d\") " pod="calico-system/calico-kube-controllers-55885cd57-8mg6f" Sep 9 04:50:52.311173 kubelet[2694]: I0909 04:50:52.310796 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85h44\" (UniqueName: \"kubernetes.io/projected/7f852350-12c8-443c-9eee-567b69f5268d-kube-api-access-85h44\") pod \"goldmane-54d579b49d-jg5js\" (UID: \"7f852350-12c8-443c-9eee-567b69f5268d\") " pod="calico-system/goldmane-54d579b49d-jg5js" Sep 9 04:50:52.311173 kubelet[2694]: I0909 04:50:52.310822 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfwtj\" (UniqueName: \"kubernetes.io/projected/6aa13cc2-beb8-4ea2-9856-e67d80fcabd1-kube-api-access-gfwtj\") pod \"coredns-674b8bbfcf-qzc8t\" (UID: \"6aa13cc2-beb8-4ea2-9856-e67d80fcabd1\") " pod="kube-system/coredns-674b8bbfcf-qzc8t" Sep 9 04:50:52.311173 kubelet[2694]: I0909 04:50:52.310893 2694 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08804fed-046c-4958-a0b9-3918de926a18-whisker-ca-bundle\") pod \"whisker-6f67bc767b-5nl9j\" (UID: \"08804fed-046c-4958-a0b9-3918de926a18\") " pod="calico-system/whisker-6f67bc767b-5nl9j" Sep 9 04:50:52.311173 kubelet[2694]: I0909 04:50:52.310939 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9f302fd2-6080-487a-8cb7-150862e1d68d-calico-apiserver-certs\") pod \"calico-apiserver-5dd4d7df4d-jkl4b\" (UID: \"9f302fd2-6080-487a-8cb7-150862e1d68d\") " pod="calico-apiserver/calico-apiserver-5dd4d7df4d-jkl4b" Sep 9 04:50:52.311547 kubelet[2694]: I0909 04:50:52.311087 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c5aa6fa-4636-44d9-982d-83b18c1811e9-config-volume\") pod \"coredns-674b8bbfcf-xp2ld\" (UID: \"0c5aa6fa-4636-44d9-982d-83b18c1811e9\") " pod="kube-system/coredns-674b8bbfcf-xp2ld" Sep 9 04:50:52.311547 kubelet[2694]: I0909 04:50:52.311130 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f852350-12c8-443c-9eee-567b69f5268d-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-jg5js\" (UID: \"7f852350-12c8-443c-9eee-567b69f5268d\") " pod="calico-system/goldmane-54d579b49d-jg5js" Sep 9 04:50:52.311652 kubelet[2694]: I0909 04:50:52.311632 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/7f852350-12c8-443c-9eee-567b69f5268d-goldmane-key-pair\") pod \"goldmane-54d579b49d-jg5js\" (UID: \"7f852350-12c8-443c-9eee-567b69f5268d\") " pod="calico-system/goldmane-54d579b49d-jg5js" Sep 9 04:50:52.311776 kubelet[2694]: 
I0909 04:50:52.311759 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b7a374d6-dee0-4466-b5a4-000f07fc11a9-calico-apiserver-certs\") pod \"calico-apiserver-5dd4d7df4d-74glh\" (UID: \"b7a374d6-dee0-4466-b5a4-000f07fc11a9\") " pod="calico-apiserver/calico-apiserver-5dd4d7df4d-74glh" Sep 9 04:50:52.311851 kubelet[2694]: I0909 04:50:52.311839 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqj96\" (UniqueName: \"kubernetes.io/projected/08804fed-046c-4958-a0b9-3918de926a18-kube-api-access-vqj96\") pod \"whisker-6f67bc767b-5nl9j\" (UID: \"08804fed-046c-4958-a0b9-3918de926a18\") " pod="calico-system/whisker-6f67bc767b-5nl9j" Sep 9 04:50:52.311930 kubelet[2694]: I0909 04:50:52.311918 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a948436-27af-4a6b-a20a-c855f1c51f9d-tigera-ca-bundle\") pod \"calico-kube-controllers-55885cd57-8mg6f\" (UID: \"3a948436-27af-4a6b-a20a-c855f1c51f9d\") " pod="calico-system/calico-kube-controllers-55885cd57-8mg6f" Sep 9 04:50:52.312002 kubelet[2694]: I0909 04:50:52.311988 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f852350-12c8-443c-9eee-567b69f5268d-config\") pod \"goldmane-54d579b49d-jg5js\" (UID: \"7f852350-12c8-443c-9eee-567b69f5268d\") " pod="calico-system/goldmane-54d579b49d-jg5js" Sep 9 04:50:52.312124 kubelet[2694]: I0909 04:50:52.312103 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp4zb\" (UniqueName: \"kubernetes.io/projected/b7a374d6-dee0-4466-b5a4-000f07fc11a9-kube-api-access-mp4zb\") pod \"calico-apiserver-5dd4d7df4d-74glh\" (UID: 
\"b7a374d6-dee0-4466-b5a4-000f07fc11a9\") " pod="calico-apiserver/calico-apiserver-5dd4d7df4d-74glh" Sep 9 04:50:52.312217 kubelet[2694]: I0909 04:50:52.312196 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/08804fed-046c-4958-a0b9-3918de926a18-whisker-backend-key-pair\") pod \"whisker-6f67bc767b-5nl9j\" (UID: \"08804fed-046c-4958-a0b9-3918de926a18\") " pod="calico-system/whisker-6f67bc767b-5nl9j" Sep 9 04:50:52.312600 kubelet[2694]: I0909 04:50:52.312293 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvvw8\" (UniqueName: \"kubernetes.io/projected/0c5aa6fa-4636-44d9-982d-83b18c1811e9-kube-api-access-xvvw8\") pod \"coredns-674b8bbfcf-xp2ld\" (UID: \"0c5aa6fa-4636-44d9-982d-83b18c1811e9\") " pod="kube-system/coredns-674b8bbfcf-xp2ld" Sep 9 04:50:52.312600 kubelet[2694]: I0909 04:50:52.312323 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6aa13cc2-beb8-4ea2-9856-e67d80fcabd1-config-volume\") pod \"coredns-674b8bbfcf-qzc8t\" (UID: \"6aa13cc2-beb8-4ea2-9856-e67d80fcabd1\") " pod="kube-system/coredns-674b8bbfcf-qzc8t" Sep 9 04:50:52.312600 kubelet[2694]: I0909 04:50:52.312345 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2jsg\" (UniqueName: \"kubernetes.io/projected/9f302fd2-6080-487a-8cb7-150862e1d68d-kube-api-access-d2jsg\") pod \"calico-apiserver-5dd4d7df4d-jkl4b\" (UID: \"9f302fd2-6080-487a-8cb7-150862e1d68d\") " pod="calico-apiserver/calico-apiserver-5dd4d7df4d-jkl4b" Sep 9 04:50:52.565899 containerd[1540]: time="2025-09-09T04:50:52.565851211Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xp2ld,Uid:0c5aa6fa-4636-44d9-982d-83b18c1811e9,Namespace:kube-system,Attempt:0,}" 
Sep 9 04:50:52.573971 containerd[1540]: time="2025-09-09T04:50:52.573681489Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qzc8t,Uid:6aa13cc2-beb8-4ea2-9856-e67d80fcabd1,Namespace:kube-system,Attempt:0,}" Sep 9 04:50:52.580443 containerd[1540]: time="2025-09-09T04:50:52.580406488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f67bc767b-5nl9j,Uid:08804fed-046c-4958-a0b9-3918de926a18,Namespace:calico-system,Attempt:0,}" Sep 9 04:50:52.589469 containerd[1540]: time="2025-09-09T04:50:52.589409766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dd4d7df4d-jkl4b,Uid:9f302fd2-6080-487a-8cb7-150862e1d68d,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:50:52.595555 containerd[1540]: time="2025-09-09T04:50:52.595519604Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dd4d7df4d-74glh,Uid:b7a374d6-dee0-4466-b5a4-000f07fc11a9,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:50:52.602230 containerd[1540]: time="2025-09-09T04:50:52.602191723Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-jg5js,Uid:7f852350-12c8-443c-9eee-567b69f5268d,Namespace:calico-system,Attempt:0,}" Sep 9 04:50:52.611414 systemd[1]: Created slice kubepods-besteffort-pod48d0235b_b892_4f7a_ad2b_f4f107a0f105.slice - libcontainer container kubepods-besteffort-pod48d0235b_b892_4f7a_ad2b_f4f107a0f105.slice. 
Sep 9 04:50:52.612102 containerd[1540]: time="2025-09-09T04:50:52.612070241Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55885cd57-8mg6f,Uid:3a948436-27af-4a6b-a20a-c855f1c51f9d,Namespace:calico-system,Attempt:0,}" Sep 9 04:50:52.614088 containerd[1540]: time="2025-09-09T04:50:52.614057640Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-drkg7,Uid:48d0235b-b892-4f7a-ad2b-f4f107a0f105,Namespace:calico-system,Attempt:0,}" Sep 9 04:50:52.711452 containerd[1540]: time="2025-09-09T04:50:52.711401019Z" level=error msg="Failed to destroy network for sandbox \"f0e8c1cd67187a481def65f8e871a5ecc0a2fc84e457ef900fcaedc724b31bca\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.714180 containerd[1540]: time="2025-09-09T04:50:52.714138658Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 04:50:52.714789 containerd[1540]: time="2025-09-09T04:50:52.714581218Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qzc8t,Uid:6aa13cc2-beb8-4ea2-9856-e67d80fcabd1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0e8c1cd67187a481def65f8e871a5ecc0a2fc84e457ef900fcaedc724b31bca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.715344 containerd[1540]: time="2025-09-09T04:50:52.715256498Z" level=error msg="Failed to destroy network for sandbox \"12cff08af17a49dade89038ef22f191b53e8ea8efbd5c0f3753893cab3b1c803\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 
04:50:52.716234 containerd[1540]: time="2025-09-09T04:50:52.716199378Z" level=error msg="Failed to destroy network for sandbox \"4c2d364f9a706080754f49c56ac3176c7f38236342179c9eb550a7e23cfa4a08\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.717630 containerd[1540]: time="2025-09-09T04:50:52.717586337Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f67bc767b-5nl9j,Uid:08804fed-046c-4958-a0b9-3918de926a18,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"12cff08af17a49dade89038ef22f191b53e8ea8efbd5c0f3753893cab3b1c803\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.717828 kubelet[2694]: E0909 04:50:52.717791 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12cff08af17a49dade89038ef22f191b53e8ea8efbd5c0f3753893cab3b1c803\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.717886 kubelet[2694]: E0909 04:50:52.717849 2694 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12cff08af17a49dade89038ef22f191b53e8ea8efbd5c0f3753893cab3b1c803\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6f67bc767b-5nl9j" Sep 9 04:50:52.717886 kubelet[2694]: E0909 04:50:52.717871 2694 kuberuntime_manager.go:1252] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12cff08af17a49dade89038ef22f191b53e8ea8efbd5c0f3753893cab3b1c803\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6f67bc767b-5nl9j" Sep 9 04:50:52.718700 kubelet[2694]: E0909 04:50:52.718659 2694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6f67bc767b-5nl9j_calico-system(08804fed-046c-4958-a0b9-3918de926a18)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6f67bc767b-5nl9j_calico-system(08804fed-046c-4958-a0b9-3918de926a18)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"12cff08af17a49dade89038ef22f191b53e8ea8efbd5c0f3753893cab3b1c803\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6f67bc767b-5nl9j" podUID="08804fed-046c-4958-a0b9-3918de926a18" Sep 9 04:50:52.718830 containerd[1540]: time="2025-09-09T04:50:52.718803537Z" level=error msg="Failed to destroy network for sandbox \"616b58cdfe01a162f60cafb5f80e833498220e5f035996bccc8788a4ccd4b345\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.719931 containerd[1540]: time="2025-09-09T04:50:52.719885297Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55885cd57-8mg6f,Uid:3a948436-27af-4a6b-a20a-c855f1c51f9d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"4c2d364f9a706080754f49c56ac3176c7f38236342179c9eb550a7e23cfa4a08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.720045 kubelet[2694]: E0909 04:50:52.720026 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c2d364f9a706080754f49c56ac3176c7f38236342179c9eb550a7e23cfa4a08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.720095 kubelet[2694]: E0909 04:50:52.720059 2694 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c2d364f9a706080754f49c56ac3176c7f38236342179c9eb550a7e23cfa4a08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-55885cd57-8mg6f" Sep 9 04:50:52.720095 kubelet[2694]: E0909 04:50:52.720075 2694 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c2d364f9a706080754f49c56ac3176c7f38236342179c9eb550a7e23cfa4a08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-55885cd57-8mg6f" Sep 9 04:50:52.720156 kubelet[2694]: E0909 04:50:52.720106 2694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-55885cd57-8mg6f_calico-system(3a948436-27af-4a6b-a20a-c855f1c51f9d)\" with CreatePodSandboxError: \"Failed to create 
sandbox for pod \\\"calico-kube-controllers-55885cd57-8mg6f_calico-system(3a948436-27af-4a6b-a20a-c855f1c51f9d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4c2d364f9a706080754f49c56ac3176c7f38236342179c9eb550a7e23cfa4a08\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-55885cd57-8mg6f" podUID="3a948436-27af-4a6b-a20a-c855f1c51f9d" Sep 9 04:50:52.722806 kubelet[2694]: E0909 04:50:52.715333 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0e8c1cd67187a481def65f8e871a5ecc0a2fc84e457ef900fcaedc724b31bca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.722882 kubelet[2694]: E0909 04:50:52.722821 2694 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0e8c1cd67187a481def65f8e871a5ecc0a2fc84e457ef900fcaedc724b31bca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-qzc8t" Sep 9 04:50:52.722882 kubelet[2694]: E0909 04:50:52.722839 2694 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0e8c1cd67187a481def65f8e871a5ecc0a2fc84e457ef900fcaedc724b31bca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-qzc8t" Sep 9 04:50:52.723139 
kubelet[2694]: E0909 04:50:52.723070 2694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-qzc8t_kube-system(6aa13cc2-beb8-4ea2-9856-e67d80fcabd1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-qzc8t_kube-system(6aa13cc2-beb8-4ea2-9856-e67d80fcabd1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f0e8c1cd67187a481def65f8e871a5ecc0a2fc84e457ef900fcaedc724b31bca\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-qzc8t" podUID="6aa13cc2-beb8-4ea2-9856-e67d80fcabd1" Sep 9 04:50:52.723325 containerd[1540]: time="2025-09-09T04:50:52.723294856Z" level=error msg="Failed to destroy network for sandbox \"83169718798c37f381548175c95d588b70a0a44a4343f9acd79d0bb4cc5cf31e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.723774 containerd[1540]: time="2025-09-09T04:50:52.723629656Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dd4d7df4d-jkl4b,Uid:9f302fd2-6080-487a-8cb7-150862e1d68d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"616b58cdfe01a162f60cafb5f80e833498220e5f035996bccc8788a4ccd4b345\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.724839 containerd[1540]: time="2025-09-09T04:50:52.724800136Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xp2ld,Uid:0c5aa6fa-4636-44d9-982d-83b18c1811e9,Namespace:kube-system,Attempt:0,} failed, 
error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"83169718798c37f381548175c95d588b70a0a44a4343f9acd79d0bb4cc5cf31e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.724994 kubelet[2694]: E0909 04:50:52.724963 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83169718798c37f381548175c95d588b70a0a44a4343f9acd79d0bb4cc5cf31e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.725045 kubelet[2694]: E0909 04:50:52.725005 2694 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83169718798c37f381548175c95d588b70a0a44a4343f9acd79d0bb4cc5cf31e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-xp2ld" Sep 9 04:50:52.725045 kubelet[2694]: E0909 04:50:52.725023 2694 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83169718798c37f381548175c95d588b70a0a44a4343f9acd79d0bb4cc5cf31e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-xp2ld" Sep 9 04:50:52.725125 kubelet[2694]: E0909 04:50:52.725057 2694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-xp2ld_kube-system(0c5aa6fa-4636-44d9-982d-83b18c1811e9)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-xp2ld_kube-system(0c5aa6fa-4636-44d9-982d-83b18c1811e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"83169718798c37f381548175c95d588b70a0a44a4343f9acd79d0bb4cc5cf31e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-xp2ld" podUID="0c5aa6fa-4636-44d9-982d-83b18c1811e9" Sep 9 04:50:52.725815 kubelet[2694]: E0909 04:50:52.724694 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"616b58cdfe01a162f60cafb5f80e833498220e5f035996bccc8788a4ccd4b345\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.725880 kubelet[2694]: E0909 04:50:52.725825 2694 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"616b58cdfe01a162f60cafb5f80e833498220e5f035996bccc8788a4ccd4b345\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5dd4d7df4d-jkl4b" Sep 9 04:50:52.725880 kubelet[2694]: E0909 04:50:52.725855 2694 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"616b58cdfe01a162f60cafb5f80e833498220e5f035996bccc8788a4ccd4b345\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-5dd4d7df4d-jkl4b" Sep 9 04:50:52.725931 kubelet[2694]: E0909 04:50:52.725895 2694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5dd4d7df4d-jkl4b_calico-apiserver(9f302fd2-6080-487a-8cb7-150862e1d68d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5dd4d7df4d-jkl4b_calico-apiserver(9f302fd2-6080-487a-8cb7-150862e1d68d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"616b58cdfe01a162f60cafb5f80e833498220e5f035996bccc8788a4ccd4b345\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5dd4d7df4d-jkl4b" podUID="9f302fd2-6080-487a-8cb7-150862e1d68d" Sep 9 04:50:52.734839 containerd[1540]: time="2025-09-09T04:50:52.734495574Z" level=error msg="Failed to destroy network for sandbox \"82360b8b50126ad71eeaa8f0d47f0a25bb4d832d78593dce03d3c61e129185c2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.736410 containerd[1540]: time="2025-09-09T04:50:52.736373533Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dd4d7df4d-74glh,Uid:b7a374d6-dee0-4466-b5a4-000f07fc11a9,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"82360b8b50126ad71eeaa8f0d47f0a25bb4d832d78593dce03d3c61e129185c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.736646 containerd[1540]: time="2025-09-09T04:50:52.736449613Z" level=error msg="Failed to destroy network for 
sandbox \"ac63b3ce11c991021eea44703a003ff165d586ea28ebea90e8c4498b10835577\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.737234 kubelet[2694]: E0909 04:50:52.736734 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"82360b8b50126ad71eeaa8f0d47f0a25bb4d832d78593dce03d3c61e129185c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.737300 kubelet[2694]: E0909 04:50:52.737273 2694 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"82360b8b50126ad71eeaa8f0d47f0a25bb4d832d78593dce03d3c61e129185c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5dd4d7df4d-74glh" Sep 9 04:50:52.737324 kubelet[2694]: E0909 04:50:52.737298 2694 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"82360b8b50126ad71eeaa8f0d47f0a25bb4d832d78593dce03d3c61e129185c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5dd4d7df4d-74glh" Sep 9 04:50:52.737500 kubelet[2694]: E0909 04:50:52.737430 2694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5dd4d7df4d-74glh_calico-apiserver(b7a374d6-dee0-4466-b5a4-000f07fc11a9)\" with CreatePodSandboxError: \"Failed 
to create sandbox for pod \\\"calico-apiserver-5dd4d7df4d-74glh_calico-apiserver(b7a374d6-dee0-4466-b5a4-000f07fc11a9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"82360b8b50126ad71eeaa8f0d47f0a25bb4d832d78593dce03d3c61e129185c2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5dd4d7df4d-74glh" podUID="b7a374d6-dee0-4466-b5a4-000f07fc11a9" Sep 9 04:50:52.738409 containerd[1540]: time="2025-09-09T04:50:52.738354293Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-jg5js,Uid:7f852350-12c8-443c-9eee-567b69f5268d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac63b3ce11c991021eea44703a003ff165d586ea28ebea90e8c4498b10835577\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.738944 kubelet[2694]: E0909 04:50:52.738909 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac63b3ce11c991021eea44703a003ff165d586ea28ebea90e8c4498b10835577\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.738993 kubelet[2694]: E0909 04:50:52.738958 2694 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac63b3ce11c991021eea44703a003ff165d586ea28ebea90e8c4498b10835577\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-jg5js" Sep 9 04:50:52.738993 kubelet[2694]: E0909 04:50:52.738975 2694 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac63b3ce11c991021eea44703a003ff165d586ea28ebea90e8c4498b10835577\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-jg5js" Sep 9 04:50:52.739049 kubelet[2694]: E0909 04:50:52.739012 2694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-jg5js_calico-system(7f852350-12c8-443c-9eee-567b69f5268d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-jg5js_calico-system(7f852350-12c8-443c-9eee-567b69f5268d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ac63b3ce11c991021eea44703a003ff165d586ea28ebea90e8c4498b10835577\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-jg5js" podUID="7f852350-12c8-443c-9eee-567b69f5268d" Sep 9 04:50:52.744447 containerd[1540]: time="2025-09-09T04:50:52.744417891Z" level=error msg="Failed to destroy network for sandbox \"14522dd66a8366e2c059aacb36a639ddd77c7b6a122ef0773fc6daf1a05d7051\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.745410 containerd[1540]: time="2025-09-09T04:50:52.745344931Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-drkg7,Uid:48d0235b-b892-4f7a-ad2b-f4f107a0f105,Namespace:calico-system,Attempt:0,} 
failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"14522dd66a8366e2c059aacb36a639ddd77c7b6a122ef0773fc6daf1a05d7051\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.745636 kubelet[2694]: E0909 04:50:52.745602 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14522dd66a8366e2c059aacb36a639ddd77c7b6a122ef0773fc6daf1a05d7051\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 04:50:52.745680 kubelet[2694]: E0909 04:50:52.745667 2694 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14522dd66a8366e2c059aacb36a639ddd77c7b6a122ef0773fc6daf1a05d7051\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-drkg7" Sep 9 04:50:52.745711 kubelet[2694]: E0909 04:50:52.745684 2694 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14522dd66a8366e2c059aacb36a639ddd77c7b6a122ef0773fc6daf1a05d7051\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-drkg7" Sep 9 04:50:52.745737 kubelet[2694]: E0909 04:50:52.745714 2694 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-drkg7_calico-system(48d0235b-b892-4f7a-ad2b-f4f107a0f105)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-drkg7_calico-system(48d0235b-b892-4f7a-ad2b-f4f107a0f105)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"14522dd66a8366e2c059aacb36a639ddd77c7b6a122ef0773fc6daf1a05d7051\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-drkg7" podUID="48d0235b-b892-4f7a-ad2b-f4f107a0f105" Sep 9 04:50:53.445210 systemd[1]: run-netns-cni\x2d75821463\x2d17de\x2db02b\x2d2f30\x2d1b6cd363f7c0.mount: Deactivated successfully. Sep 9 04:50:53.445331 systemd[1]: run-netns-cni\x2d1fc8bbea\x2d98d4\x2d92d3\x2d6119\x2ddcf537fee4e7.mount: Deactivated successfully. Sep 9 04:50:53.445377 systemd[1]: run-netns-cni\x2de0d4650c\x2d6079\x2d443e\x2d67e3\x2d5c9b8c9fb9ce.mount: Deactivated successfully. Sep 9 04:50:53.445417 systemd[1]: run-netns-cni\x2db035fcc8\x2d1af3\x2d1709\x2d10da\x2d24a1b13eed14.mount: Deactivated successfully. Sep 9 04:50:56.612300 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2074490861.mount: Deactivated successfully. 
Sep 9 04:50:56.856152 containerd[1540]: time="2025-09-09T04:50:56.856017133Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 9 04:50:56.859275 containerd[1540]: time="2025-09-09T04:50:56.859180133Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:50:56.859531 containerd[1540]: time="2025-09-09T04:50:56.859387013Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.145063995s" Sep 9 04:50:56.859531 containerd[1540]: time="2025-09-09T04:50:56.859426653Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 9 04:50:56.860335 containerd[1540]: time="2025-09-09T04:50:56.859772492Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:50:56.860335 containerd[1540]: time="2025-09-09T04:50:56.860271932Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:50:56.884846 containerd[1540]: time="2025-09-09T04:50:56.883984048Z" level=info msg="CreateContainer within sandbox \"119af681a26c2b456f2ae25e252322b68d6705efc6bb950d382b505f46f018ee\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 04:50:56.891698 containerd[1540]: time="2025-09-09T04:50:56.891644886Z" level=info msg="Container 
69b807bee31b1a67745fbeb46d3195c81686a14b773ccd204ca366b067fea8de: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:50:56.900265 containerd[1540]: time="2025-09-09T04:50:56.900198045Z" level=info msg="CreateContainer within sandbox \"119af681a26c2b456f2ae25e252322b68d6705efc6bb950d382b505f46f018ee\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"69b807bee31b1a67745fbeb46d3195c81686a14b773ccd204ca366b067fea8de\"" Sep 9 04:50:56.900986 containerd[1540]: time="2025-09-09T04:50:56.900846325Z" level=info msg="StartContainer for \"69b807bee31b1a67745fbeb46d3195c81686a14b773ccd204ca366b067fea8de\"" Sep 9 04:50:56.902567 containerd[1540]: time="2025-09-09T04:50:56.902518764Z" level=info msg="connecting to shim 69b807bee31b1a67745fbeb46d3195c81686a14b773ccd204ca366b067fea8de" address="unix:///run/containerd/s/a0004ec757c5328a43617b2cc3b298663f09f714452b133218380a2765d8d46a" protocol=ttrpc version=3 Sep 9 04:50:56.949396 systemd[1]: Started cri-containerd-69b807bee31b1a67745fbeb46d3195c81686a14b773ccd204ca366b067fea8de.scope - libcontainer container 69b807bee31b1a67745fbeb46d3195c81686a14b773ccd204ca366b067fea8de. Sep 9 04:50:56.992372 containerd[1540]: time="2025-09-09T04:50:56.992324307Z" level=info msg="StartContainer for \"69b807bee31b1a67745fbeb46d3195c81686a14b773ccd204ca366b067fea8de\" returns successfully" Sep 9 04:50:57.112000 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 04:50:57.112108 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 04:50:57.345390 kubelet[2694]: I0909 04:50:57.345343 2694 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqj96\" (UniqueName: \"kubernetes.io/projected/08804fed-046c-4958-a0b9-3918de926a18-kube-api-access-vqj96\") pod \"08804fed-046c-4958-a0b9-3918de926a18\" (UID: \"08804fed-046c-4958-a0b9-3918de926a18\") " Sep 9 04:50:57.345390 kubelet[2694]: I0909 04:50:57.345390 2694 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08804fed-046c-4958-a0b9-3918de926a18-whisker-ca-bundle\") pod \"08804fed-046c-4958-a0b9-3918de926a18\" (UID: \"08804fed-046c-4958-a0b9-3918de926a18\") " Sep 9 04:50:57.345821 kubelet[2694]: I0909 04:50:57.345419 2694 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/08804fed-046c-4958-a0b9-3918de926a18-whisker-backend-key-pair\") pod \"08804fed-046c-4958-a0b9-3918de926a18\" (UID: \"08804fed-046c-4958-a0b9-3918de926a18\") " Sep 9 04:50:57.363064 kubelet[2694]: I0909 04:50:57.363017 2694 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08804fed-046c-4958-a0b9-3918de926a18-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "08804fed-046c-4958-a0b9-3918de926a18" (UID: "08804fed-046c-4958-a0b9-3918de926a18"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 04:50:57.363064 kubelet[2694]: I0909 04:50:57.363046 2694 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08804fed-046c-4958-a0b9-3918de926a18-kube-api-access-vqj96" (OuterVolumeSpecName: "kube-api-access-vqj96") pod "08804fed-046c-4958-a0b9-3918de926a18" (UID: "08804fed-046c-4958-a0b9-3918de926a18"). InnerVolumeSpecName "kube-api-access-vqj96". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 04:50:57.363203 kubelet[2694]: I0909 04:50:57.363018 2694 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08804fed-046c-4958-a0b9-3918de926a18-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "08804fed-046c-4958-a0b9-3918de926a18" (UID: "08804fed-046c-4958-a0b9-3918de926a18"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 04:50:57.446037 kubelet[2694]: I0909 04:50:57.445975 2694 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08804fed-046c-4958-a0b9-3918de926a18-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 9 04:50:57.446037 kubelet[2694]: I0909 04:50:57.446015 2694 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/08804fed-046c-4958-a0b9-3918de926a18-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 9 04:50:57.446037 kubelet[2694]: I0909 04:50:57.446036 2694 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vqj96\" (UniqueName: \"kubernetes.io/projected/08804fed-046c-4958-a0b9-3918de926a18-kube-api-access-vqj96\") on node \"localhost\" DevicePath \"\"" Sep 9 04:50:57.613143 systemd[1]: var-lib-kubelet-pods-08804fed\x2d046c\x2d4958\x2da0b9\x2d3918de926a18-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dvqj96.mount: Deactivated successfully. Sep 9 04:50:57.613235 systemd[1]: var-lib-kubelet-pods-08804fed\x2d046c\x2d4958\x2da0b9\x2d3918de926a18-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 04:50:57.615890 systemd[1]: Removed slice kubepods-besteffort-pod08804fed_046c_4958_a0b9_3918de926a18.slice - libcontainer container kubepods-besteffort-pod08804fed_046c_4958_a0b9_3918de926a18.slice. 
Sep 9 04:50:57.773545 kubelet[2694]: I0909 04:50:57.772507 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-jkbsz" podStartSLOduration=1.7498394149999998 podStartE2EDuration="13.772491561s" podCreationTimestamp="2025-09-09 04:50:44 +0000 UTC" firstStartedPulling="2025-09-09 04:50:44.838207226 +0000 UTC m=+21.330483347" lastFinishedPulling="2025-09-09 04:50:56.860859372 +0000 UTC m=+33.353135493" observedRunningTime="2025-09-09 04:50:57.771664961 +0000 UTC m=+34.263941082" watchObservedRunningTime="2025-09-09 04:50:57.772491561 +0000 UTC m=+34.264767642" Sep 9 04:50:57.827140 systemd[1]: Created slice kubepods-besteffort-pod7840e2cc_fad1_43e8_8a5d_8db822679105.slice - libcontainer container kubepods-besteffort-pod7840e2cc_fad1_43e8_8a5d_8db822679105.slice. Sep 9 04:50:57.848622 kubelet[2694]: I0909 04:50:57.848567 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7840e2cc-fad1-43e8-8a5d-8db822679105-whisker-backend-key-pair\") pod \"whisker-5556ff9c67-2b5gl\" (UID: \"7840e2cc-fad1-43e8-8a5d-8db822679105\") " pod="calico-system/whisker-5556ff9c67-2b5gl" Sep 9 04:50:57.848622 kubelet[2694]: I0909 04:50:57.848621 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7840e2cc-fad1-43e8-8a5d-8db822679105-whisker-ca-bundle\") pod \"whisker-5556ff9c67-2b5gl\" (UID: \"7840e2cc-fad1-43e8-8a5d-8db822679105\") " pod="calico-system/whisker-5556ff9c67-2b5gl" Sep 9 04:50:57.848776 kubelet[2694]: I0909 04:50:57.848640 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbrf2\" (UniqueName: \"kubernetes.io/projected/7840e2cc-fad1-43e8-8a5d-8db822679105-kube-api-access-wbrf2\") pod \"whisker-5556ff9c67-2b5gl\" (UID: 
\"7840e2cc-fad1-43e8-8a5d-8db822679105\") " pod="calico-system/whisker-5556ff9c67-2b5gl" Sep 9 04:50:57.914407 containerd[1540]: time="2025-09-09T04:50:57.914267855Z" level=info msg="TaskExit event in podsandbox handler container_id:\"69b807bee31b1a67745fbeb46d3195c81686a14b773ccd204ca366b067fea8de\" id:\"c0560665718dc91c0eee3527d77c404cb4151126525a5897824c536557db39b7\" pid:3871 exit_status:1 exited_at:{seconds:1757393457 nanos:913902895}" Sep 9 04:50:58.131630 containerd[1540]: time="2025-09-09T04:50:58.131579895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5556ff9c67-2b5gl,Uid:7840e2cc-fad1-43e8-8a5d-8db822679105,Namespace:calico-system,Attempt:0,}" Sep 9 04:50:58.281485 systemd-networkd[1444]: cali1fd97db9a25: Link UP Sep 9 04:50:58.281658 systemd-networkd[1444]: cali1fd97db9a25: Gained carrier Sep 9 04:50:58.293164 containerd[1540]: 2025-09-09 04:50:58.152 [INFO][3887] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 04:50:58.293164 containerd[1540]: 2025-09-09 04:50:58.182 [INFO][3887] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--5556ff9c67--2b5gl-eth0 whisker-5556ff9c67- calico-system 7840e2cc-fad1-43e8-8a5d-8db822679105 872 0 2025-09-09 04:50:57 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5556ff9c67 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-5556ff9c67-2b5gl eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali1fd97db9a25 [] [] }} ContainerID="f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" Namespace="calico-system" Pod="whisker-5556ff9c67-2b5gl" WorkloadEndpoint="localhost-k8s-whisker--5556ff9c67--2b5gl-" Sep 9 04:50:58.293164 containerd[1540]: 2025-09-09 04:50:58.182 [INFO][3887] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" Namespace="calico-system" Pod="whisker-5556ff9c67-2b5gl" WorkloadEndpoint="localhost-k8s-whisker--5556ff9c67--2b5gl-eth0" Sep 9 04:50:58.293164 containerd[1540]: 2025-09-09 04:50:58.240 [INFO][3900] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" HandleID="k8s-pod-network.f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" Workload="localhost-k8s-whisker--5556ff9c67--2b5gl-eth0" Sep 9 04:50:58.293545 containerd[1540]: 2025-09-09 04:50:58.240 [INFO][3900] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" HandleID="k8s-pod-network.f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" Workload="localhost-k8s-whisker--5556ff9c67--2b5gl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004dbba0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-5556ff9c67-2b5gl", "timestamp":"2025-09-09 04:50:58.240033635 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:50:58.293545 containerd[1540]: 2025-09-09 04:50:58.240 [INFO][3900] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:50:58.293545 containerd[1540]: 2025-09-09 04:50:58.240 [INFO][3900] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:50:58.293545 containerd[1540]: 2025-09-09 04:50:58.240 [INFO][3900] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:50:58.293545 containerd[1540]: 2025-09-09 04:50:58.250 [INFO][3900] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" host="localhost" Sep 9 04:50:58.293545 containerd[1540]: 2025-09-09 04:50:58.255 [INFO][3900] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:50:58.293545 containerd[1540]: 2025-09-09 04:50:58.259 [INFO][3900] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:50:58.293545 containerd[1540]: 2025-09-09 04:50:58.261 [INFO][3900] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:50:58.293545 containerd[1540]: 2025-09-09 04:50:58.263 [INFO][3900] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:50:58.293545 containerd[1540]: 2025-09-09 04:50:58.263 [INFO][3900] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" host="localhost" Sep 9 04:50:58.293804 containerd[1540]: 2025-09-09 04:50:58.264 [INFO][3900] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5 Sep 9 04:50:58.293804 containerd[1540]: 2025-09-09 04:50:58.267 [INFO][3900] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" host="localhost" Sep 9 04:50:58.293804 containerd[1540]: 2025-09-09 04:50:58.272 [INFO][3900] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" host="localhost" Sep 9 04:50:58.293804 containerd[1540]: 2025-09-09 04:50:58.272 [INFO][3900] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" host="localhost" Sep 9 04:50:58.293804 containerd[1540]: 2025-09-09 04:50:58.272 [INFO][3900] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:50:58.293804 containerd[1540]: 2025-09-09 04:50:58.272 [INFO][3900] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" HandleID="k8s-pod-network.f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" Workload="localhost-k8s-whisker--5556ff9c67--2b5gl-eth0" Sep 9 04:50:58.293947 containerd[1540]: 2025-09-09 04:50:58.275 [INFO][3887] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" Namespace="calico-system" Pod="whisker-5556ff9c67-2b5gl" WorkloadEndpoint="localhost-k8s-whisker--5556ff9c67--2b5gl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5556ff9c67--2b5gl-eth0", GenerateName:"whisker-5556ff9c67-", Namespace:"calico-system", SelfLink:"", UID:"7840e2cc-fad1-43e8-8a5d-8db822679105", ResourceVersion:"872", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5556ff9c67", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-5556ff9c67-2b5gl", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1fd97db9a25", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:50:58.293947 containerd[1540]: 2025-09-09 04:50:58.275 [INFO][3887] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" Namespace="calico-system" Pod="whisker-5556ff9c67-2b5gl" WorkloadEndpoint="localhost-k8s-whisker--5556ff9c67--2b5gl-eth0" Sep 9 04:50:58.294031 containerd[1540]: 2025-09-09 04:50:58.276 [INFO][3887] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1fd97db9a25 ContainerID="f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" Namespace="calico-system" Pod="whisker-5556ff9c67-2b5gl" WorkloadEndpoint="localhost-k8s-whisker--5556ff9c67--2b5gl-eth0" Sep 9 04:50:58.294031 containerd[1540]: 2025-09-09 04:50:58.281 [INFO][3887] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" Namespace="calico-system" Pod="whisker-5556ff9c67-2b5gl" WorkloadEndpoint="localhost-k8s-whisker--5556ff9c67--2b5gl-eth0" Sep 9 04:50:58.294070 containerd[1540]: 2025-09-09 04:50:58.282 [INFO][3887] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" Namespace="calico-system" Pod="whisker-5556ff9c67-2b5gl" 
WorkloadEndpoint="localhost-k8s-whisker--5556ff9c67--2b5gl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5556ff9c67--2b5gl-eth0", GenerateName:"whisker-5556ff9c67-", Namespace:"calico-system", SelfLink:"", UID:"7840e2cc-fad1-43e8-8a5d-8db822679105", ResourceVersion:"872", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5556ff9c67", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5", Pod:"whisker-5556ff9c67-2b5gl", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1fd97db9a25", MAC:"06:9c:76:ac:99:21", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:50:58.294142 containerd[1540]: 2025-09-09 04:50:58.289 [INFO][3887] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" Namespace="calico-system" Pod="whisker-5556ff9c67-2b5gl" WorkloadEndpoint="localhost-k8s-whisker--5556ff9c67--2b5gl-eth0" Sep 9 04:50:58.338929 containerd[1540]: time="2025-09-09T04:50:58.338886017Z" level=info msg="connecting to shim 
f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5" address="unix:///run/containerd/s/2670d7051d4a07463ede5f9098c0edb0df1060fb8a4c5c42482e203ec6a1f301" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:50:58.373421 systemd[1]: Started cri-containerd-f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5.scope - libcontainer container f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5. Sep 9 04:50:58.398877 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:50:58.465267 containerd[1540]: time="2025-09-09T04:50:58.465164554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5556ff9c67-2b5gl,Uid:7840e2cc-fad1-43e8-8a5d-8db822679105,Namespace:calico-system,Attempt:0,} returns sandbox id \"f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5\"" Sep 9 04:50:58.467279 containerd[1540]: time="2025-09-09T04:50:58.467182394Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 04:50:58.774577 systemd-networkd[1444]: vxlan.calico: Link UP Sep 9 04:50:58.774583 systemd-networkd[1444]: vxlan.calico: Gained carrier Sep 9 04:50:58.857809 containerd[1540]: time="2025-09-09T04:50:58.857766003Z" level=info msg="TaskExit event in podsandbox handler container_id:\"69b807bee31b1a67745fbeb46d3195c81686a14b773ccd204ca366b067fea8de\" id:\"05e0bf99127e05dce85ba01c6dc0acbff8b677d9fec3efb901b2f95c5e492d2e\" pid:4123 exit_status:1 exited_at:{seconds:1757393458 nanos:853275604}" Sep 9 04:50:59.606676 kubelet[2694]: I0909 04:50:59.606632 2694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08804fed-046c-4958-a0b9-3918de926a18" path="/var/lib/kubelet/pods/08804fed-046c-4958-a0b9-3918de926a18/volumes" Sep 9 04:50:59.729989 containerd[1540]: time="2025-09-09T04:50:59.729944569Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Sep 9 04:50:59.731183 containerd[1540]: time="2025-09-09T04:50:59.731137729Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 9 04:50:59.732506 containerd[1540]: time="2025-09-09T04:50:59.732126529Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:50:59.735603 containerd[1540]: time="2025-09-09T04:50:59.735570608Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:50:59.736274 containerd[1540]: time="2025-09-09T04:50:59.736233528Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.269012814s" Sep 9 04:50:59.736314 containerd[1540]: time="2025-09-09T04:50:59.736277288Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 9 04:50:59.740998 containerd[1540]: time="2025-09-09T04:50:59.740950647Z" level=info msg="CreateContainer within sandbox \"f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 04:50:59.749190 containerd[1540]: time="2025-09-09T04:50:59.748440926Z" level=info msg="Container d3d9fe56d2dcb427de6222e0bda459a6a325e86add9bf3c7deb06ff8688722db: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:50:59.751510 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount1345483081.mount: Deactivated successfully. Sep 9 04:50:59.761203 containerd[1540]: time="2025-09-09T04:50:59.761159163Z" level=info msg="CreateContainer within sandbox \"f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"d3d9fe56d2dcb427de6222e0bda459a6a325e86add9bf3c7deb06ff8688722db\"" Sep 9 04:50:59.762300 containerd[1540]: time="2025-09-09T04:50:59.761546483Z" level=info msg="StartContainer for \"d3d9fe56d2dcb427de6222e0bda459a6a325e86add9bf3c7deb06ff8688722db\"" Sep 9 04:50:59.764064 containerd[1540]: time="2025-09-09T04:50:59.764015123Z" level=info msg="connecting to shim d3d9fe56d2dcb427de6222e0bda459a6a325e86add9bf3c7deb06ff8688722db" address="unix:///run/containerd/s/2670d7051d4a07463ede5f9098c0edb0df1060fb8a4c5c42482e203ec6a1f301" protocol=ttrpc version=3 Sep 9 04:50:59.789222 systemd[1]: Started cri-containerd-d3d9fe56d2dcb427de6222e0bda459a6a325e86add9bf3c7deb06ff8688722db.scope - libcontainer container d3d9fe56d2dcb427de6222e0bda459a6a325e86add9bf3c7deb06ff8688722db. 
Sep 9 04:50:59.832736 containerd[1540]: time="2025-09-09T04:50:59.832696791Z" level=info msg="StartContainer for \"d3d9fe56d2dcb427de6222e0bda459a6a325e86add9bf3c7deb06ff8688722db\" returns successfully" Sep 9 04:50:59.833913 containerd[1540]: time="2025-09-09T04:50:59.833892591Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 04:50:59.890600 containerd[1540]: time="2025-09-09T04:50:59.890496421Z" level=info msg="TaskExit event in podsandbox handler container_id:\"69b807bee31b1a67745fbeb46d3195c81686a14b773ccd204ca366b067fea8de\" id:\"db2d2c6da5a3f19fddb1383c2c52ce9e12a517ac6e660ecfbb5be424d4587591\" pid:4206 exit_status:1 exited_at:{seconds:1757393459 nanos:890172301}" Sep 9 04:51:00.039387 systemd-networkd[1444]: cali1fd97db9a25: Gained IPv6LL Sep 9 04:51:00.039662 systemd-networkd[1444]: vxlan.calico: Gained IPv6LL Sep 9 04:51:01.392327 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1123331887.mount: Deactivated successfully. Sep 9 04:51:01.407328 containerd[1540]: time="2025-09-09T04:51:01.407279002Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:01.408326 containerd[1540]: time="2025-09-09T04:51:01.408296842Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 9 04:51:01.409203 containerd[1540]: time="2025-09-09T04:51:01.409165202Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:01.412296 containerd[1540]: time="2025-09-09T04:51:01.411383122Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:01.412296 
containerd[1540]: time="2025-09-09T04:51:01.412207001Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.57815329s" Sep 9 04:51:01.412392 containerd[1540]: time="2025-09-09T04:51:01.412304121Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 9 04:51:01.418036 containerd[1540]: time="2025-09-09T04:51:01.417992041Z" level=info msg="CreateContainer within sandbox \"f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 04:51:01.447570 containerd[1540]: time="2025-09-09T04:51:01.424164119Z" level=info msg="Container c026148a489ef7254374110c0b68325b0871fc37996b3ed10ea323a9d7c14428: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:01.454067 containerd[1540]: time="2025-09-09T04:51:01.454013555Z" level=info msg="CreateContainer within sandbox \"f106df72624ae5572b573ff5a2fd3110b4fed5b5b7148821b615a250471ae9f5\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"c026148a489ef7254374110c0b68325b0871fc37996b3ed10ea323a9d7c14428\"" Sep 9 04:51:01.454574 containerd[1540]: time="2025-09-09T04:51:01.454547154Z" level=info msg="StartContainer for \"c026148a489ef7254374110c0b68325b0871fc37996b3ed10ea323a9d7c14428\"" Sep 9 04:51:01.457480 containerd[1540]: time="2025-09-09T04:51:01.457446794Z" level=info msg="connecting to shim c026148a489ef7254374110c0b68325b0871fc37996b3ed10ea323a9d7c14428" address="unix:///run/containerd/s/2670d7051d4a07463ede5f9098c0edb0df1060fb8a4c5c42482e203ec6a1f301" protocol=ttrpc 
version=3 Sep 9 04:51:01.483442 systemd[1]: Started cri-containerd-c026148a489ef7254374110c0b68325b0871fc37996b3ed10ea323a9d7c14428.scope - libcontainer container c026148a489ef7254374110c0b68325b0871fc37996b3ed10ea323a9d7c14428. Sep 9 04:51:01.527411 containerd[1540]: time="2025-09-09T04:51:01.527375102Z" level=info msg="StartContainer for \"c026148a489ef7254374110c0b68325b0871fc37996b3ed10ea323a9d7c14428\" returns successfully" Sep 9 04:51:01.782081 kubelet[2694]: I0909 04:51:01.781968 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5556ff9c67-2b5gl" podStartSLOduration=1.834732933 podStartE2EDuration="4.78195186s" podCreationTimestamp="2025-09-09 04:50:57 +0000 UTC" firstStartedPulling="2025-09-09 04:50:58.466816394 +0000 UTC m=+34.959092475" lastFinishedPulling="2025-09-09 04:51:01.414035281 +0000 UTC m=+37.906311402" observedRunningTime="2025-09-09 04:51:01.78049718 +0000 UTC m=+38.272773301" watchObservedRunningTime="2025-09-09 04:51:01.78195186 +0000 UTC m=+38.274227981" Sep 9 04:51:03.607139 containerd[1540]: time="2025-09-09T04:51:03.607084485Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qzc8t,Uid:6aa13cc2-beb8-4ea2-9856-e67d80fcabd1,Namespace:kube-system,Attempt:0,}" Sep 9 04:51:03.609410 containerd[1540]: time="2025-09-09T04:51:03.607537885Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-drkg7,Uid:48d0235b-b892-4f7a-ad2b-f4f107a0f105,Namespace:calico-system,Attempt:0,}" Sep 9 04:51:03.756348 systemd-networkd[1444]: cali60454563148: Link UP Sep 9 04:51:03.756537 systemd-networkd[1444]: cali60454563148: Gained carrier Sep 9 04:51:03.774868 containerd[1540]: 2025-09-09 04:51:03.667 [INFO][4308] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--drkg7-eth0 csi-node-driver- calico-system 48d0235b-b892-4f7a-ad2b-f4f107a0f105 679 0 2025-09-09 04:50:44 +0000 UTC 
map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-drkg7 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali60454563148 [] [] }} ContainerID="72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" Namespace="calico-system" Pod="csi-node-driver-drkg7" WorkloadEndpoint="localhost-k8s-csi--node--driver--drkg7-" Sep 9 04:51:03.774868 containerd[1540]: 2025-09-09 04:51:03.667 [INFO][4308] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" Namespace="calico-system" Pod="csi-node-driver-drkg7" WorkloadEndpoint="localhost-k8s-csi--node--driver--drkg7-eth0" Sep 9 04:51:03.774868 containerd[1540]: 2025-09-09 04:51:03.701 [INFO][4327] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" HandleID="k8s-pod-network.72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" Workload="localhost-k8s-csi--node--driver--drkg7-eth0" Sep 9 04:51:03.775076 containerd[1540]: 2025-09-09 04:51:03.702 [INFO][4327] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" HandleID="k8s-pod-network.72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" Workload="localhost-k8s-csi--node--driver--drkg7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400050ea60), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-drkg7", "timestamp":"2025-09-09 04:51:03.70186271 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:03.775076 containerd[1540]: 2025-09-09 04:51:03.702 [INFO][4327] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:03.775076 containerd[1540]: 2025-09-09 04:51:03.702 [INFO][4327] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:51:03.775076 containerd[1540]: 2025-09-09 04:51:03.702 [INFO][4327] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:03.775076 containerd[1540]: 2025-09-09 04:51:03.719 [INFO][4327] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" host="localhost" Sep 9 04:51:03.775076 containerd[1540]: 2025-09-09 04:51:03.727 [INFO][4327] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:03.775076 containerd[1540]: 2025-09-09 04:51:03.732 [INFO][4327] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:03.775076 containerd[1540]: 2025-09-09 04:51:03.736 [INFO][4327] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:03.775076 containerd[1540]: 2025-09-09 04:51:03.738 [INFO][4327] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:03.775076 containerd[1540]: 2025-09-09 04:51:03.738 [INFO][4327] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" host="localhost" Sep 9 04:51:03.775308 containerd[1540]: 2025-09-09 04:51:03.740 [INFO][4327] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96 Sep 9 04:51:03.775308 containerd[1540]: 2025-09-09 04:51:03.744 
[INFO][4327] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" host="localhost" Sep 9 04:51:03.775308 containerd[1540]: 2025-09-09 04:51:03.749 [INFO][4327] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" host="localhost" Sep 9 04:51:03.775308 containerd[1540]: 2025-09-09 04:51:03.749 [INFO][4327] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" host="localhost" Sep 9 04:51:03.775308 containerd[1540]: 2025-09-09 04:51:03.750 [INFO][4327] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:51:03.775308 containerd[1540]: 2025-09-09 04:51:03.750 [INFO][4327] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" HandleID="k8s-pod-network.72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" Workload="localhost-k8s-csi--node--driver--drkg7-eth0" Sep 9 04:51:03.775481 containerd[1540]: 2025-09-09 04:51:03.753 [INFO][4308] cni-plugin/k8s.go 418: Populated endpoint ContainerID="72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" Namespace="calico-system" Pod="csi-node-driver-drkg7" WorkloadEndpoint="localhost-k8s-csi--node--driver--drkg7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--drkg7-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"48d0235b-b892-4f7a-ad2b-f4f107a0f105", ResourceVersion:"679", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 44, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-drkg7", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali60454563148", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:03.775534 containerd[1540]: 2025-09-09 04:51:03.753 [INFO][4308] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" Namespace="calico-system" Pod="csi-node-driver-drkg7" WorkloadEndpoint="localhost-k8s-csi--node--driver--drkg7-eth0" Sep 9 04:51:03.775534 containerd[1540]: 2025-09-09 04:51:03.753 [INFO][4308] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60454563148 ContainerID="72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" Namespace="calico-system" Pod="csi-node-driver-drkg7" WorkloadEndpoint="localhost-k8s-csi--node--driver--drkg7-eth0" Sep 9 04:51:03.775534 containerd[1540]: 2025-09-09 04:51:03.756 [INFO][4308] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" 
Namespace="calico-system" Pod="csi-node-driver-drkg7" WorkloadEndpoint="localhost-k8s-csi--node--driver--drkg7-eth0" Sep 9 04:51:03.775591 containerd[1540]: 2025-09-09 04:51:03.757 [INFO][4308] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" Namespace="calico-system" Pod="csi-node-driver-drkg7" WorkloadEndpoint="localhost-k8s-csi--node--driver--drkg7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--drkg7-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"48d0235b-b892-4f7a-ad2b-f4f107a0f105", ResourceVersion:"679", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96", Pod:"csi-node-driver-drkg7", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali60454563148", MAC:"62:65:d6:9a:a8:4a", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:03.775638 containerd[1540]: 2025-09-09 04:51:03.772 [INFO][4308] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" Namespace="calico-system" Pod="csi-node-driver-drkg7" WorkloadEndpoint="localhost-k8s-csi--node--driver--drkg7-eth0" Sep 9 04:51:03.810563 containerd[1540]: time="2025-09-09T04:51:03.810471773Z" level=info msg="connecting to shim 72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96" address="unix:///run/containerd/s/2d7e455c71373f483f4ac42d18e8190c637e6432506c6da1d0c09dc5228d8934" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:03.839488 systemd[1]: Started cri-containerd-72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96.scope - libcontainer container 72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96. Sep 9 04:51:03.856608 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:03.862424 systemd-networkd[1444]: calib2e9f6f4174: Link UP Sep 9 04:51:03.863648 systemd-networkd[1444]: calib2e9f6f4174: Gained carrier Sep 9 04:51:03.879493 containerd[1540]: 2025-09-09 04:51:03.666 [INFO][4298] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0 coredns-674b8bbfcf- kube-system 6aa13cc2-beb8-4ea2-9856-e67d80fcabd1 808 0 2025-09-09 04:50:31 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-qzc8t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib2e9f6f4174 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" Namespace="kube-system" Pod="coredns-674b8bbfcf-qzc8t" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qzc8t-" Sep 9 04:51:03.879493 containerd[1540]: 2025-09-09 04:51:03.667 [INFO][4298] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" Namespace="kube-system" Pod="coredns-674b8bbfcf-qzc8t" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0" Sep 9 04:51:03.879493 containerd[1540]: 2025-09-09 04:51:03.709 [INFO][4325] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" HandleID="k8s-pod-network.956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" Workload="localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0" Sep 9 04:51:03.879702 containerd[1540]: 2025-09-09 04:51:03.709 [INFO][4325] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" HandleID="k8s-pod-network.956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" Workload="localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004de00), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-qzc8t", "timestamp":"2025-09-09 04:51:03.709695469 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:03.879702 containerd[1540]: 2025-09-09 04:51:03.709 [INFO][4325] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:03.879702 containerd[1540]: 2025-09-09 04:51:03.750 [INFO][4325] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:51:03.879702 containerd[1540]: 2025-09-09 04:51:03.750 [INFO][4325] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:03.879702 containerd[1540]: 2025-09-09 04:51:03.819 [INFO][4325] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" host="localhost" Sep 9 04:51:03.879702 containerd[1540]: 2025-09-09 04:51:03.826 [INFO][4325] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:03.879702 containerd[1540]: 2025-09-09 04:51:03.834 [INFO][4325] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:03.879702 containerd[1540]: 2025-09-09 04:51:03.836 [INFO][4325] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:03.879702 containerd[1540]: 2025-09-09 04:51:03.838 [INFO][4325] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:03.879702 containerd[1540]: 2025-09-09 04:51:03.838 [INFO][4325] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" host="localhost" Sep 9 04:51:03.879897 containerd[1540]: 2025-09-09 04:51:03.839 [INFO][4325] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab Sep 9 04:51:03.879897 containerd[1540]: 2025-09-09 04:51:03.845 [INFO][4325] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" host="localhost" Sep 9 04:51:03.879897 containerd[1540]: 2025-09-09 04:51:03.853 [INFO][4325] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" host="localhost" Sep 9 04:51:03.879897 containerd[1540]: 2025-09-09 04:51:03.853 [INFO][4325] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" host="localhost" Sep 9 04:51:03.879897 containerd[1540]: 2025-09-09 04:51:03.853 [INFO][4325] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:51:03.879897 containerd[1540]: 2025-09-09 04:51:03.853 [INFO][4325] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" HandleID="k8s-pod-network.956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" Workload="localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0" Sep 9 04:51:03.880003 containerd[1540]: 2025-09-09 04:51:03.858 [INFO][4298] cni-plugin/k8s.go 418: Populated endpoint ContainerID="956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" Namespace="kube-system" Pod="coredns-674b8bbfcf-qzc8t" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"6aa13cc2-beb8-4ea2-9856-e67d80fcabd1", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-qzc8t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib2e9f6f4174", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:03.880079 containerd[1540]: 2025-09-09 04:51:03.858 [INFO][4298] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" Namespace="kube-system" Pod="coredns-674b8bbfcf-qzc8t" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0" Sep 9 04:51:03.880079 containerd[1540]: 2025-09-09 04:51:03.858 [INFO][4298] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib2e9f6f4174 ContainerID="956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" Namespace="kube-system" Pod="coredns-674b8bbfcf-qzc8t" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0" Sep 9 04:51:03.880079 containerd[1540]: 2025-09-09 04:51:03.863 [INFO][4298] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" Namespace="kube-system" Pod="coredns-674b8bbfcf-qzc8t" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0" Sep 9 04:51:03.880142 containerd[1540]: 2025-09-09 04:51:03.864 [INFO][4298] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" Namespace="kube-system" Pod="coredns-674b8bbfcf-qzc8t" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"6aa13cc2-beb8-4ea2-9856-e67d80fcabd1", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab", Pod:"coredns-674b8bbfcf-qzc8t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib2e9f6f4174", MAC:"86:83:85:01:fd:3f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:03.880142 containerd[1540]: 2025-09-09 04:51:03.874 [INFO][4298] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" Namespace="kube-system" Pod="coredns-674b8bbfcf-qzc8t" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qzc8t-eth0" Sep 9 04:51:03.881336 containerd[1540]: time="2025-09-09T04:51:03.880839001Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-drkg7,Uid:48d0235b-b892-4f7a-ad2b-f4f107a0f105,Namespace:calico-system,Attempt:0,} returns sandbox id \"72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96\"" Sep 9 04:51:03.883861 containerd[1540]: time="2025-09-09T04:51:03.883813801Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 04:51:03.902753 containerd[1540]: time="2025-09-09T04:51:03.902504438Z" level=info msg="connecting to shim 956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab" address="unix:///run/containerd/s/efddbd96b0b1989cee5b8a13058f772af093aaa96d7c70fbdb2a8d7439961e83" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:03.928763 systemd[1]: Started cri-containerd-956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab.scope - libcontainer container 956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab. 
Sep 9 04:51:03.941177 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:04.001903 containerd[1540]: time="2025-09-09T04:51:04.001846902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qzc8t,Uid:6aa13cc2-beb8-4ea2-9856-e67d80fcabd1,Namespace:kube-system,Attempt:0,} returns sandbox id \"956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab\"" Sep 9 04:51:04.021232 containerd[1540]: time="2025-09-09T04:51:04.021189259Z" level=info msg="CreateContainer within sandbox \"956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 04:51:04.040532 containerd[1540]: time="2025-09-09T04:51:04.040484736Z" level=info msg="Container b9e8df2c8867c7052eeaf8071794584f8da276023ebf8f020773a56134e1a68a: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:04.047193 containerd[1540]: time="2025-09-09T04:51:04.047092095Z" level=info msg="CreateContainer within sandbox \"956f8f6d894d487c21778055a8ba3b68c89fd213d740b153573bf608265c1bab\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b9e8df2c8867c7052eeaf8071794584f8da276023ebf8f020773a56134e1a68a\"" Sep 9 04:51:04.047873 containerd[1540]: time="2025-09-09T04:51:04.047843415Z" level=info msg="StartContainer for \"b9e8df2c8867c7052eeaf8071794584f8da276023ebf8f020773a56134e1a68a\"" Sep 9 04:51:04.050074 containerd[1540]: time="2025-09-09T04:51:04.050010775Z" level=info msg="connecting to shim b9e8df2c8867c7052eeaf8071794584f8da276023ebf8f020773a56134e1a68a" address="unix:///run/containerd/s/efddbd96b0b1989cee5b8a13058f772af093aaa96d7c70fbdb2a8d7439961e83" protocol=ttrpc version=3 Sep 9 04:51:04.074445 systemd[1]: Started cri-containerd-b9e8df2c8867c7052eeaf8071794584f8da276023ebf8f020773a56134e1a68a.scope - libcontainer container b9e8df2c8867c7052eeaf8071794584f8da276023ebf8f020773a56134e1a68a. 
Sep 9 04:51:04.103976 containerd[1540]: time="2025-09-09T04:51:04.103937326Z" level=info msg="StartContainer for \"b9e8df2c8867c7052eeaf8071794584f8da276023ebf8f020773a56134e1a68a\" returns successfully" Sep 9 04:51:04.808838 kubelet[2694]: I0909 04:51:04.808756 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-qzc8t" podStartSLOduration=33.808739897 podStartE2EDuration="33.808739897s" podCreationTimestamp="2025-09-09 04:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:51:04.808733697 +0000 UTC m=+41.301009818" watchObservedRunningTime="2025-09-09 04:51:04.808739897 +0000 UTC m=+41.301016018" Sep 9 04:51:04.976734 containerd[1540]: time="2025-09-09T04:51:04.976686311Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:04.977600 containerd[1540]: time="2025-09-09T04:51:04.977439951Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 9 04:51:04.978410 containerd[1540]: time="2025-09-09T04:51:04.978374031Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:04.980564 containerd[1540]: time="2025-09-09T04:51:04.980523831Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:04.981486 containerd[1540]: time="2025-09-09T04:51:04.981454551Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo 
digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.09759023s" Sep 9 04:51:04.981548 containerd[1540]: time="2025-09-09T04:51:04.981486991Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 9 04:51:04.987570 containerd[1540]: time="2025-09-09T04:51:04.987523470Z" level=info msg="CreateContainer within sandbox \"72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 04:51:04.998289 containerd[1540]: time="2025-09-09T04:51:04.997416868Z" level=info msg="Container a4441b005cdfe06400f243857d025f652fa93d17425998b38490fffc803209fa: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:05.011237 containerd[1540]: time="2025-09-09T04:51:05.011176026Z" level=info msg="CreateContainer within sandbox \"72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"a4441b005cdfe06400f243857d025f652fa93d17425998b38490fffc803209fa\"" Sep 9 04:51:05.013427 containerd[1540]: time="2025-09-09T04:51:05.013389786Z" level=info msg="StartContainer for \"a4441b005cdfe06400f243857d025f652fa93d17425998b38490fffc803209fa\"" Sep 9 04:51:05.016262 containerd[1540]: time="2025-09-09T04:51:05.016196745Z" level=info msg="connecting to shim a4441b005cdfe06400f243857d025f652fa93d17425998b38490fffc803209fa" address="unix:///run/containerd/s/2d7e455c71373f483f4ac42d18e8190c637e6432506c6da1d0c09dc5228d8934" protocol=ttrpc version=3 Sep 9 04:51:05.037467 systemd[1]: Started cri-containerd-a4441b005cdfe06400f243857d025f652fa93d17425998b38490fffc803209fa.scope - libcontainer container a4441b005cdfe06400f243857d025f652fa93d17425998b38490fffc803209fa. 
Sep 9 04:51:05.074210 containerd[1540]: time="2025-09-09T04:51:05.073819657Z" level=info msg="StartContainer for \"a4441b005cdfe06400f243857d025f652fa93d17425998b38490fffc803209fa\" returns successfully" Sep 9 04:51:05.077985 containerd[1540]: time="2025-09-09T04:51:05.077939656Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 04:51:05.223440 systemd-networkd[1444]: cali60454563148: Gained IPv6LL Sep 9 04:51:05.352935 systemd-networkd[1444]: calib2e9f6f4174: Gained IPv6LL Sep 9 04:51:05.603986 containerd[1540]: time="2025-09-09T04:51:05.603874976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xp2ld,Uid:0c5aa6fa-4636-44d9-982d-83b18c1811e9,Namespace:kube-system,Attempt:0,}" Sep 9 04:51:05.782695 systemd-networkd[1444]: cali787a698cfce: Link UP Sep 9 04:51:05.783454 systemd-networkd[1444]: cali787a698cfce: Gained carrier Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.643 [INFO][4529] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0 coredns-674b8bbfcf- kube-system 0c5aa6fa-4636-44d9-982d-83b18c1811e9 806 0 2025-09-09 04:50:31 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-xp2ld eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali787a698cfce [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" Namespace="kube-system" Pod="coredns-674b8bbfcf-xp2ld" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xp2ld-" Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.644 [INFO][4529] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" Namespace="kube-system" Pod="coredns-674b8bbfcf-xp2ld" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0" Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.666 [INFO][4544] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" HandleID="k8s-pod-network.17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" Workload="localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0" Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.666 [INFO][4544] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" HandleID="k8s-pod-network.17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" Workload="localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3810), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-xp2ld", "timestamp":"2025-09-09 04:51:05.666081487 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.666 [INFO][4544] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.666 [INFO][4544] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.666 [INFO][4544] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.675 [INFO][4544] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" host="localhost" Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.685 [INFO][4544] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.690 [INFO][4544] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.692 [INFO][4544] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.694 [INFO][4544] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.694 [INFO][4544] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" host="localhost" Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.695 [INFO][4544] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.704 [INFO][4544] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" host="localhost" Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.777 [INFO][4544] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" host="localhost" Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.777 [INFO][4544] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" host="localhost" Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.777 [INFO][4544] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:51:05.800789 containerd[1540]: 2025-09-09 04:51:05.777 [INFO][4544] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" HandleID="k8s-pod-network.17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" Workload="localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0" Sep 9 04:51:05.801326 containerd[1540]: 2025-09-09 04:51:05.779 [INFO][4529] cni-plugin/k8s.go 418: Populated endpoint ContainerID="17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" Namespace="kube-system" Pod="coredns-674b8bbfcf-xp2ld" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"0c5aa6fa-4636-44d9-982d-83b18c1811e9", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-xp2ld", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali787a698cfce", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:05.801326 containerd[1540]: 2025-09-09 04:51:05.779 [INFO][4529] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" Namespace="kube-system" Pod="coredns-674b8bbfcf-xp2ld" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0" Sep 9 04:51:05.801326 containerd[1540]: 2025-09-09 04:51:05.779 [INFO][4529] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali787a698cfce ContainerID="17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" Namespace="kube-system" Pod="coredns-674b8bbfcf-xp2ld" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0" Sep 9 04:51:05.801326 containerd[1540]: 2025-09-09 04:51:05.783 [INFO][4529] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" Namespace="kube-system" Pod="coredns-674b8bbfcf-xp2ld" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0" Sep 9 04:51:05.801326 containerd[1540]: 2025-09-09 04:51:05.784 [INFO][4529] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" Namespace="kube-system" Pod="coredns-674b8bbfcf-xp2ld" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"0c5aa6fa-4636-44d9-982d-83b18c1811e9", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b", Pod:"coredns-674b8bbfcf-xp2ld", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali787a698cfce", MAC:"6e:ef:32:d5:b4:6c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:05.801326 containerd[1540]: 2025-09-09 04:51:05.797 [INFO][4529] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" Namespace="kube-system" Pod="coredns-674b8bbfcf-xp2ld" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--xp2ld-eth0" Sep 9 04:51:05.819858 containerd[1540]: time="2025-09-09T04:51:05.819722704Z" level=info msg="connecting to shim 17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b" address="unix:///run/containerd/s/5de6072666bb8d7226dbfb33a8539dd5621664fb0e9c6e5fb7a836593984a33d" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:05.839413 systemd[1]: Started cri-containerd-17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b.scope - libcontainer container 17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b. Sep 9 04:51:05.850276 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:05.870660 containerd[1540]: time="2025-09-09T04:51:05.870511816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xp2ld,Uid:0c5aa6fa-4636-44d9-982d-83b18c1811e9,Namespace:kube-system,Attempt:0,} returns sandbox id \"17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b\"" Sep 9 04:51:05.875595 containerd[1540]: time="2025-09-09T04:51:05.875563215Z" level=info msg="CreateContainer within sandbox \"17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 04:51:05.886071 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4033176475.mount: Deactivated successfully. 
Sep 9 04:51:05.888242 containerd[1540]: time="2025-09-09T04:51:05.888202333Z" level=info msg="Container e3c533d5b7528c0c730cb17ff907ee77c18ebb0fb560c44845741a03d44d3696: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:05.894060 containerd[1540]: time="2025-09-09T04:51:05.894026413Z" level=info msg="CreateContainer within sandbox \"17dc67361ef66c47d75b8a04de7c6ea9de4ecdf9c32e179818466c6706d6075b\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e3c533d5b7528c0c730cb17ff907ee77c18ebb0fb560c44845741a03d44d3696\"" Sep 9 04:51:05.894455 containerd[1540]: time="2025-09-09T04:51:05.894431693Z" level=info msg="StartContainer for \"e3c533d5b7528c0c730cb17ff907ee77c18ebb0fb560c44845741a03d44d3696\"" Sep 9 04:51:05.895163 containerd[1540]: time="2025-09-09T04:51:05.895138692Z" level=info msg="connecting to shim e3c533d5b7528c0c730cb17ff907ee77c18ebb0fb560c44845741a03d44d3696" address="unix:///run/containerd/s/5de6072666bb8d7226dbfb33a8539dd5621664fb0e9c6e5fb7a836593984a33d" protocol=ttrpc version=3 Sep 9 04:51:05.917381 systemd[1]: Started cri-containerd-e3c533d5b7528c0c730cb17ff907ee77c18ebb0fb560c44845741a03d44d3696.scope - libcontainer container e3c533d5b7528c0c730cb17ff907ee77c18ebb0fb560c44845741a03d44d3696. 
Sep 9 04:51:05.942127 containerd[1540]: time="2025-09-09T04:51:05.942004645Z" level=info msg="StartContainer for \"e3c533d5b7528c0c730cb17ff907ee77c18ebb0fb560c44845741a03d44d3696\" returns successfully" Sep 9 04:51:06.580002 containerd[1540]: time="2025-09-09T04:51:06.579961471Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:06.580707 containerd[1540]: time="2025-09-09T04:51:06.580390991Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 9 04:51:06.581214 containerd[1540]: time="2025-09-09T04:51:06.581183071Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:06.583372 containerd[1540]: time="2025-09-09T04:51:06.583303870Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 04:51:06.583898 containerd[1540]: time="2025-09-09T04:51:06.583827670Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.505843574s" Sep 9 04:51:06.583898 containerd[1540]: time="2025-09-09T04:51:06.583859950Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 9 04:51:06.587880 containerd[1540]: 
time="2025-09-09T04:51:06.587828670Z" level=info msg="CreateContainer within sandbox \"72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 04:51:06.593374 containerd[1540]: time="2025-09-09T04:51:06.593349389Z" level=info msg="Container 581d9edd3e001d263dfe5f3ec96ba9cf741ddc32c97ca2031283bb81d45c256c: CDI devices from CRI Config.CDIDevices: []" Sep 9 04:51:06.601511 containerd[1540]: time="2025-09-09T04:51:06.601461908Z" level=info msg="CreateContainer within sandbox \"72e698e6e37c280960718e0ba4e8fd51c3ffc8d1016468d04a265420efb80e96\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"581d9edd3e001d263dfe5f3ec96ba9cf741ddc32c97ca2031283bb81d45c256c\"" Sep 9 04:51:06.602196 containerd[1540]: time="2025-09-09T04:51:06.602163987Z" level=info msg="StartContainer for \"581d9edd3e001d263dfe5f3ec96ba9cf741ddc32c97ca2031283bb81d45c256c\"" Sep 9 04:51:06.604689 containerd[1540]: time="2025-09-09T04:51:06.604174827Z" level=info msg="connecting to shim 581d9edd3e001d263dfe5f3ec96ba9cf741ddc32c97ca2031283bb81d45c256c" address="unix:///run/containerd/s/2d7e455c71373f483f4ac42d18e8190c637e6432506c6da1d0c09dc5228d8934" protocol=ttrpc version=3 Sep 9 04:51:06.604689 containerd[1540]: time="2025-09-09T04:51:06.604511387Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55885cd57-8mg6f,Uid:3a948436-27af-4a6b-a20a-c855f1c51f9d,Namespace:calico-system,Attempt:0,}" Sep 9 04:51:06.605388 containerd[1540]: time="2025-09-09T04:51:06.605364827Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dd4d7df4d-74glh,Uid:b7a374d6-dee0-4466-b5a4-000f07fc11a9,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:51:06.606305 containerd[1540]: time="2025-09-09T04:51:06.606270067Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5dd4d7df4d-jkl4b,Uid:9f302fd2-6080-487a-8cb7-150862e1d68d,Namespace:calico-apiserver,Attempt:0,}" Sep 9 04:51:06.669628 systemd[1]: Started cri-containerd-581d9edd3e001d263dfe5f3ec96ba9cf741ddc32c97ca2031283bb81d45c256c.scope - libcontainer container 581d9edd3e001d263dfe5f3ec96ba9cf741ddc32c97ca2031283bb81d45c256c. Sep 9 04:51:06.672676 systemd[1]: Started sshd@7-10.0.0.33:22-10.0.0.1:59912.service - OpenSSH per-connection server daemon (10.0.0.1:59912). Sep 9 04:51:06.766231 sshd[4709]: Accepted publickey for core from 10.0.0.1 port 59912 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE Sep 9 04:51:06.767385 systemd-networkd[1444]: cali1eafb99e186: Link UP Sep 9 04:51:06.767513 systemd-networkd[1444]: cali1eafb99e186: Gained carrier Sep 9 04:51:06.768514 sshd-session[4709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 04:51:06.772974 containerd[1540]: time="2025-09-09T04:51:06.772930242Z" level=info msg="StartContainer for \"581d9edd3e001d263dfe5f3ec96ba9cf741ddc32c97ca2031283bb81d45c256c\" returns successfully" Sep 9 04:51:06.782007 systemd-logind[1507]: New session 8 of user core. 
Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.681 [INFO][4665] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0 calico-kube-controllers-55885cd57- calico-system 3a948436-27af-4a6b-a20a-c855f1c51f9d 809 0 2025-09-09 04:50:44 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:55885cd57 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-55885cd57-8mg6f eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali1eafb99e186 [] [] }} ContainerID="9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" Namespace="calico-system" Pod="calico-kube-controllers-55885cd57-8mg6f" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-" Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.681 [INFO][4665] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" Namespace="calico-system" Pod="calico-kube-controllers-55885cd57-8mg6f" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0" Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.720 [INFO][4726] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" HandleID="k8s-pod-network.9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" Workload="localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0" Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.721 [INFO][4726] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" 
HandleID="k8s-pod-network.9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" Workload="localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3260), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-55885cd57-8mg6f", "timestamp":"2025-09-09 04:51:06.72089309 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.721 [INFO][4726] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.721 [INFO][4726] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.721 [INFO][4726] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.733 [INFO][4726] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" host="localhost" Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.738 [INFO][4726] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.743 [INFO][4726] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.745 [INFO][4726] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.748 [INFO][4726] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:06.787337 containerd[1540]: 
2025-09-09 04:51:06.748 [INFO][4726] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" host="localhost" Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.749 [INFO][4726] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9 Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.754 [INFO][4726] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" host="localhost" Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.761 [INFO][4726] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" host="localhost" Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.761 [INFO][4726] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" host="localhost" Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.761 [INFO][4726] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 04:51:06.787337 containerd[1540]: 2025-09-09 04:51:06.761 [INFO][4726] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" HandleID="k8s-pod-network.9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" Workload="localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0" Sep 9 04:51:06.787806 containerd[1540]: 2025-09-09 04:51:06.763 [INFO][4665] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" Namespace="calico-system" Pod="calico-kube-controllers-55885cd57-8mg6f" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0", GenerateName:"calico-kube-controllers-55885cd57-", Namespace:"calico-system", SelfLink:"", UID:"3a948436-27af-4a6b-a20a-c855f1c51f9d", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"55885cd57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-55885cd57-8mg6f", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1eafb99e186", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:06.787806 containerd[1540]: 2025-09-09 04:51:06.763 [INFO][4665] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" Namespace="calico-system" Pod="calico-kube-controllers-55885cd57-8mg6f" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0" Sep 9 04:51:06.787806 containerd[1540]: 2025-09-09 04:51:06.763 [INFO][4665] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1eafb99e186 ContainerID="9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" Namespace="calico-system" Pod="calico-kube-controllers-55885cd57-8mg6f" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0" Sep 9 04:51:06.787806 containerd[1540]: 2025-09-09 04:51:06.765 [INFO][4665] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" Namespace="calico-system" Pod="calico-kube-controllers-55885cd57-8mg6f" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0" Sep 9 04:51:06.787806 containerd[1540]: 2025-09-09 04:51:06.765 [INFO][4665] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" Namespace="calico-system" Pod="calico-kube-controllers-55885cd57-8mg6f" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0", GenerateName:"calico-kube-controllers-55885cd57-", Namespace:"calico-system", SelfLink:"", UID:"3a948436-27af-4a6b-a20a-c855f1c51f9d", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"55885cd57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9", Pod:"calico-kube-controllers-55885cd57-8mg6f", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1eafb99e186", MAC:"76:f1:5e:08:e2:86", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:06.787806 containerd[1540]: 2025-09-09 04:51:06.784 [INFO][4665] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" Namespace="calico-system" Pod="calico-kube-controllers-55885cd57-8mg6f" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--55885cd57--8mg6f-eth0" Sep 9 04:51:06.790451 systemd[1]: Started session-8.scope - Session 8 of User core. 
Sep 9 04:51:06.813787 kubelet[2694]: I0909 04:51:06.811950 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-xp2ld" podStartSLOduration=35.811933076 podStartE2EDuration="35.811933076s" podCreationTimestamp="2025-09-09 04:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 04:51:06.810096917 +0000 UTC m=+43.302373038" watchObservedRunningTime="2025-09-09 04:51:06.811933076 +0000 UTC m=+43.304209197" Sep 9 04:51:06.821377 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount181476569.mount: Deactivated successfully. Sep 9 04:51:06.834009 kubelet[2694]: I0909 04:51:06.833894 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-drkg7" podStartSLOduration=20.132383924 podStartE2EDuration="22.833785433s" podCreationTimestamp="2025-09-09 04:50:44 +0000 UTC" firstStartedPulling="2025-09-09 04:51:03.883551481 +0000 UTC m=+40.375827602" lastFinishedPulling="2025-09-09 04:51:06.58495299 +0000 UTC m=+43.077229111" observedRunningTime="2025-09-09 04:51:06.824814315 +0000 UTC m=+43.317090396" watchObservedRunningTime="2025-09-09 04:51:06.833785433 +0000 UTC m=+43.326061554" Sep 9 04:51:06.853735 containerd[1540]: time="2025-09-09T04:51:06.853643710Z" level=info msg="connecting to shim 9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9" address="unix:///run/containerd/s/72e9aa6e75833fa92d233e2e2c78125c9b73059a785c20c4602d2b70666086b9" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:06.886096 systemd-networkd[1444]: calida5b7f6a107: Link UP Sep 9 04:51:06.887767 systemd-networkd[1444]: calida5b7f6a107: Gained carrier Sep 9 04:51:06.903539 systemd[1]: Started cri-containerd-9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9.scope - libcontainer container 9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9. 
Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.666 [INFO][4655] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0 calico-apiserver-5dd4d7df4d- calico-apiserver b7a374d6-dee0-4466-b5a4-000f07fc11a9 810 0 2025-09-09 04:50:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5dd4d7df4d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5dd4d7df4d-74glh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calida5b7f6a107 [] [] }} ContainerID="7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-74glh" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-" Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.666 [INFO][4655] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-74glh" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0" Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.728 [INFO][4711] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" HandleID="k8s-pod-network.7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" Workload="localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0" Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.728 [INFO][4711] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" 
HandleID="k8s-pod-network.7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" Workload="localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137bd0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5dd4d7df4d-74glh", "timestamp":"2025-09-09 04:51:06.72347289 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.728 [INFO][4711] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.761 [INFO][4711] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.761 [INFO][4711] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.834 [INFO][4711] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" host="localhost" Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.843 [INFO][4711] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.850 [INFO][4711] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.853 [INFO][4711] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.856 [INFO][4711] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 
04:51:06.856 [INFO][4711] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" host="localhost" Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.857 [INFO][4711] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.864 [INFO][4711] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" host="localhost" Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.874 [INFO][4711] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" host="localhost" Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.874 [INFO][4711] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" host="localhost" Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.874 [INFO][4711] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 04:51:06.910227 containerd[1540]: 2025-09-09 04:51:06.874 [INFO][4711] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" HandleID="k8s-pod-network.7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" Workload="localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0" Sep 9 04:51:06.911213 containerd[1540]: 2025-09-09 04:51:06.876 [INFO][4655] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-74glh" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0", GenerateName:"calico-apiserver-5dd4d7df4d-", Namespace:"calico-apiserver", SelfLink:"", UID:"b7a374d6-dee0-4466-b5a4-000f07fc11a9", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5dd4d7df4d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5dd4d7df4d-74glh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calida5b7f6a107", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:06.911213 containerd[1540]: 2025-09-09 04:51:06.877 [INFO][4655] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-74glh" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0" Sep 9 04:51:06.911213 containerd[1540]: 2025-09-09 04:51:06.877 [INFO][4655] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calida5b7f6a107 ContainerID="7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-74glh" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0" Sep 9 04:51:06.911213 containerd[1540]: 2025-09-09 04:51:06.889 [INFO][4655] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-74glh" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0" Sep 9 04:51:06.911213 containerd[1540]: 2025-09-09 04:51:06.891 [INFO][4655] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-74glh" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0", 
GenerateName:"calico-apiserver-5dd4d7df4d-", Namespace:"calico-apiserver", SelfLink:"", UID:"b7a374d6-dee0-4466-b5a4-000f07fc11a9", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5dd4d7df4d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff", Pod:"calico-apiserver-5dd4d7df4d-74glh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calida5b7f6a107", MAC:"6e:ee:7a:73:1b:b8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:06.911213 containerd[1540]: 2025-09-09 04:51:06.902 [INFO][4655] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-74glh" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--74glh-eth0" Sep 9 04:51:06.923959 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:06.941221 containerd[1540]: time="2025-09-09T04:51:06.941018297Z" level=info msg="connecting to shim 
7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff" address="unix:///run/containerd/s/9abf6133e369eacc4b8dd94235fc3ee6d1467ff58e20bed46512ef66ade520f5" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:06.976366 containerd[1540]: time="2025-09-09T04:51:06.976327092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55885cd57-8mg6f,Uid:3a948436-27af-4a6b-a20a-c855f1c51f9d,Namespace:calico-system,Attempt:0,} returns sandbox id \"9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9\"" Sep 9 04:51:06.982271 containerd[1540]: time="2025-09-09T04:51:06.982047491Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 04:51:06.984389 systemd-networkd[1444]: calid6781bf44d9: Link UP Sep 9 04:51:06.984994 systemd-networkd[1444]: calid6781bf44d9: Gained carrier Sep 9 04:51:06.986452 systemd[1]: Started cri-containerd-7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff.scope - libcontainer container 7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff. 
Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.679 [INFO][4686] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0 calico-apiserver-5dd4d7df4d- calico-apiserver 9f302fd2-6080-487a-8cb7-150862e1d68d 811 0 2025-09-09 04:50:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5dd4d7df4d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5dd4d7df4d-jkl4b eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid6781bf44d9 [] [] }} ContainerID="30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-jkl4b" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-" Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.679 [INFO][4686] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-jkl4b" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0" Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.740 [INFO][4728] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" HandleID="k8s-pod-network.30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" Workload="localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0" Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.740 [INFO][4728] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" 
HandleID="k8s-pod-network.30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" Workload="localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137450), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5dd4d7df4d-jkl4b", "timestamp":"2025-09-09 04:51:06.740556847 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.740 [INFO][4728] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.874 [INFO][4728] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.874 [INFO][4728] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.935 [INFO][4728] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" host="localhost" Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.941 [INFO][4728] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.950 [INFO][4728] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.952 [INFO][4728] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.955 [INFO][4728] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 
04:51:06.955 [INFO][4728] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" host="localhost" Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.957 [INFO][4728] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.961 [INFO][4728] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" host="localhost" Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.970 [INFO][4728] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" host="localhost" Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.970 [INFO][4728] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" host="localhost" Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.970 [INFO][4728] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 04:51:07.005378 containerd[1540]: 2025-09-09 04:51:06.970 [INFO][4728] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" HandleID="k8s-pod-network.30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" Workload="localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0" Sep 9 04:51:07.006077 containerd[1540]: 2025-09-09 04:51:06.978 [INFO][4686] cni-plugin/k8s.go 418: Populated endpoint ContainerID="30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-jkl4b" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0", GenerateName:"calico-apiserver-5dd4d7df4d-", Namespace:"calico-apiserver", SelfLink:"", UID:"9f302fd2-6080-487a-8cb7-150862e1d68d", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5dd4d7df4d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5dd4d7df4d-jkl4b", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid6781bf44d9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:07.006077 containerd[1540]: 2025-09-09 04:51:06.978 [INFO][4686] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-jkl4b" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0" Sep 9 04:51:07.006077 containerd[1540]: 2025-09-09 04:51:06.978 [INFO][4686] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid6781bf44d9 ContainerID="30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-jkl4b" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0" Sep 9 04:51:07.006077 containerd[1540]: 2025-09-09 04:51:06.985 [INFO][4686] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-jkl4b" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0" Sep 9 04:51:07.006077 containerd[1540]: 2025-09-09 04:51:06.985 [INFO][4686] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-jkl4b" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0", 
GenerateName:"calico-apiserver-5dd4d7df4d-", Namespace:"calico-apiserver", SelfLink:"", UID:"9f302fd2-6080-487a-8cb7-150862e1d68d", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5dd4d7df4d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e", Pod:"calico-apiserver-5dd4d7df4d-jkl4b", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid6781bf44d9", MAC:"d2:6b:b7:a2:ec:86", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:07.006077 containerd[1540]: 2025-09-09 04:51:06.996 [INFO][4686] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" Namespace="calico-apiserver" Pod="calico-apiserver-5dd4d7df4d-jkl4b" WorkloadEndpoint="localhost-k8s-calico--apiserver--5dd4d7df4d--jkl4b-eth0" Sep 9 04:51:07.015196 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:07.034077 containerd[1540]: time="2025-09-09T04:51:07.033993924Z" level=info msg="connecting to shim 
30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e" address="unix:///run/containerd/s/79e2e30c4f437db7f825fde3a8b891a6d05215dc84c0589c567b07e45a263882" namespace=k8s.io protocol=ttrpc version=3 Sep 9 04:51:07.041874 containerd[1540]: time="2025-09-09T04:51:07.041845283Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dd4d7df4d-74glh,Uid:b7a374d6-dee0-4466-b5a4-000f07fc11a9,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff\"" Sep 9 04:51:07.061412 systemd[1]: Started cri-containerd-30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e.scope - libcontainer container 30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e. Sep 9 04:51:07.072671 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 9 04:51:07.087837 sshd[4771]: Connection closed by 10.0.0.1 port 59912 Sep 9 04:51:07.088381 sshd-session[4709]: pam_unix(sshd:session): session closed for user core Sep 9 04:51:07.094976 systemd[1]: sshd@7-10.0.0.33:22-10.0.0.1:59912.service: Deactivated successfully. Sep 9 04:51:07.097832 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 04:51:07.098773 systemd-logind[1507]: Session 8 logged out. Waiting for processes to exit. Sep 9 04:51:07.099787 containerd[1540]: time="2025-09-09T04:51:07.099588194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dd4d7df4d-jkl4b,Uid:9f302fd2-6080-487a-8cb7-150862e1d68d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e\"" Sep 9 04:51:07.100066 systemd-logind[1507]: Removed session 8. 
Sep 9 04:51:07.463506 systemd-networkd[1444]: cali787a698cfce: Gained IPv6LL Sep 9 04:51:07.669542 kubelet[2694]: I0909 04:51:07.669459 2694 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 04:51:07.675917 kubelet[2694]: I0909 04:51:07.675893 2694 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 04:51:08.103397 systemd-networkd[1444]: cali1eafb99e186: Gained IPv6LL Sep 9 04:51:08.359434 systemd-networkd[1444]: calid6781bf44d9: Gained IPv6LL Sep 9 04:51:08.604062 containerd[1540]: time="2025-09-09T04:51:08.604010338Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-jg5js,Uid:7f852350-12c8-443c-9eee-567b69f5268d,Namespace:calico-system,Attempt:0,}" Sep 9 04:51:08.616456 systemd-networkd[1444]: calida5b7f6a107: Gained IPv6LL Sep 9 04:51:08.821750 systemd-networkd[1444]: calic223c2d13b2: Link UP Sep 9 04:51:08.822222 systemd-networkd[1444]: calic223c2d13b2: Gained carrier Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.742 [INFO][4948] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--jg5js-eth0 goldmane-54d579b49d- calico-system 7f852350-12c8-443c-9eee-567b69f5268d 812 0 2025-09-09 04:50:44 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-jg5js eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calic223c2d13b2 [] [] }} ContainerID="4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" Namespace="calico-system" Pod="goldmane-54d579b49d-jg5js" 
WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--jg5js-" Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.742 [INFO][4948] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" Namespace="calico-system" Pod="goldmane-54d579b49d-jg5js" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--jg5js-eth0" Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.770 [INFO][4966] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" HandleID="k8s-pod-network.4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" Workload="localhost-k8s-goldmane--54d579b49d--jg5js-eth0" Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.770 [INFO][4966] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" HandleID="k8s-pod-network.4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" Workload="localhost-k8s-goldmane--54d579b49d--jg5js-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c4d0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-jg5js", "timestamp":"2025-09-09 04:51:08.770645194 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.770 [INFO][4966] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.770 [INFO][4966] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.770 [INFO][4966] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.785 [INFO][4966] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" host="localhost" Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.789 [INFO][4966] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.795 [INFO][4966] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.797 [INFO][4966] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.800 [INFO][4966] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.800 [INFO][4966] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" host="localhost" Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.801 [INFO][4966] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.807 [INFO][4966] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" host="localhost" Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.814 [INFO][4966] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" host="localhost" Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.815 [INFO][4966] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" host="localhost" Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.815 [INFO][4966] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 04:51:08.837850 containerd[1540]: 2025-09-09 04:51:08.815 [INFO][4966] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" HandleID="k8s-pod-network.4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" Workload="localhost-k8s-goldmane--54d579b49d--jg5js-eth0" Sep 9 04:51:08.839321 containerd[1540]: 2025-09-09 04:51:08.818 [INFO][4948] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" Namespace="calico-system" Pod="goldmane-54d579b49d-jg5js" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--jg5js-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--jg5js-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"7f852350-12c8-443c-9eee-567b69f5268d", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-jg5js", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic223c2d13b2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 04:51:08.839321 containerd[1540]: 2025-09-09 04:51:08.818 [INFO][4948] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" Namespace="calico-system" Pod="goldmane-54d579b49d-jg5js" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--jg5js-eth0" Sep 9 04:51:08.839321 containerd[1540]: 2025-09-09 04:51:08.818 [INFO][4948] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic223c2d13b2 ContainerID="4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" Namespace="calico-system" Pod="goldmane-54d579b49d-jg5js" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--jg5js-eth0" Sep 9 04:51:08.839321 containerd[1540]: 2025-09-09 04:51:08.823 [INFO][4948] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" Namespace="calico-system" Pod="goldmane-54d579b49d-jg5js" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--jg5js-eth0" Sep 9 04:51:08.839321 containerd[1540]: 2025-09-09 04:51:08.823 [INFO][4948] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" Namespace="calico-system" Pod="goldmane-54d579b49d-jg5js" 
WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--jg5js-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--jg5js-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"7f852350-12c8-443c-9eee-567b69f5268d", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 4, 50, 44, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d", Pod:"goldmane-54d579b49d-jg5js", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic223c2d13b2", MAC:"1a:8f:bf:b3:a2:d9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 04:51:08.839321 containerd[1540]: 2025-09-09 04:51:08.833 [INFO][4948] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" Namespace="calico-system" Pod="goldmane-54d579b49d-jg5js" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--jg5js-eth0"
Sep 9 04:51:08.859884 containerd[1540]: time="2025-09-09T04:51:08.859804302Z" level=info msg="connecting to shim 4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d" address="unix:///run/containerd/s/db5393a699979f3506e98577b51e7337b95e63b86d3a0fbc5884a493f7dff441" namespace=k8s.io protocol=ttrpc version=3
Sep 9 04:51:08.886461 systemd[1]: Started cri-containerd-4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d.scope - libcontainer container 4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d.
Sep 9 04:51:08.900967 systemd-resolved[1354]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
Sep 9 04:51:08.924548 containerd[1540]: time="2025-09-09T04:51:08.924512612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-jg5js,Uid:7f852350-12c8-443c-9eee-567b69f5268d,Namespace:calico-system,Attempt:0,} returns sandbox id \"4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d\""
Sep 9 04:51:09.308885 containerd[1540]: time="2025-09-09T04:51:09.308837439Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:09.309630 containerd[1540]: time="2025-09-09T04:51:09.309598359Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957"
Sep 9 04:51:09.310594 containerd[1540]: time="2025-09-09T04:51:09.310560038Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:09.312999 containerd[1540]: time="2025-09-09T04:51:09.312813078Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:09.313858 containerd[1540]: time="2025-09-09T04:51:09.313830278Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 2.331736347s"
Sep 9 04:51:09.313919 containerd[1540]: time="2025-09-09T04:51:09.313860278Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\""
Sep 9 04:51:09.316383 containerd[1540]: time="2025-09-09T04:51:09.316360998Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 9 04:51:09.323701 containerd[1540]: time="2025-09-09T04:51:09.323673557Z" level=info msg="CreateContainer within sandbox \"9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Sep 9 04:51:09.330473 containerd[1540]: time="2025-09-09T04:51:09.330438916Z" level=info msg="Container bf813f115aa448faabd775816459c085f16ba48e637c11dda1bc61c5c1d2e104: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:51:09.337860 containerd[1540]: time="2025-09-09T04:51:09.337808955Z" level=info msg="CreateContainer within sandbox \"9b17517d91de6c18358ebd477a0ce8bef8f7cb3f9b2ad00db23729556849f9d9\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"bf813f115aa448faabd775816459c085f16ba48e637c11dda1bc61c5c1d2e104\""
Sep 9 04:51:09.338285 containerd[1540]: time="2025-09-09T04:51:09.338240355Z" level=info msg="StartContainer for \"bf813f115aa448faabd775816459c085f16ba48e637c11dda1bc61c5c1d2e104\""
Sep 9 04:51:09.340295 containerd[1540]: time="2025-09-09T04:51:09.340131794Z" level=info msg="connecting to shim bf813f115aa448faabd775816459c085f16ba48e637c11dda1bc61c5c1d2e104" address="unix:///run/containerd/s/72e9aa6e75833fa92d233e2e2c78125c9b73059a785c20c4602d2b70666086b9" protocol=ttrpc version=3
Sep 9 04:51:09.357398 systemd[1]: Started cri-containerd-bf813f115aa448faabd775816459c085f16ba48e637c11dda1bc61c5c1d2e104.scope - libcontainer container bf813f115aa448faabd775816459c085f16ba48e637c11dda1bc61c5c1d2e104.
Sep 9 04:51:09.390987 containerd[1540]: time="2025-09-09T04:51:09.390337907Z" level=info msg="StartContainer for \"bf813f115aa448faabd775816459c085f16ba48e637c11dda1bc61c5c1d2e104\" returns successfully"
Sep 9 04:51:09.824271 kubelet[2694]: I0909 04:51:09.823310 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-55885cd57-8mg6f" podStartSLOduration=23.48921918 podStartE2EDuration="25.823293447s" podCreationTimestamp="2025-09-09 04:50:44 +0000 UTC" firstStartedPulling="2025-09-09 04:51:06.981492651 +0000 UTC m=+43.473768732" lastFinishedPulling="2025-09-09 04:51:09.315566918 +0000 UTC m=+45.807842999" observedRunningTime="2025-09-09 04:51:09.822577447 +0000 UTC m=+46.314853568" watchObservedRunningTime="2025-09-09 04:51:09.823293447 +0000 UTC m=+46.315569608"
Sep 9 04:51:10.407406 systemd-networkd[1444]: calic223c2d13b2: Gained IPv6LL
Sep 9 04:51:10.794490 containerd[1540]: time="2025-09-09T04:51:10.794452914Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bf813f115aa448faabd775816459c085f16ba48e637c11dda1bc61c5c1d2e104\" id:\"b846e24fef9bae3780f678214a1df48997cbfa58c59fe9a30c2c19b4cb1ce1af\" pid:5093 exited_at:{seconds:1757393470 nanos:793896274}"
Sep 9 04:51:10.839788 containerd[1540]: time="2025-09-09T04:51:10.839743188Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bf813f115aa448faabd775816459c085f16ba48e637c11dda1bc61c5c1d2e104\" id:\"923a2ce49daf5762c7b5a892b02cd0e68faa801e361c7bc492d11d0e209112d7\" pid:5116 exited_at:{seconds:1757393470 nanos:839547668}"
Sep 9 04:51:11.524741 containerd[1540]: time="2025-09-09T04:51:11.524690335Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:11.525559 containerd[1540]: time="2025-09-09T04:51:11.525524695Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807"
Sep 9 04:51:11.526265 containerd[1540]: time="2025-09-09T04:51:11.526201375Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:11.529153 containerd[1540]: time="2025-09-09T04:51:11.529100095Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:11.529677 containerd[1540]: time="2025-09-09T04:51:11.529643135Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 2.213255017s"
Sep 9 04:51:11.529731 containerd[1540]: time="2025-09-09T04:51:11.529680255Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 9 04:51:11.531577 containerd[1540]: time="2025-09-09T04:51:11.531480374Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 9 04:51:11.534344 containerd[1540]: time="2025-09-09T04:51:11.534193654Z" level=info msg="CreateContainer within sandbox \"7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 9 04:51:11.540749 containerd[1540]: time="2025-09-09T04:51:11.540715453Z" level=info msg="Container 334d9c7d8e90861c8d51ba283b34e14dd657d1d781517f40fa52f0b0bc326c9c: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:51:11.547090 containerd[1540]: time="2025-09-09T04:51:11.547024372Z" level=info msg="CreateContainer within sandbox \"7811ec5fad669c589ed2fce1c4b13c0938e64a54397f738cdff9c6c1c509adff\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"334d9c7d8e90861c8d51ba283b34e14dd657d1d781517f40fa52f0b0bc326c9c\""
Sep 9 04:51:11.548294 containerd[1540]: time="2025-09-09T04:51:11.547657772Z" level=info msg="StartContainer for \"334d9c7d8e90861c8d51ba283b34e14dd657d1d781517f40fa52f0b0bc326c9c\""
Sep 9 04:51:11.548849 containerd[1540]: time="2025-09-09T04:51:11.548818212Z" level=info msg="connecting to shim 334d9c7d8e90861c8d51ba283b34e14dd657d1d781517f40fa52f0b0bc326c9c" address="unix:///run/containerd/s/9abf6133e369eacc4b8dd94235fc3ee6d1467ff58e20bed46512ef66ade520f5" protocol=ttrpc version=3
Sep 9 04:51:11.573383 systemd[1]: Started cri-containerd-334d9c7d8e90861c8d51ba283b34e14dd657d1d781517f40fa52f0b0bc326c9c.scope - libcontainer container 334d9c7d8e90861c8d51ba283b34e14dd657d1d781517f40fa52f0b0bc326c9c.
Sep 9 04:51:11.620436 containerd[1540]: time="2025-09-09T04:51:11.620371402Z" level=info msg="StartContainer for \"334d9c7d8e90861c8d51ba283b34e14dd657d1d781517f40fa52f0b0bc326c9c\" returns successfully"
Sep 9 04:51:11.799851 containerd[1540]: time="2025-09-09T04:51:11.799740138Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:11.800659 containerd[1540]: time="2025-09-09T04:51:11.800626778Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 9 04:51:11.802382 containerd[1540]: time="2025-09-09T04:51:11.802351178Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 270.806084ms"
Sep 9 04:51:11.802460 containerd[1540]: time="2025-09-09T04:51:11.802385378Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 9 04:51:11.803490 containerd[1540]: time="2025-09-09T04:51:11.803316338Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\""
Sep 9 04:51:11.806927 containerd[1540]: time="2025-09-09T04:51:11.806843377Z" level=info msg="CreateContainer within sandbox \"30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 9 04:51:11.813264 containerd[1540]: time="2025-09-09T04:51:11.812878936Z" level=info msg="Container 53e35389771051f186d82df9dfd3918e8205a5928398324fd8f3eef1f410dd96: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:51:11.824110 containerd[1540]: time="2025-09-09T04:51:11.824007175Z" level=info msg="CreateContainer within sandbox \"30c7c84bc31cdac4b977cb314a8855bea18eb3a2671b4e834b8101e9381e0c0e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"53e35389771051f186d82df9dfd3918e8205a5928398324fd8f3eef1f410dd96\""
Sep 9 04:51:11.825663 containerd[1540]: time="2025-09-09T04:51:11.825503895Z" level=info msg="StartContainer for \"53e35389771051f186d82df9dfd3918e8205a5928398324fd8f3eef1f410dd96\""
Sep 9 04:51:11.826853 containerd[1540]: time="2025-09-09T04:51:11.826819415Z" level=info msg="connecting to shim 53e35389771051f186d82df9dfd3918e8205a5928398324fd8f3eef1f410dd96" address="unix:///run/containerd/s/79e2e30c4f437db7f825fde3a8b891a6d05215dc84c0589c567b07e45a263882" protocol=ttrpc version=3
Sep 9 04:51:11.837348 kubelet[2694]: I0909 04:51:11.837260 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5dd4d7df4d-74glh" podStartSLOduration=27.350551721 podStartE2EDuration="31.837203653s" podCreationTimestamp="2025-09-09 04:50:40 +0000 UTC" firstStartedPulling="2025-09-09 04:51:07.043994002 +0000 UTC m=+43.536270123" lastFinishedPulling="2025-09-09 04:51:11.530645934 +0000 UTC m=+48.022922055" observedRunningTime="2025-09-09 04:51:11.836705373 +0000 UTC m=+48.328981534" watchObservedRunningTime="2025-09-09 04:51:11.837203653 +0000 UTC m=+48.329479734"
Sep 9 04:51:11.851400 systemd[1]: Started cri-containerd-53e35389771051f186d82df9dfd3918e8205a5928398324fd8f3eef1f410dd96.scope - libcontainer container 53e35389771051f186d82df9dfd3918e8205a5928398324fd8f3eef1f410dd96.
Sep 9 04:51:11.921419 containerd[1540]: time="2025-09-09T04:51:11.921379162Z" level=info msg="StartContainer for \"53e35389771051f186d82df9dfd3918e8205a5928398324fd8f3eef1f410dd96\" returns successfully"
Sep 9 04:51:12.101934 systemd[1]: Started sshd@8-10.0.0.33:22-10.0.0.1:35766.service - OpenSSH per-connection server daemon (10.0.0.1:35766).
Sep 9 04:51:12.175571 sshd[5208]: Accepted publickey for core from 10.0.0.1 port 35766 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:51:12.176965 sshd-session[5208]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:51:12.181600 systemd-logind[1507]: New session 9 of user core.
Sep 9 04:51:12.187430 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 9 04:51:12.409928 sshd[5211]: Connection closed by 10.0.0.1 port 35766
Sep 9 04:51:12.410446 sshd-session[5208]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:12.415442 systemd[1]: sshd@8-10.0.0.33:22-10.0.0.1:35766.service: Deactivated successfully.
Sep 9 04:51:12.420354 systemd[1]: session-9.scope: Deactivated successfully.
Sep 9 04:51:12.421648 systemd-logind[1507]: Session 9 logged out. Waiting for processes to exit.
Sep 9 04:51:12.423504 systemd-logind[1507]: Removed session 9.
Sep 9 04:51:12.824036 kubelet[2694]: I0909 04:51:12.823994 2694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 04:51:12.841673 kubelet[2694]: I0909 04:51:12.841365 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5dd4d7df4d-jkl4b" podStartSLOduration=28.139306336 podStartE2EDuration="32.84096748s" podCreationTimestamp="2025-09-09 04:50:40 +0000 UTC" firstStartedPulling="2025-09-09 04:51:07.101549314 +0000 UTC m=+43.593825435" lastFinishedPulling="2025-09-09 04:51:11.803210498 +0000 UTC m=+48.295486579" observedRunningTime="2025-09-09 04:51:12.83955808 +0000 UTC m=+49.331834201" watchObservedRunningTime="2025-09-09 04:51:12.84096748 +0000 UTC m=+49.333243601"
Sep 9 04:51:14.583294 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1281113059.mount: Deactivated successfully.
Sep 9 04:51:14.988864 containerd[1540]: time="2025-09-09T04:51:14.988092843Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:14.988864 containerd[1540]: time="2025-09-09T04:51:14.988644562Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332"
Sep 9 04:51:14.989690 containerd[1540]: time="2025-09-09T04:51:14.989665122Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:14.991596 containerd[1540]: time="2025-09-09T04:51:14.991564642Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 04:51:14.992849 containerd[1540]: time="2025-09-09T04:51:14.992821442Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.189477824s"
Sep 9 04:51:14.992922 containerd[1540]: time="2025-09-09T04:51:14.992851602Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\""
Sep 9 04:51:14.996328 containerd[1540]: time="2025-09-09T04:51:14.996262642Z" level=info msg="CreateContainer within sandbox \"4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Sep 9 04:51:15.003289 containerd[1540]: time="2025-09-09T04:51:15.002799561Z" level=info msg="Container db9198f44f43395c9cb03888632f041f0cebf7a55098ee6f6b725d97496a7aae: CDI devices from CRI Config.CDIDevices: []"
Sep 9 04:51:15.012500 containerd[1540]: time="2025-09-09T04:51:15.012464759Z" level=info msg="CreateContainer within sandbox \"4ef20d7294fe383e6dbc717566b7cb37f2eab6cce7215711f296131b8844536d\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"db9198f44f43395c9cb03888632f041f0cebf7a55098ee6f6b725d97496a7aae\""
Sep 9 04:51:15.014346 containerd[1540]: time="2025-09-09T04:51:15.014311279Z" level=info msg="StartContainer for \"db9198f44f43395c9cb03888632f041f0cebf7a55098ee6f6b725d97496a7aae\""
Sep 9 04:51:15.015707 containerd[1540]: time="2025-09-09T04:51:15.015674879Z" level=info msg="connecting to shim db9198f44f43395c9cb03888632f041f0cebf7a55098ee6f6b725d97496a7aae" address="unix:///run/containerd/s/db5393a699979f3506e98577b51e7337b95e63b86d3a0fbc5884a493f7dff441" protocol=ttrpc version=3
Sep 9 04:51:15.036440 systemd[1]: Started cri-containerd-db9198f44f43395c9cb03888632f041f0cebf7a55098ee6f6b725d97496a7aae.scope - libcontainer container db9198f44f43395c9cb03888632f041f0cebf7a55098ee6f6b725d97496a7aae.
Sep 9 04:51:15.071495 containerd[1540]: time="2025-09-09T04:51:15.071452072Z" level=info msg="StartContainer for \"db9198f44f43395c9cb03888632f041f0cebf7a55098ee6f6b725d97496a7aae\" returns successfully"
Sep 9 04:51:15.931282 containerd[1540]: time="2025-09-09T04:51:15.931230483Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db9198f44f43395c9cb03888632f041f0cebf7a55098ee6f6b725d97496a7aae\" id:\"c5913d8f8a18ef24facaca94414dbc3da56bfa7128882383bdfc94b0599d5789\" pid:5293 exited_at:{seconds:1757393475 nanos:923502884}"
Sep 9 04:51:15.954313 kubelet[2694]: I0909 04:51:15.953260 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-jg5js" podStartSLOduration=25.886627691 podStartE2EDuration="31.953227801s" podCreationTimestamp="2025-09-09 04:50:44 +0000 UTC" firstStartedPulling="2025-09-09 04:51:08.926923212 +0000 UTC m=+45.419199293" lastFinishedPulling="2025-09-09 04:51:14.993523282 +0000 UTC m=+51.485799403" observedRunningTime="2025-09-09 04:51:15.846895214 +0000 UTC m=+52.339171375" watchObservedRunningTime="2025-09-09 04:51:15.953227801 +0000 UTC m=+52.445503922"
Sep 9 04:51:17.426354 systemd[1]: Started sshd@9-10.0.0.33:22-10.0.0.1:35772.service - OpenSSH per-connection server daemon (10.0.0.1:35772).
Sep 9 04:51:17.508682 sshd[5309]: Accepted publickey for core from 10.0.0.1 port 35772 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:51:17.510926 sshd-session[5309]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:51:17.516948 systemd-logind[1507]: New session 10 of user core.
Sep 9 04:51:17.527396 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 9 04:51:17.766618 sshd[5312]: Connection closed by 10.0.0.1 port 35772
Sep 9 04:51:17.767703 sshd-session[5309]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:17.776356 systemd[1]: sshd@9-10.0.0.33:22-10.0.0.1:35772.service: Deactivated successfully.
Sep 9 04:51:17.777989 systemd[1]: session-10.scope: Deactivated successfully.
Sep 9 04:51:17.778884 systemd-logind[1507]: Session 10 logged out. Waiting for processes to exit.
Sep 9 04:51:17.781179 systemd[1]: Started sshd@10-10.0.0.33:22-10.0.0.1:35788.service - OpenSSH per-connection server daemon (10.0.0.1:35788).
Sep 9 04:51:17.782304 systemd-logind[1507]: Removed session 10.
Sep 9 04:51:17.841210 sshd[5326]: Accepted publickey for core from 10.0.0.1 port 35788 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:51:17.842759 sshd-session[5326]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:51:17.848745 systemd-logind[1507]: New session 11 of user core.
Sep 9 04:51:17.856418 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 9 04:51:18.055082 sshd[5329]: Connection closed by 10.0.0.1 port 35788
Sep 9 04:51:18.056536 sshd-session[5326]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:18.071490 systemd[1]: sshd@10-10.0.0.33:22-10.0.0.1:35788.service: Deactivated successfully.
Sep 9 04:51:18.075869 systemd[1]: session-11.scope: Deactivated successfully.
Sep 9 04:51:18.080900 systemd-logind[1507]: Session 11 logged out. Waiting for processes to exit.
Sep 9 04:51:18.085904 systemd[1]: Started sshd@11-10.0.0.33:22-10.0.0.1:35804.service - OpenSSH per-connection server daemon (10.0.0.1:35804).
Sep 9 04:51:18.089589 systemd-logind[1507]: Removed session 11.
Sep 9 04:51:18.153179 sshd[5341]: Accepted publickey for core from 10.0.0.1 port 35804 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:51:18.154954 sshd-session[5341]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:51:18.158895 systemd-logind[1507]: New session 12 of user core.
Sep 9 04:51:18.169398 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 9 04:51:18.311399 sshd[5344]: Connection closed by 10.0.0.1 port 35804
Sep 9 04:51:18.312072 sshd-session[5341]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:18.315924 systemd[1]: sshd@11-10.0.0.33:22-10.0.0.1:35804.service: Deactivated successfully.
Sep 9 04:51:18.318839 systemd[1]: session-12.scope: Deactivated successfully.
Sep 9 04:51:18.319825 systemd-logind[1507]: Session 12 logged out. Waiting for processes to exit.
Sep 9 04:51:18.321332 systemd-logind[1507]: Removed session 12.
Sep 9 04:51:23.329522 systemd[1]: Started sshd@12-10.0.0.33:22-10.0.0.1:36278.service - OpenSSH per-connection server daemon (10.0.0.1:36278).
Sep 9 04:51:23.390260 sshd[5369]: Accepted publickey for core from 10.0.0.1 port 36278 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:51:23.391372 sshd-session[5369]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:51:23.395087 systemd-logind[1507]: New session 13 of user core.
Sep 9 04:51:23.402409 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 9 04:51:23.526205 sshd[5372]: Connection closed by 10.0.0.1 port 36278
Sep 9 04:51:23.527145 sshd-session[5369]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:23.536340 systemd[1]: sshd@12-10.0.0.33:22-10.0.0.1:36278.service: Deactivated successfully.
Sep 9 04:51:23.538009 systemd[1]: session-13.scope: Deactivated successfully.
Sep 9 04:51:23.538632 systemd-logind[1507]: Session 13 logged out. Waiting for processes to exit.
Sep 9 04:51:23.540856 systemd[1]: Started sshd@13-10.0.0.33:22-10.0.0.1:36282.service - OpenSSH per-connection server daemon (10.0.0.1:36282).
Sep 9 04:51:23.541458 systemd-logind[1507]: Removed session 13.
Sep 9 04:51:23.545099 kubelet[2694]: I0909 04:51:23.545016 2694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 04:51:23.611729 sshd[5385]: Accepted publickey for core from 10.0.0.1 port 36282 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:51:23.612911 sshd-session[5385]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:51:23.616818 systemd-logind[1507]: New session 14 of user core.
Sep 9 04:51:23.630415 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 9 04:51:23.873480 sshd[5392]: Connection closed by 10.0.0.1 port 36282
Sep 9 04:51:23.873731 sshd-session[5385]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:23.885499 systemd[1]: sshd@13-10.0.0.33:22-10.0.0.1:36282.service: Deactivated successfully.
Sep 9 04:51:23.887218 systemd[1]: session-14.scope: Deactivated successfully.
Sep 9 04:51:23.887871 systemd-logind[1507]: Session 14 logged out. Waiting for processes to exit.
Sep 9 04:51:23.890603 systemd[1]: Started sshd@14-10.0.0.33:22-10.0.0.1:36298.service - OpenSSH per-connection server daemon (10.0.0.1:36298).
Sep 9 04:51:23.892695 systemd-logind[1507]: Removed session 14.
Sep 9 04:51:23.960126 sshd[5404]: Accepted publickey for core from 10.0.0.1 port 36298 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:51:23.961539 sshd-session[5404]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:51:23.965464 systemd-logind[1507]: New session 15 of user core.
Sep 9 04:51:23.971412 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 9 04:51:24.590385 sshd[5407]: Connection closed by 10.0.0.1 port 36298
Sep 9 04:51:24.591013 sshd-session[5404]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:24.604121 systemd[1]: sshd@14-10.0.0.33:22-10.0.0.1:36298.service: Deactivated successfully.
Sep 9 04:51:24.607863 systemd[1]: session-15.scope: Deactivated successfully.
Sep 9 04:51:24.609048 systemd-logind[1507]: Session 15 logged out. Waiting for processes to exit.
Sep 9 04:51:24.615196 systemd[1]: Started sshd@15-10.0.0.33:22-10.0.0.1:36304.service - OpenSSH per-connection server daemon (10.0.0.1:36304).
Sep 9 04:51:24.617215 systemd-logind[1507]: Removed session 15.
Sep 9 04:51:24.674370 sshd[5426]: Accepted publickey for core from 10.0.0.1 port 36304 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:51:24.675501 sshd-session[5426]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:51:24.679326 systemd-logind[1507]: New session 16 of user core.
Sep 9 04:51:24.687378 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 9 04:51:24.952355 sshd[5430]: Connection closed by 10.0.0.1 port 36304
Sep 9 04:51:24.952633 sshd-session[5426]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:24.963155 systemd[1]: sshd@15-10.0.0.33:22-10.0.0.1:36304.service: Deactivated successfully.
Sep 9 04:51:24.965106 systemd[1]: session-16.scope: Deactivated successfully.
Sep 9 04:51:24.965877 systemd-logind[1507]: Session 16 logged out. Waiting for processes to exit.
Sep 9 04:51:24.969461 systemd[1]: Started sshd@16-10.0.0.33:22-10.0.0.1:36314.service - OpenSSH per-connection server daemon (10.0.0.1:36314).
Sep 9 04:51:24.970868 systemd-logind[1507]: Removed session 16.
Sep 9 04:51:25.027554 sshd[5442]: Accepted publickey for core from 10.0.0.1 port 36314 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:51:25.028869 sshd-session[5442]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:51:25.033575 systemd-logind[1507]: New session 17 of user core.
Sep 9 04:51:25.044440 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 9 04:51:25.199028 sshd[5445]: Connection closed by 10.0.0.1 port 36314
Sep 9 04:51:25.199390 sshd-session[5442]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:25.203379 systemd[1]: sshd@16-10.0.0.33:22-10.0.0.1:36314.service: Deactivated successfully.
Sep 9 04:51:25.205812 systemd[1]: session-17.scope: Deactivated successfully.
Sep 9 04:51:25.206633 systemd-logind[1507]: Session 17 logged out. Waiting for processes to exit.
Sep 9 04:51:25.209045 systemd-logind[1507]: Removed session 17.
Sep 9 04:51:29.834956 containerd[1540]: time="2025-09-09T04:51:29.834908947Z" level=info msg="TaskExit event in podsandbox handler container_id:\"69b807bee31b1a67745fbeb46d3195c81686a14b773ccd204ca366b067fea8de\" id:\"8e06f5e70d2728fa6629fe3d28d51e6ab47ec2f1870aa599323bd5af53076ebd\" pid:5473 exited_at:{seconds:1757393489 nanos:834614546}"
Sep 9 04:51:30.214621 systemd[1]: Started sshd@17-10.0.0.33:22-10.0.0.1:38080.service - OpenSSH per-connection server daemon (10.0.0.1:38080).
Sep 9 04:51:30.280582 sshd[5486]: Accepted publickey for core from 10.0.0.1 port 38080 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:51:30.281935 sshd-session[5486]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:51:30.285589 systemd-logind[1507]: New session 18 of user core.
Sep 9 04:51:30.302512 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 04:51:30.461295 sshd[5489]: Connection closed by 10.0.0.1 port 38080
Sep 9 04:51:30.461467 sshd-session[5486]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:30.464968 systemd[1]: sshd@17-10.0.0.33:22-10.0.0.1:38080.service: Deactivated successfully.
Sep 9 04:51:30.466980 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 04:51:30.467928 systemd-logind[1507]: Session 18 logged out. Waiting for processes to exit.
Sep 9 04:51:30.468920 systemd-logind[1507]: Removed session 18.
Sep 9 04:51:35.476951 systemd[1]: Started sshd@18-10.0.0.33:22-10.0.0.1:38088.service - OpenSSH per-connection server daemon (10.0.0.1:38088).
Sep 9 04:51:35.539570 sshd[5507]: Accepted publickey for core from 10.0.0.1 port 38088 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:51:35.541181 sshd-session[5507]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:51:35.545744 systemd-logind[1507]: New session 19 of user core.
Sep 9 04:51:35.555464 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 04:51:35.712531 sshd[5510]: Connection closed by 10.0.0.1 port 38088
Sep 9 04:51:35.712877 sshd-session[5507]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:35.716541 systemd[1]: sshd@18-10.0.0.33:22-10.0.0.1:38088.service: Deactivated successfully.
Sep 9 04:51:35.718376 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 04:51:35.720945 systemd-logind[1507]: Session 19 logged out. Waiting for processes to exit.
Sep 9 04:51:35.722441 systemd-logind[1507]: Removed session 19.
Sep 9 04:51:40.729626 systemd[1]: Started sshd@19-10.0.0.33:22-10.0.0.1:37684.service - OpenSSH per-connection server daemon (10.0.0.1:37684).
Sep 9 04:51:40.812545 sshd[5530]: Accepted publickey for core from 10.0.0.1 port 37684 ssh2: RSA SHA256:BZm90Ok3j8HCXtlwShuWuMQDPsEE0kFrFWmP82ap/wE
Sep 9 04:51:40.814497 sshd-session[5530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 04:51:40.823816 systemd-logind[1507]: New session 20 of user core.
Sep 9 04:51:40.826353 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 04:51:40.847460 containerd[1540]: time="2025-09-09T04:51:40.847413194Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bf813f115aa448faabd775816459c085f16ba48e637c11dda1bc61c5c1d2e104\" id:\"47c354b109e3a9c4f8d1737b3a435d4eace0c22b868f1aa86572722023cf5aaf\" pid:5546 exited_at:{seconds:1757393500 nanos:846904192}"
Sep 9 04:51:41.022388 sshd[5552]: Connection closed by 10.0.0.1 port 37684
Sep 9 04:51:41.022776 sshd-session[5530]: pam_unix(sshd:session): session closed for user core
Sep 9 04:51:41.028954 systemd[1]: sshd@19-10.0.0.33:22-10.0.0.1:37684.service: Deactivated successfully.
Sep 9 04:51:41.032906 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 04:51:41.033737 systemd-logind[1507]: Session 20 logged out. Waiting for processes to exit.
Sep 9 04:51:41.035316 systemd-logind[1507]: Removed session 20.
Sep 9 04:51:42.698452 containerd[1540]: time="2025-09-09T04:51:42.698408902Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bf813f115aa448faabd775816459c085f16ba48e637c11dda1bc61c5c1d2e104\" id:\"2d8c0b249dda74ffc34ded44259a58f7cbf134dedd2875e8f14660572f5675c2\" pid:5579 exited_at:{seconds:1757393502 nanos:697648780}"