May 13 12:50:50.792190 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] May 13 12:50:50.792211 kernel: Linux version 6.12.28-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue May 13 11:28:23 -00 2025 May 13 12:50:50.792220 kernel: KASLR enabled May 13 12:50:50.792226 kernel: efi: EFI v2.7 by EDK II May 13 12:50:50.792231 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb832018 ACPI 2.0=0xdbfd0018 RNG=0xdbfd0a18 MEMRESERVE=0xdb838218 May 13 12:50:50.792237 kernel: random: crng init done May 13 12:50:50.792243 kernel: secureboot: Secure boot disabled May 13 12:50:50.792249 kernel: ACPI: Early table checksum verification disabled May 13 12:50:50.792255 kernel: ACPI: RSDP 0x00000000DBFD0018 000024 (v02 BOCHS ) May 13 12:50:50.792261 kernel: ACPI: XSDT 0x00000000DBFD0F18 000064 (v01 BOCHS BXPC 00000001 01000013) May 13 12:50:50.792267 kernel: ACPI: FACP 0x00000000DBFD0B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) May 13 12:50:50.792273 kernel: ACPI: DSDT 0x00000000DBF0E018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) May 13 12:50:50.792278 kernel: ACPI: APIC 0x00000000DBFD0C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) May 13 12:50:50.792284 kernel: ACPI: PPTT 0x00000000DBFD0098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001) May 13 12:50:50.792291 kernel: ACPI: GTDT 0x00000000DBFD0818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) May 13 12:50:50.792298 kernel: ACPI: MCFG 0x00000000DBFD0A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) May 13 12:50:50.792304 kernel: ACPI: SPCR 0x00000000DBFD0918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) May 13 12:50:50.792310 kernel: ACPI: DBG2 0x00000000DBFD0998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) May 13 12:50:50.792316 kernel: ACPI: IORT 0x00000000DBFD0198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 13 12:50:50.792322 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600 May 13 12:50:50.792328 kernel: ACPI: Use ACPI SPCR as default console: Yes May 13 12:50:50.792334 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff] May 13 12:50:50.792340 kernel: NODE_DATA(0) allocated [mem 0xdc964dc0-0xdc96bfff] May 13 12:50:50.792346 kernel: Zone ranges: May 13 12:50:50.792352 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff] May 13 12:50:50.792359 kernel: DMA32 empty May 13 12:50:50.792365 kernel: Normal empty May 13 12:50:50.792370 kernel: Device empty May 13 12:50:50.792376 kernel: Movable zone start for each node May 13 12:50:50.792382 kernel: Early memory node ranges May 13 12:50:50.792388 kernel: node 0: [mem 0x0000000040000000-0x00000000db81ffff] May 13 12:50:50.792394 kernel: node 0: [mem 0x00000000db820000-0x00000000db82ffff] May 13 12:50:50.792400 kernel: node 0: [mem 0x00000000db830000-0x00000000dc09ffff] May 13 12:50:50.792417 kernel: node 0: [mem 0x00000000dc0a0000-0x00000000dc2dffff] May 13 12:50:50.792423 kernel: node 0: [mem 0x00000000dc2e0000-0x00000000dc36ffff] May 13 12:50:50.792429 kernel: node 0: [mem 0x00000000dc370000-0x00000000dc45ffff] May 13 12:50:50.792435 kernel: node 0: [mem 0x00000000dc460000-0x00000000dc52ffff] May 13 12:50:50.792442 kernel: node 0: [mem 0x00000000dc530000-0x00000000dc5cffff] May 13 12:50:50.792448 kernel: node 0: [mem 0x00000000dc5d0000-0x00000000dce1ffff] May 13 12:50:50.792454 kernel: node 0: [mem 0x00000000dce20000-0x00000000dceaffff] May 13 12:50:50.792463 kernel: node 0: [mem 0x00000000dceb0000-0x00000000dcebffff] May 
13 12:50:50.792469 kernel: node 0: [mem 0x00000000dcec0000-0x00000000dcfdffff] May 13 12:50:50.792476 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff] May 13 12:50:50.792483 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff] May 13 12:50:50.792490 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges May 13 12:50:50.792496 kernel: psci: probing for conduit method from ACPI. May 13 12:50:50.792502 kernel: psci: PSCIv1.1 detected in firmware. May 13 12:50:50.792509 kernel: psci: Using standard PSCI v0.2 function IDs May 13 12:50:50.792515 kernel: psci: Trusted OS migration not required May 13 12:50:50.792521 kernel: psci: SMC Calling Convention v1.1 May 13 12:50:50.792528 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) May 13 12:50:50.792534 kernel: percpu: Embedded 33 pages/cpu s98136 r8192 d28840 u135168 May 13 12:50:50.792540 kernel: pcpu-alloc: s98136 r8192 d28840 u135168 alloc=33*4096 May 13 12:50:50.792548 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 May 13 12:50:50.792554 kernel: Detected PIPT I-cache on CPU0 May 13 12:50:50.792561 kernel: CPU features: detected: GIC system register CPU interface May 13 12:50:50.792567 kernel: CPU features: detected: Spectre-v4 May 13 12:50:50.792573 kernel: CPU features: detected: Spectre-BHB May 13 12:50:50.792580 kernel: CPU features: kernel page table isolation forced ON by KASLR May 13 12:50:50.792586 kernel: CPU features: detected: Kernel page table isolation (KPTI) May 13 12:50:50.792592 kernel: CPU features: detected: ARM erratum 1418040 May 13 12:50:50.792598 kernel: CPU features: detected: SSBS not fully self-synchronizing May 13 12:50:50.792605 kernel: alternatives: applying boot alternatives May 13 12:50:50.792612 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=b20e935bbd8772a1b0c6883755acb6e2a52b7a903a0b8e12c8ff59ca86b84928 May 13 12:50:50.792620 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 13 12:50:50.792627 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 13 12:50:50.792633 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 13 12:50:50.792639 kernel: Fallback order for Node 0: 0 May 13 12:50:50.792646 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072 May 13 12:50:50.792652 kernel: Policy zone: DMA May 13 12:50:50.792658 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 13 12:50:50.792664 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB May 13 12:50:50.792671 kernel: software IO TLB: area num 4. May 13 12:50:50.792677 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB May 13 12:50:50.792683 kernel: software IO TLB: mapped [mem 0x00000000d8c00000-0x00000000d9000000] (4MB) May 13 12:50:50.792690 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 May 13 12:50:50.792697 kernel: rcu: Preemptible hierarchical RCU implementation. May 13 12:50:50.792704 kernel: rcu: RCU event tracing is enabled. May 13 12:50:50.792711 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. May 13 12:50:50.792717 kernel: Trampoline variant of Tasks RCU enabled. 
May 13 12:50:50.792723 kernel: Tracing variant of Tasks RCU enabled. May 13 12:50:50.792729 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 13 12:50:50.792736 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 May 13 12:50:50.792742 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. May 13 12:50:50.792749 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. May 13 12:50:50.792755 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 May 13 12:50:50.792761 kernel: GICv3: 256 SPIs implemented May 13 12:50:50.792769 kernel: GICv3: 0 Extended SPIs implemented May 13 12:50:50.792775 kernel: Root IRQ handler: gic_handle_irq May 13 12:50:50.792781 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI May 13 12:50:50.792788 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 May 13 12:50:50.792794 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 May 13 12:50:50.792800 kernel: ITS [mem 0x08080000-0x0809ffff] May 13 12:50:50.792807 kernel: ITS@0x0000000008080000: allocated 8192 Devices @400e0000 (indirect, esz 8, psz 64K, shr 1) May 13 12:50:50.792813 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @400f0000 (flat, esz 8, psz 64K, shr 1) May 13 12:50:50.792820 kernel: GICv3: using LPI property table @0x0000000040100000 May 13 12:50:50.792826 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040110000 May 13 12:50:50.792832 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 13 12:50:50.792839 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 13 12:50:50.792846 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). May 13 12:50:50.792853 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns May 13 12:50:50.792859 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns May 13 12:50:50.792866 kernel: arm-pv: using stolen time PV May 13 12:50:50.792872 kernel: Console: colour dummy device 80x25 May 13 12:50:50.792879 kernel: ACPI: Core revision 20240827 May 13 12:50:50.792886 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) May 13 12:50:50.792892 kernel: pid_max: default: 32768 minimum: 301 May 13 12:50:50.792899 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima May 13 12:50:50.792907 kernel: landlock: Up and running. May 13 12:50:50.792913 kernel: SELinux: Initializing. May 13 12:50:50.792920 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 13 12:50:50.792933 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 13 12:50:50.792940 kernel: ACPI PPTT: PPTT table found, but unable to locate core 3 (3) May 13 12:50:50.792947 kernel: rcu: Hierarchical SRCU implementation. May 13 12:50:50.792953 kernel: rcu: Max phase no-delay instances is 400. May 13 12:50:50.792960 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level May 13 12:50:50.792966 kernel: Remapping and enabling EFI services. May 13 12:50:50.792975 kernel: smp: Bringing up secondary CPUs ... 
May 13 12:50:50.792986 kernel: Detected PIPT I-cache on CPU1 May 13 12:50:50.792993 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 May 13 12:50:50.793001 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040120000 May 13 12:50:50.793008 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 13 12:50:50.793015 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] May 13 12:50:50.793022 kernel: Detected PIPT I-cache on CPU2 May 13 12:50:50.793028 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 May 13 12:50:50.793035 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040130000 May 13 12:50:50.793043 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 13 12:50:50.793050 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] May 13 12:50:50.793057 kernel: Detected PIPT I-cache on CPU3 May 13 12:50:50.793064 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 May 13 12:50:50.793070 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040140000 May 13 12:50:50.793077 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 13 12:50:50.793084 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] May 13 12:50:50.793091 kernel: smp: Brought up 1 node, 4 CPUs May 13 12:50:50.793097 kernel: SMP: Total of 4 processors activated. May 13 12:50:50.793106 kernel: CPU: All CPU(s) started at EL1 May 13 12:50:50.793112 kernel: CPU features: detected: 32-bit EL0 Support May 13 12:50:50.793119 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence May 13 12:50:50.793126 kernel: CPU features: detected: Common not Private translations May 13 12:50:50.793133 kernel: CPU features: detected: CRC32 instructions May 13 12:50:50.793140 kernel: CPU features: detected: Enhanced Virtualization Traps May 13 12:50:50.793147 kernel: CPU features: detected: RCpc load-acquire (LDAPR) May 13 12:50:50.793153 kernel: CPU features: detected: LSE atomic instructions May 13 12:50:50.793160 kernel: CPU features: detected: Privileged Access Never May 13 12:50:50.793168 kernel: CPU features: detected: RAS Extension Support May 13 12:50:50.793175 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) May 13 12:50:50.793182 kernel: alternatives: applying system-wide alternatives May 13 12:50:50.793189 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3 May 13 12:50:50.793196 kernel: Memory: 2440916K/2572288K available (11072K kernel code, 2276K rwdata, 8932K rodata, 39488K init, 1034K bss, 125604K reserved, 0K cma-reserved) May 13 12:50:50.793203 kernel: devtmpfs: initialized May 13 12:50:50.793210 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 13 12:50:50.793217 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) May 13 12:50:50.793224 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL May 13 12:50:50.793232 kernel: 0 pages in range for non-PLT usage May 13 12:50:50.793239 kernel: 508528 pages in range for PLT usage May 13 12:50:50.793245 kernel: pinctrl core: initialized pinctrl subsystem May 13 12:50:50.793252 kernel: SMBIOS 3.0.0 present. 
May 13 12:50:50.793259 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022 May 13 12:50:50.793266 kernel: DMI: Memory slots populated: 1/1 May 13 12:50:50.793273 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 13 12:50:50.793280 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations May 13 12:50:50.793286 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations May 13 12:50:50.793295 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations May 13 12:50:50.793302 kernel: audit: initializing netlink subsys (disabled) May 13 12:50:50.793309 kernel: audit: type=2000 audit(0.029:1): state=initialized audit_enabled=0 res=1 May 13 12:50:50.793315 kernel: thermal_sys: Registered thermal governor 'step_wise' May 13 12:50:50.793322 kernel: cpuidle: using governor menu May 13 12:50:50.793329 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. May 13 12:50:50.793336 kernel: ASID allocator initialised with 32768 entries May 13 12:50:50.793343 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 13 12:50:50.793350 kernel: Serial: AMBA PL011 UART driver May 13 12:50:50.793358 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 13 12:50:50.793365 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page May 13 12:50:50.793372 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages May 13 12:50:50.793379 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page May 13 12:50:50.793385 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 13 12:50:50.793392 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page May 13 12:50:50.793399 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages May 13 12:50:50.793454 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page May 13 12:50:50.793462 kernel: ACPI: Added _OSI(Module Device) May 13 12:50:50.793472 kernel: ACPI: Added _OSI(Processor Device) May 13 12:50:50.793479 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 13 12:50:50.793486 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 13 12:50:50.793492 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 13 12:50:50.793499 kernel: ACPI: Interpreter enabled May 13 12:50:50.793506 kernel: ACPI: Using GIC for interrupt routing May 13 12:50:50.793513 kernel: ACPI: MCFG table detected, 1 entries May 13 12:50:50.793520 kernel: ACPI: CPU0 has been hot-added May 13 12:50:50.793526 kernel: ACPI: CPU1 has been hot-added May 13 12:50:50.793534 kernel: ACPI: CPU2 has been hot-added May 13 12:50:50.793541 kernel: ACPI: CPU3 has been hot-added May 13 12:50:50.793548 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA May 13 12:50:50.793555 kernel: printk: legacy console [ttyAMA0] enabled May 13 12:50:50.793562 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) May 13 12:50:50.793687 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 13 12:50:50.793751 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] May 13 12:50:50.793810 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] May 13 12:50:50.793870 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 May 13 12:50:50.793936 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] May 13 12:50:50.793946 kernel: ACPI: Remapped I/O 
0x000000003eff0000 to [io 0x0000-0xffff window] May 13 12:50:50.793953 kernel: PCI host bridge to bus 0000:00 May 13 12:50:50.794017 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] May 13 12:50:50.794073 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] May 13 12:50:50.794126 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] May 13 12:50:50.794179 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 13 12:50:50.794252 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint May 13 12:50:50.794320 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint May 13 12:50:50.794380 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f] May 13 12:50:50.794456 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff] May 13 12:50:50.794516 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] May 13 12:50:50.794574 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned May 13 12:50:50.794634 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned May 13 12:50:50.794693 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned May 13 12:50:50.794746 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] May 13 12:50:50.794797 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] May 13 12:50:50.794849 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] May 13 12:50:50.794857 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 May 13 12:50:50.794864 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 May 13 12:50:50.794873 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 May 13 12:50:50.794880 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 May 13 12:50:50.794886 kernel: iommu: Default domain type: Translated May 13 12:50:50.794893 kernel: iommu: DMA domain TLB invalidation policy: strict mode May 13 12:50:50.794900 kernel: efivars: Registered efivars operations May 13 12:50:50.794907 kernel: vgaarb: loaded May 13 12:50:50.794914 kernel: clocksource: Switched to clocksource arch_sys_counter May 13 12:50:50.794921 kernel: VFS: Disk quotas dquot_6.6.0 May 13 12:50:50.794934 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 13 12:50:50.794943 kernel: pnp: PnP ACPI init May 13 12:50:50.795016 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved May 13 12:50:50.795026 kernel: pnp: PnP ACPI: found 1 devices May 13 12:50:50.795033 kernel: NET: Registered PF_INET protocol family May 13 12:50:50.795041 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 13 12:50:50.795048 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 13 12:50:50.795055 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 13 12:50:50.795062 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) May 13 12:50:50.795071 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) May 13 12:50:50.795078 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 13 12:50:50.795084 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 13 12:50:50.795091 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 13 12:50:50.795098 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 13 
12:50:50.795105 kernel: PCI: CLS 0 bytes, default 64 May 13 12:50:50.795112 kernel: kvm [1]: HYP mode not available May 13 12:50:50.795118 kernel: Initialise system trusted keyrings May 13 12:50:50.795125 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 13 12:50:50.795133 kernel: Key type asymmetric registered May 13 12:50:50.795140 kernel: Asymmetric key parser 'x509' registered May 13 12:50:50.795147 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) May 13 12:50:50.795154 kernel: io scheduler mq-deadline registered May 13 12:50:50.795161 kernel: io scheduler kyber registered May 13 12:50:50.795167 kernel: io scheduler bfq registered May 13 12:50:50.795174 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 May 13 12:50:50.795181 kernel: ACPI: button: Power Button [PWRB] May 13 12:50:50.795188 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 May 13 12:50:50.795249 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007) May 13 12:50:50.795258 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 13 12:50:50.795265 kernel: thunder_xcv, ver 1.0 May 13 12:50:50.795271 kernel: thunder_bgx, ver 1.0 May 13 12:50:50.795278 kernel: nicpf, ver 1.0 May 13 12:50:50.795285 kernel: nicvf, ver 1.0 May 13 12:50:50.795350 kernel: rtc-efi rtc-efi.0: registered as rtc0 May 13 12:50:50.795418 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-05-13T12:50:50 UTC (1747140650) May 13 12:50:50.795430 kernel: hid: raw HID events driver (C) Jiri Kosina May 13 12:50:50.795438 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available May 13 12:50:50.795445 kernel: watchdog: NMI not fully supported May 13 12:50:50.795452 kernel: watchdog: Hard watchdog permanently disabled May 13 12:50:50.795458 kernel: NET: Registered PF_INET6 protocol family May 13 12:50:50.795465 kernel: Segment Routing with IPv6 May 13 12:50:50.795472 kernel: In-situ OAM (IOAM) with IPv6 May 13 12:50:50.795479 kernel: NET: Registered PF_PACKET protocol family May 13 12:50:50.795486 kernel: Key type dns_resolver registered May 13 12:50:50.795494 kernel: registered taskstats version 1 May 13 12:50:50.795501 kernel: Loading compiled-in X.509 certificates May 13 12:50:50.795508 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.28-flatcar: f8df872077a0531ef71a44c67653908e8a70c520' May 13 12:50:50.795515 kernel: Demotion targets for Node 0: null May 13 12:50:50.795522 kernel: Key type .fscrypt registered May 13 12:50:50.795528 kernel: Key type fscrypt-provisioning registered May 13 12:50:50.795535 kernel: ima: No TPM chip found, activating TPM-bypass! May 13 12:50:50.795542 kernel: ima: Allocated hash algorithm: sha1 May 13 12:50:50.795549 kernel: ima: No architecture policies found May 13 12:50:50.795557 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) May 13 12:50:50.795564 kernel: clk: Disabling unused clocks May 13 12:50:50.795570 kernel: PM: genpd: Disabling unused power domains May 13 12:50:50.795577 kernel: Warning: unable to open an initial console. 
May 13 12:50:50.795584 kernel: Freeing unused kernel memory: 39488K May 13 12:50:50.795591 kernel: Run /init as init process May 13 12:50:50.795598 kernel: with arguments: May 13 12:50:50.795604 kernel: /init May 13 12:50:50.795611 kernel: with environment: May 13 12:50:50.795618 kernel: HOME=/ May 13 12:50:50.795625 kernel: TERM=linux May 13 12:50:50.795632 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 13 12:50:50.795640 systemd[1]: Successfully made /usr/ read-only. May 13 12:50:50.795649 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 13 12:50:50.795657 systemd[1]: Detected virtualization kvm. May 13 12:50:50.795664 systemd[1]: Detected architecture arm64. May 13 12:50:50.795671 systemd[1]: Running in initrd. May 13 12:50:50.795679 systemd[1]: No hostname configured, using default hostname. May 13 12:50:50.795687 systemd[1]: Hostname set to . May 13 12:50:50.795694 systemd[1]: Initializing machine ID from VM UUID. May 13 12:50:50.795701 systemd[1]: Queued start job for default target initrd.target. May 13 12:50:50.795708 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 12:50:50.795716 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 12:50:50.795723 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 13 12:50:50.795731 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 13 12:50:50.795739 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 13 12:50:50.795748 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 13 12:50:50.795756 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 13 12:50:50.795763 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 13 12:50:50.795770 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 12:50:50.795778 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 13 12:50:50.795786 systemd[1]: Reached target paths.target - Path Units. May 13 12:50:50.795793 systemd[1]: Reached target slices.target - Slice Units. May 13 12:50:50.795801 systemd[1]: Reached target swap.target - Swaps. May 13 12:50:50.795808 systemd[1]: Reached target timers.target - Timer Units. May 13 12:50:50.795815 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 13 12:50:50.795822 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 13 12:50:50.795830 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 13 12:50:50.795837 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 13 12:50:50.795844 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 13 12:50:50.795853 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 13 12:50:50.795860 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
May 13 12:50:50.795867 systemd[1]: Reached target sockets.target - Socket Units. May 13 12:50:50.795875 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 13 12:50:50.795882 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 13 12:50:50.795889 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 13 12:50:50.795897 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 13 12:50:50.795904 systemd[1]: Starting systemd-fsck-usr.service... May 13 12:50:50.795913 systemd[1]: Starting systemd-journald.service - Journal Service... May 13 12:50:50.795920 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 13 12:50:50.795935 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 12:50:50.795942 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 13 12:50:50.795950 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 13 12:50:50.795959 systemd[1]: Finished systemd-fsck-usr.service. May 13 12:50:50.795967 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 13 12:50:50.795990 systemd-journald[244]: Collecting audit messages is disabled. May 13 12:50:50.796008 systemd-journald[244]: Journal started May 13 12:50:50.796028 systemd-journald[244]: Runtime Journal (/run/log/journal/4cf638ac634b40d885f8051f184a30c5) is 6M, max 48.5M, 42.4M free. May 13 12:50:50.797702 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 12:50:50.787058 systemd-modules-load[245]: Inserted module 'overlay' May 13 12:50:50.800656 systemd[1]: Started systemd-journald.service - Journal Service. May 13 12:50:50.801478 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 13 12:50:50.803335 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 12:50:50.808273 kernel: Bridge firewalling registered May 13 12:50:50.806930 systemd-modules-load[245]: Inserted module 'br_netfilter' May 13 12:50:50.807801 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 13 12:50:50.810967 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 13 12:50:50.812570 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 13 12:50:50.814510 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 13 12:50:50.821660 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 13 12:50:50.828447 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 13 12:50:50.829634 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 12:50:50.831359 systemd-tmpfiles[268]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 13 12:50:50.834078 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 12:50:50.836737 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
May 13 12:50:50.838710 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 12:50:50.841082 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 13 12:50:50.853982 dracut-cmdline[290]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=b20e935bbd8772a1b0c6883755acb6e2a52b7a903a0b8e12c8ff59ca86b84928 May 13 12:50:50.869585 systemd-resolved[288]: Positive Trust Anchors: May 13 12:50:50.869600 systemd-resolved[288]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 13 12:50:50.869631 systemd-resolved[288]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 13 12:50:50.874303 systemd-resolved[288]: Defaulting to hostname 'linux'. May 13 12:50:50.875204 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 13 12:50:50.878662 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 13 12:50:50.931436 kernel: SCSI subsystem initialized May 13 12:50:50.935422 kernel: Loading iSCSI transport class v2.0-870. May 13 12:50:50.943451 kernel: iscsi: registered transport (tcp) May 13 12:50:50.955422 kernel: iscsi: registered transport (qla4xxx) May 13 12:50:50.955442 kernel: QLogic iSCSI HBA Driver May 13 12:50:50.970349 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 13 12:50:50.985239 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 13 12:50:50.986768 systemd[1]: Reached target network-pre.target - Preparation for Network. May 13 12:50:51.031474 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 13 12:50:51.033678 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 13 12:50:51.096446 kernel: raid6: neonx8 gen() 15761 MB/s May 13 12:50:51.113442 kernel: raid6: neonx4 gen() 15747 MB/s May 13 12:50:51.130439 kernel: raid6: neonx2 gen() 13174 MB/s May 13 12:50:51.147430 kernel: raid6: neonx1 gen() 10504 MB/s May 13 12:50:51.164437 kernel: raid6: int64x8 gen() 6877 MB/s May 13 12:50:51.181439 kernel: raid6: int64x4 gen() 7325 MB/s May 13 12:50:51.198429 kernel: raid6: int64x2 gen() 6084 MB/s May 13 12:50:51.215497 kernel: raid6: int64x1 gen() 5046 MB/s May 13 12:50:51.215531 kernel: raid6: using algorithm neonx8 gen() 15761 MB/s May 13 12:50:51.233475 kernel: raid6: .... 
xor() 12061 MB/s, rmw enabled May 13 12:50:51.233490 kernel: raid6: using neon recovery algorithm May 13 12:50:51.238671 kernel: xor: measuring software checksum speed May 13 12:50:51.238696 kernel: 8regs : 21596 MB/sec May 13 12:50:51.239424 kernel: 32regs : 21699 MB/sec May 13 12:50:51.239436 kernel: arm64_neon : 24375 MB/sec May 13 12:50:51.240524 kernel: xor: using function: arm64_neon (24375 MB/sec) May 13 12:50:51.294433 kernel: Btrfs loaded, zoned=no, fsverity=no May 13 12:50:51.300365 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 13 12:50:51.302902 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 12:50:51.330207 systemd-udevd[498]: Using default interface naming scheme 'v255'. May 13 12:50:51.334245 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 12:50:51.336534 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 13 12:50:51.359072 dracut-pre-trigger[507]: rd.md=0: removing MD RAID activation May 13 12:50:51.379622 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 13 12:50:51.381744 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 13 12:50:51.429145 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 13 12:50:51.431650 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 13 12:50:51.473437 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues May 13 12:50:51.476835 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) May 13 12:50:51.479760 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 13 12:50:51.479800 kernel: GPT:9289727 != 19775487 May 13 12:50:51.479810 kernel: GPT:Alternate GPT header not at the end of the disk. May 13 12:50:51.480841 kernel: GPT:9289727 != 19775487 May 13 12:50:51.480876 kernel: GPT: Use GNU Parted to correct GPT errors. May 13 12:50:51.481680 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 13 12:50:51.481559 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 12:50:51.481683 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 12:50:51.483655 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 13 12:50:51.485373 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 12:50:51.511413 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. May 13 12:50:51.512823 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 12:50:51.519620 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 13 12:50:51.528704 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. May 13 12:50:51.534882 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. May 13 12:50:51.536060 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. May 13 12:50:51.545116 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 13 12:50:51.550591 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 13 12:50:51.551796 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. 
May 13 12:50:51.553854 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 13 12:50:51.556387 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 13 12:50:51.558114 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 13 12:50:51.577011 disk-uuid[593]: Primary Header is updated. May 13 12:50:51.577011 disk-uuid[593]: Secondary Entries is updated. May 13 12:50:51.577011 disk-uuid[593]: Secondary Header is updated. May 13 12:50:51.582292 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 13 12:50:51.585531 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 13 12:50:52.589435 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 13 12:50:52.592621 disk-uuid[597]: The operation has completed successfully. May 13 12:50:52.616703 systemd[1]: disk-uuid.service: Deactivated successfully. May 13 12:50:52.616806 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 13 12:50:52.640552 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 13 12:50:52.668988 sh[614]: Success May 13 12:50:52.682429 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 13 12:50:52.682462 kernel: device-mapper: uevent: version 1.0.3 May 13 12:50:52.686686 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 13 12:50:52.694424 kernel: device-mapper: verity: sha256 using shash "sha256-ce" May 13 12:50:52.718931 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 13 12:50:52.721594 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 13 12:50:52.748444 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 13 12:50:52.756549 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 13 12:50:52.756577 kernel: BTRFS: device fsid 5ded7f9d-c045-4eec-a161-ff9af5b01d28 devid 1 transid 40 /dev/mapper/usr (253:0) scanned by mount (626) May 13 12:50:52.757906 kernel: BTRFS info (device dm-0): first mount of filesystem 5ded7f9d-c045-4eec-a161-ff9af5b01d28 May 13 12:50:52.758856 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm May 13 12:50:52.758868 kernel: BTRFS info (device dm-0): using free-space-tree May 13 12:50:52.762921 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 13 12:50:52.764113 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 13 12:50:52.765531 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 13 12:50:52.766216 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 13 12:50:52.767817 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
May 13 12:50:52.786053 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (659) May 13 12:50:52.786088 kernel: BTRFS info (device vda6): first mount of filesystem 79dad06b-b9d3-4cc5-b052-ebf459e9d4d7 May 13 12:50:52.787052 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm May 13 12:50:52.787767 kernel: BTRFS info (device vda6): using free-space-tree May 13 12:50:52.793430 kernel: BTRFS info (device vda6): last unmount of filesystem 79dad06b-b9d3-4cc5-b052-ebf459e9d4d7 May 13 12:50:52.794260 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 13 12:50:52.796249 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 13 12:50:52.866429 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 13 12:50:52.870744 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 13 12:50:52.911429 systemd-networkd[806]: lo: Link UP May 13 12:50:52.911441 systemd-networkd[806]: lo: Gained carrier May 13 12:50:52.912193 systemd-networkd[806]: Enumeration completed May 13 12:50:52.912476 systemd[1]: Started systemd-networkd.service - Network Configuration. May 13 12:50:52.914188 systemd[1]: Reached target network.target - Network. May 13 12:50:52.915103 systemd-networkd[806]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 12:50:52.915107 systemd-networkd[806]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 12:50:52.915806 systemd-networkd[806]: eth0: Link UP May 13 12:50:52.915810 systemd-networkd[806]: eth0: Gained carrier May 13 12:50:52.915818 systemd-networkd[806]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 12:50:52.935558 ignition[704]: Ignition 2.21.0 May 13 12:50:52.935573 ignition[704]: Stage: fetch-offline May 13 12:50:52.935607 ignition[704]: no configs at "/usr/lib/ignition/base.d" May 13 12:50:52.936456 systemd-networkd[806]: eth0: DHCPv4 address 10.0.0.111/16, gateway 10.0.0.1 acquired from 10.0.0.1 May 13 12:50:52.935615 ignition[704]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 12:50:52.936006 ignition[704]: parsed url from cmdline: "" May 13 12:50:52.936014 ignition[704]: no config URL provided May 13 12:50:52.936019 ignition[704]: reading system config file "/usr/lib/ignition/user.ign" May 13 12:50:52.936027 ignition[704]: no config at "/usr/lib/ignition/user.ign" May 13 12:50:52.936048 ignition[704]: op(1): [started] loading QEMU firmware config module May 13 12:50:52.936057 ignition[704]: op(1): executing: "modprobe" "qemu_fw_cfg" May 13 12:50:52.945912 ignition[704]: op(1): [finished] loading QEMU firmware config module May 13 12:50:52.982862 ignition[704]: parsing config with SHA512: 08b37808cfacdeb3e744a93524ba97d4850aa516d83bba8a1d4c020442a38c195eb2b24ac47a5c7e6a87897c26f2713685b4ae175db5fd19fe78276c96a9bfad May 13 12:50:52.986891 unknown[704]: fetched base config from "system" May 13 12:50:52.986913 unknown[704]: fetched user config from "qemu" May 13 12:50:52.987277 ignition[704]: fetch-offline: fetch-offline passed May 13 12:50:52.988828 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
May 13 12:50:52.987328 ignition[704]: Ignition finished successfully May 13 12:50:52.991178 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). May 13 12:50:52.992048 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 13 12:50:53.020514 ignition[816]: Ignition 2.21.0 May 13 12:50:53.020530 ignition[816]: Stage: kargs May 13 12:50:53.020666 ignition[816]: no configs at "/usr/lib/ignition/base.d" May 13 12:50:53.020675 ignition[816]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 12:50:53.022468 ignition[816]: kargs: kargs passed May 13 12:50:53.022527 ignition[816]: Ignition finished successfully May 13 12:50:53.025541 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 13 12:50:53.027431 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 13 12:50:53.057102 ignition[824]: Ignition 2.21.0 May 13 12:50:53.057118 ignition[824]: Stage: disks May 13 12:50:53.057252 ignition[824]: no configs at "/usr/lib/ignition/base.d" May 13 12:50:53.057261 ignition[824]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 12:50:53.059962 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 13 12:50:53.058509 ignition[824]: disks: disks passed May 13 12:50:53.061610 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 13 12:50:53.058561 ignition[824]: Ignition finished successfully May 13 12:50:53.063286 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 13 12:50:53.064922 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 12:50:53.066712 systemd[1]: Reached target sysinit.target - System Initialization. May 13 12:50:53.068265 systemd[1]: Reached target basic.target - Basic System. May 13 12:50:53.070962 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 13 12:50:53.091505 systemd-fsck[833]: ROOT: clean, 15/553520 files, 52789/553472 blocks May 13 12:50:53.095148 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 13 12:50:53.097882 systemd[1]: Mounting sysroot.mount - /sysroot... May 13 12:50:53.167429 kernel: EXT4-fs (vda9): mounted filesystem 02660b30-6941-48da-9f0e-501a024e2c48 r/w with ordered data mode. Quota mode: none. May 13 12:50:53.168085 systemd[1]: Mounted sysroot.mount - /sysroot. May 13 12:50:53.169300 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 13 12:50:53.171606 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 12:50:53.173184 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 13 12:50:53.174188 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 13 12:50:53.174228 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 13 12:50:53.174250 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 13 12:50:53.180690 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 13 12:50:53.182734 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
May 13 12:50:53.187062 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (842) May 13 12:50:53.187094 kernel: BTRFS info (device vda6): first mount of filesystem 79dad06b-b9d3-4cc5-b052-ebf459e9d4d7 May 13 12:50:53.187104 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm May 13 12:50:53.188784 kernel: BTRFS info (device vda6): using free-space-tree May 13 12:50:53.191429 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 13 12:50:53.223285 initrd-setup-root[866]: cut: /sysroot/etc/passwd: No such file or directory May 13 12:50:53.226313 initrd-setup-root[873]: cut: /sysroot/etc/group: No such file or directory May 13 12:50:53.229137 initrd-setup-root[880]: cut: /sysroot/etc/shadow: No such file or directory May 13 12:50:53.233089 initrd-setup-root[887]: cut: /sysroot/etc/gshadow: No such file or directory May 13 12:50:53.305035 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 13 12:50:53.306991 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 13 12:50:53.308533 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 13 12:50:53.326430 kernel: BTRFS info (device vda6): last unmount of filesystem 79dad06b-b9d3-4cc5-b052-ebf459e9d4d7 May 13 12:50:53.345531 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 13 12:50:53.357109 ignition[956]: INFO : Ignition 2.21.0 May 13 12:50:53.357109 ignition[956]: INFO : Stage: mount May 13 12:50:53.358682 ignition[956]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 12:50:53.358682 ignition[956]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 12:50:53.361478 ignition[956]: INFO : mount: mount passed May 13 12:50:53.361478 ignition[956]: INFO : Ignition finished successfully May 13 12:50:53.362685 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 13 12:50:53.366155 systemd[1]: Starting ignition-files.service - Ignition (files)... May 13 12:50:53.884550 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 13 12:50:53.886121 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 12:50:53.919214 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (968) May 13 12:50:53.919247 kernel: BTRFS info (device vda6): first mount of filesystem 79dad06b-b9d3-4cc5-b052-ebf459e9d4d7 May 13 12:50:53.919257 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm May 13 12:50:53.920853 kernel: BTRFS info (device vda6): using free-space-tree May 13 12:50:53.923317 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 13 12:50:53.955798 ignition[986]: INFO : Ignition 2.21.0 May 13 12:50:53.955798 ignition[986]: INFO : Stage: files May 13 12:50:53.957437 ignition[986]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 12:50:53.957437 ignition[986]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 12:50:53.957437 ignition[986]: DEBUG : files: compiled without relabeling support, skipping May 13 12:50:53.960656 ignition[986]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 13 12:50:53.960656 ignition[986]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 13 12:50:53.960656 ignition[986]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 13 12:50:53.960656 ignition[986]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 13 12:50:53.965816 ignition[986]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 13 12:50:53.965816 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" May 13 12:50:53.965816 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 May 13 12:50:53.960677 unknown[986]: wrote ssh authorized keys file for user: core May 13 12:50:54.080652 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 13 12:50:54.119547 systemd-networkd[806]: eth0: Gained IPv6LL May 13 12:50:54.325468 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" May 13 12:50:54.325468 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 13 12:50:54.329093 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 13 12:50:54.329093 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 13 12:50:54.329093 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 13 12:50:54.329093 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 12:50:54.329093 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 12:50:54.329093 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 12:50:54.329093 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 12:50:54.340290 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 13 12:50:54.340290 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 13 12:50:54.340290 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" May 13 12:50:54.340290 ignition[986]: INFO : files: 
createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" May 13 12:50:54.340290 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" May 13 12:50:54.340290 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-arm64.raw: attempt #1 May 13 12:50:54.696188 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 13 12:50:54.946232 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" May 13 12:50:54.946232 ignition[986]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 13 12:50:54.949529 ignition[986]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 12:50:54.952594 ignition[986]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 12:50:54.952594 ignition[986]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 13 12:50:54.952594 ignition[986]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" May 13 12:50:54.956883 ignition[986]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" May 13 12:50:54.956883 ignition[986]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" May 13 12:50:54.956883 ignition[986]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" May 13 12:50:54.956883 ignition[986]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" May 13 12:50:54.967819 ignition[986]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" May 13 12:50:54.970614 ignition[986]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" May 13 12:50:54.972115 ignition[986]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" May 13 12:50:54.972115 ignition[986]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" May 13 12:50:54.972115 ignition[986]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" May 13 12:50:54.972115 ignition[986]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" May 13 12:50:54.972115 ignition[986]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" May 13 12:50:54.972115 ignition[986]: INFO : files: files passed May 13 12:50:54.972115 ignition[986]: INFO : Ignition finished successfully May 13 12:50:54.972911 systemd[1]: Finished ignition-files.service - Ignition (files). May 13 12:50:54.976546 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 13 12:50:54.978584 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
May 13 12:50:54.993577 systemd[1]: ignition-quench.service: Deactivated successfully. May 13 12:50:54.994829 initrd-setup-root-after-ignition[1014]: grep: /sysroot/oem/oem-release: No such file or directory May 13 12:50:54.995450 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 13 12:50:54.998324 initrd-setup-root-after-ignition[1016]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 12:50:54.998324 initrd-setup-root-after-ignition[1016]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 13 12:50:55.002194 initrd-setup-root-after-ignition[1020]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 12:50:55.002548 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 13 12:50:55.005675 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 13 12:50:55.007327 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 13 12:50:55.036388 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 13 12:50:55.036497 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 13 12:50:55.038633 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 13 12:50:55.040420 systemd[1]: Reached target initrd.target - Initrd Default Target. May 13 12:50:55.042161 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 13 12:50:55.042862 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 13 12:50:55.056252 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 12:50:55.058633 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 13 12:50:55.074754 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 13 12:50:55.075954 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 12:50:55.077960 systemd[1]: Stopped target timers.target - Timer Units. May 13 12:50:55.079686 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 13 12:50:55.079792 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 12:50:55.082207 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 13 12:50:55.083277 systemd[1]: Stopped target basic.target - Basic System. May 13 12:50:55.085081 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 13 12:50:55.086850 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 13 12:50:55.088553 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 13 12:50:55.090417 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 13 12:50:55.092372 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 13 12:50:55.094213 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 13 12:50:55.096206 systemd[1]: Stopped target sysinit.target - System Initialization. May 13 12:50:55.097939 systemd[1]: Stopped target local-fs.target - Local File Systems. May 13 12:50:55.099815 systemd[1]: Stopped target swap.target - Swaps. May 13 12:50:55.101302 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 13 12:50:55.101430 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. 
May 13 12:50:55.103737 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 13 12:50:55.105570 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 12:50:55.107436 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 13 12:50:55.107537 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 12:50:55.109438 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 13 12:50:55.109549 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 13 12:50:55.112255 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 13 12:50:55.112358 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 13 12:50:55.114727 systemd[1]: Stopped target paths.target - Path Units. May 13 12:50:55.116213 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 13 12:50:55.119461 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 12:50:55.121101 systemd[1]: Stopped target slices.target - Slice Units. May 13 12:50:55.122779 systemd[1]: Stopped target sockets.target - Socket Units. May 13 12:50:55.124844 systemd[1]: iscsid.socket: Deactivated successfully. May 13 12:50:55.124931 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 13 12:50:55.126418 systemd[1]: iscsiuio.socket: Deactivated successfully. May 13 12:50:55.126495 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 13 12:50:55.128044 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 13 12:50:55.128153 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 13 12:50:55.129749 systemd[1]: ignition-files.service: Deactivated successfully. May 13 12:50:55.129859 systemd[1]: Stopped ignition-files.service - Ignition (files). May 13 12:50:55.132085 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 13 12:50:55.134365 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 13 12:50:55.135375 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 13 12:50:55.135511 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 13 12:50:55.137589 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 13 12:50:55.137686 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 13 12:50:55.142513 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 13 12:50:55.146547 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 13 12:50:55.154018 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 13 12:50:55.158805 systemd[1]: sysroot-boot.service: Deactivated successfully. May 13 12:50:55.158907 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 13 12:50:55.163573 ignition[1040]: INFO : Ignition 2.21.0 May 13 12:50:55.163573 ignition[1040]: INFO : Stage: umount May 13 12:50:55.163573 ignition[1040]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 12:50:55.163573 ignition[1040]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 12:50:55.163573 ignition[1040]: INFO : umount: umount passed May 13 12:50:55.163573 ignition[1040]: INFO : Ignition finished successfully May 13 12:50:55.162402 systemd[1]: ignition-mount.service: Deactivated successfully. 
May 13 12:50:55.162504 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 13 12:50:55.164878 systemd[1]: Stopped target network.target - Network. May 13 12:50:55.165903 systemd[1]: ignition-disks.service: Deactivated successfully. May 13 12:50:55.165968 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 13 12:50:55.167339 systemd[1]: ignition-kargs.service: Deactivated successfully. May 13 12:50:55.167386 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 13 12:50:55.169128 systemd[1]: ignition-setup.service: Deactivated successfully. May 13 12:50:55.169177 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 13 12:50:55.170959 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 13 12:50:55.171000 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 13 12:50:55.172554 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 13 12:50:55.172602 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 13 12:50:55.174348 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 13 12:50:55.175980 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 13 12:50:55.183833 systemd[1]: systemd-resolved.service: Deactivated successfully. May 13 12:50:55.183942 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 13 12:50:55.186960 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 13 12:50:55.187221 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 13 12:50:55.187264 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 12:50:55.190744 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 13 12:50:55.190936 systemd[1]: systemd-networkd.service: Deactivated successfully. May 13 12:50:55.191023 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 13 12:50:55.193685 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 13 12:50:55.194032 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 13 12:50:55.195929 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 13 12:50:55.195964 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 13 12:50:55.198677 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 13 12:50:55.200521 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 13 12:50:55.200581 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 13 12:50:55.201862 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 13 12:50:55.201916 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 13 12:50:55.204964 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 13 12:50:55.205011 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 13 12:50:55.207257 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 12:50:55.210497 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 13 12:50:55.225048 systemd[1]: systemd-udevd.service: Deactivated successfully. May 13 12:50:55.228579 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. 
May 13 12:50:55.230074 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 13 12:50:55.230111 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 13 12:50:55.231937 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 13 12:50:55.231964 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 13 12:50:55.233700 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 13 12:50:55.233746 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 13 12:50:55.236200 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 13 12:50:55.236240 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 13 12:50:55.238144 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 13 12:50:55.238192 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 12:50:55.241575 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 13 12:50:55.242660 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 13 12:50:55.242714 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. May 13 12:50:55.245398 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 13 12:50:55.245463 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 12:50:55.248574 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 13 12:50:55.248615 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 12:50:55.251905 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 13 12:50:55.251948 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 13 12:50:55.253939 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 12:50:55.253985 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 12:50:55.257499 systemd[1]: network-cleanup.service: Deactivated successfully. May 13 12:50:55.257582 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 13 12:50:55.262360 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 13 12:50:55.264170 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 13 12:50:55.265520 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 13 12:50:55.267825 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 13 12:50:55.297128 systemd[1]: Switching root. May 13 12:50:55.322941 systemd-journald[244]: Journal stopped May 13 12:50:56.047274 systemd-journald[244]: Received SIGTERM from PID 1 (systemd). 
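The last few entries mark the hand-off out of the initramfs: initrd units are torn down, udev and networkd sockets are closed, the journal is stopped, and PID 1 switches root. As a small illustrative aside (using nothing beyond two timestamps printed above), the journal prefix format used throughout this log can be parsed to measure how long this tail of the initrd phase took; the prefix carries no year, so it is simply ignored here:

    from datetime import datetime

    # Timestamp format used by the journal lines in this log (no year field).
    FMT = "%b %d %H:%M:%S.%f"

    files_stage_start = datetime.strptime("May 13 12:50:53.955798", FMT)  # Ignition "files" stage begins
    journal_stopped   = datetime.strptime("May 13 12:50:55.322941", FMT)  # journald stops for the root switch

    elapsed = journal_stopped - files_stage_start
    print(f"files stage through switch-root: ~{elapsed.total_seconds():.3f} s")  # ~1.367 s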
May 13 12:50:56.047326 kernel: SELinux: policy capability network_peer_controls=1 May 13 12:50:56.047337 kernel: SELinux: policy capability open_perms=1 May 13 12:50:56.047349 kernel: SELinux: policy capability extended_socket_class=1 May 13 12:50:56.047360 kernel: SELinux: policy capability always_check_network=0 May 13 12:50:56.047373 kernel: SELinux: policy capability cgroup_seclabel=1 May 13 12:50:56.047382 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 13 12:50:56.047391 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 13 12:50:56.047400 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 13 12:50:56.047440 kernel: SELinux: policy capability userspace_initial_context=0 May 13 12:50:56.047450 kernel: audit: type=1403 audit(1747140655.463:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 13 12:50:56.047464 systemd[1]: Successfully loaded SELinux policy in 30.951ms. May 13 12:50:56.047483 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.237ms. May 13 12:50:56.047494 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 13 12:50:56.047506 systemd[1]: Detected virtualization kvm. May 13 12:50:56.047516 systemd[1]: Detected architecture arm64. May 13 12:50:56.047526 systemd[1]: Detected first boot. May 13 12:50:56.047536 systemd[1]: Initializing machine ID from VM UUID. May 13 12:50:56.047546 zram_generator::config[1085]: No configuration found. May 13 12:50:56.047556 kernel: NET: Registered PF_VSOCK protocol family May 13 12:50:56.047565 systemd[1]: Populated /etc with preset unit settings. May 13 12:50:56.047576 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 13 12:50:56.047588 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 13 12:50:56.047601 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 13 12:50:56.047611 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 13 12:50:56.047621 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 13 12:50:56.047631 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 13 12:50:56.047642 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 13 12:50:56.047651 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 13 12:50:56.047663 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 13 12:50:56.047673 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 13 12:50:56.047685 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 13 12:50:56.047695 systemd[1]: Created slice user.slice - User and Session Slice. May 13 12:50:56.047705 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 12:50:56.047715 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 12:50:56.047725 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
May 13 12:50:56.047735 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 13 12:50:56.047745 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 13 12:50:56.047755 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 13 12:50:56.047767 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... May 13 12:50:56.047777 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 12:50:56.047787 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 13 12:50:56.047798 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 13 12:50:56.047807 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 13 12:50:56.047817 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 13 12:50:56.047827 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 13 12:50:56.047837 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 12:50:56.047849 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 13 12:50:56.047864 systemd[1]: Reached target slices.target - Slice Units. May 13 12:50:56.047877 systemd[1]: Reached target swap.target - Swaps. May 13 12:50:56.047888 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 13 12:50:56.047897 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 13 12:50:56.047907 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 13 12:50:56.047917 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 13 12:50:56.047927 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 13 12:50:56.047937 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 13 12:50:56.047947 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 13 12:50:56.047959 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 13 12:50:56.047969 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 13 12:50:56.047979 systemd[1]: Mounting media.mount - External Media Directory... May 13 12:50:56.047989 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 13 12:50:56.047999 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 13 12:50:56.048008 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 13 12:50:56.048019 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 13 12:50:56.048029 systemd[1]: Reached target machines.target - Containers. May 13 12:50:56.048040 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 13 12:50:56.048050 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 12:50:56.048061 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 13 12:50:56.048071 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 13 12:50:56.048081 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
May 13 12:50:56.048090 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 13 12:50:56.048101 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 12:50:56.048111 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 13 12:50:56.048122 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 12:50:56.048133 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 13 12:50:56.048143 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 13 12:50:56.048153 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 13 12:50:56.048163 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 13 12:50:56.048173 kernel: fuse: init (API version 7.41) May 13 12:50:56.048182 systemd[1]: Stopped systemd-fsck-usr.service. May 13 12:50:56.048192 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 12:50:56.048202 kernel: loop: module loaded May 13 12:50:56.048213 systemd[1]: Starting systemd-journald.service - Journal Service... May 13 12:50:56.048223 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 13 12:50:56.048237 kernel: ACPI: bus type drm_connector registered May 13 12:50:56.048247 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 13 12:50:56.048257 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 13 12:50:56.048267 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 13 12:50:56.048278 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 13 12:50:56.048289 systemd[1]: verity-setup.service: Deactivated successfully. May 13 12:50:56.048299 systemd[1]: Stopped verity-setup.service. May 13 12:50:56.048309 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 13 12:50:56.048319 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 13 12:50:56.048329 systemd[1]: Mounted media.mount - External Media Directory. May 13 12:50:56.048342 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 13 12:50:56.048352 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 13 12:50:56.048364 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 13 12:50:56.048393 systemd-journald[1161]: Collecting audit messages is disabled. May 13 12:50:56.048425 systemd-journald[1161]: Journal started May 13 12:50:56.048448 systemd-journald[1161]: Runtime Journal (/run/log/journal/4cf638ac634b40d885f8051f184a30c5) is 6M, max 48.5M, 42.4M free. May 13 12:50:56.048485 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 13 12:50:55.821765 systemd[1]: Queued start job for default target multi-user.target. May 13 12:50:55.845207 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. May 13 12:50:55.845578 systemd[1]: systemd-journald.service: Deactivated successfully. May 13 12:50:56.052424 systemd[1]: Started systemd-journald.service - Journal Service. 
May 13 12:50:56.053144 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 13 12:50:56.054756 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 13 12:50:56.054939 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 13 12:50:56.056312 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 12:50:56.056502 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 12:50:56.057776 systemd[1]: modprobe@drm.service: Deactivated successfully. May 13 12:50:56.057943 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 12:50:56.059233 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 12:50:56.059396 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 12:50:56.060892 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 13 12:50:56.061054 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 13 12:50:56.062327 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 12:50:56.062493 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 12:50:56.063771 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 13 12:50:56.065073 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 13 12:50:56.066547 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 13 12:50:56.069434 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 13 12:50:56.081924 systemd[1]: Reached target network-pre.target - Preparation for Network. May 13 12:50:56.084321 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 13 12:50:56.086300 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 13 12:50:56.087510 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 13 12:50:56.087537 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 12:50:56.089325 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 13 12:50:56.095276 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 13 12:50:56.096615 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 12:50:56.097876 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 13 12:50:56.099802 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 13 12:50:56.100982 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 12:50:56.102126 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 13 12:50:56.103446 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 12:50:56.106535 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 13 12:50:56.107476 systemd-journald[1161]: Time spent on flushing to /var/log/journal/4cf638ac634b40d885f8051f184a30c5 is 20.574ms for 883 entries. 
May 13 12:50:56.107476 systemd-journald[1161]: System Journal (/var/log/journal/4cf638ac634b40d885f8051f184a30c5) is 8M, max 195.6M, 187.6M free. May 13 12:50:56.140812 systemd-journald[1161]: Received client request to flush runtime journal. May 13 12:50:56.140856 kernel: loop0: detected capacity change from 0 to 107312 May 13 12:50:56.140884 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 13 12:50:56.109565 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 13 12:50:56.111695 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 13 12:50:56.115432 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 13 12:50:56.116794 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 13 12:50:56.118120 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 13 12:50:56.119636 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 13 12:50:56.123345 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 13 12:50:56.127621 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 13 12:50:56.143493 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 13 12:50:56.146763 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 13 12:50:56.162753 systemd-tmpfiles[1204]: ACLs are not supported, ignoring. May 13 12:50:56.162772 systemd-tmpfiles[1204]: ACLs are not supported, ignoring. May 13 12:50:56.166841 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 12:50:56.168529 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 13 12:50:56.171759 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 13 12:50:56.172461 kernel: loop1: detected capacity change from 0 to 138376 May 13 12:50:56.198249 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 13 12:50:56.200783 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 13 12:50:56.205419 kernel: loop2: detected capacity change from 0 to 189592 May 13 12:50:56.226913 systemd-tmpfiles[1226]: ACLs are not supported, ignoring. May 13 12:50:56.226928 systemd-tmpfiles[1226]: ACLs are not supported, ignoring. May 13 12:50:56.230483 kernel: loop3: detected capacity change from 0 to 107312 May 13 12:50:56.231067 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 12:50:56.243434 kernel: loop4: detected capacity change from 0 to 138376 May 13 12:50:56.249432 kernel: loop5: detected capacity change from 0 to 189592 May 13 12:50:56.254425 (sd-merge)[1229]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. May 13 12:50:56.254792 (sd-merge)[1229]: Merged extensions into '/usr'. May 13 12:50:56.258160 systemd[1]: Reload requested from client PID 1203 ('systemd-sysext') (unit systemd-sysext.service)... May 13 12:50:56.258177 systemd[1]: Reloading... May 13 12:50:56.309572 zram_generator::config[1256]: No configuration found. May 13 12:50:56.357419 ldconfig[1198]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. 
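The loop0 through loop5 capacity changes and the sd-merge messages above correspond to systemd-sysext overlaying the Flatcar extension images ('containerd-flatcar', 'docker-flatcar', 'kubernetes') onto /usr, which is what triggers the reload that follows. As a hedged sketch (the directory below is the conventional sysext location, not something this log states), the merged extensions could be enumerated on such a host by reading their extension-release files:

    import glob
    import os

    # After systemd-sysext merges extension images, each image's release file
    # becomes visible in the combined /usr tree under this directory.
    RELEASE_DIR = "/usr/lib/extension-release.d"

    for path in sorted(glob.glob(os.path.join(RELEASE_DIR, "extension-release.*"))):
        name = os.path.basename(path).removeprefix("extension-release.")
        fields = {}
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    fields[key] = value.strip('"')
        print(name, fields.get("ID", "?"), fields.get("VERSION_ID", fields.get("SYSEXT_LEVEL", "")))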
May 13 12:50:56.382373 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 12:50:56.443925 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 13 12:50:56.444021 systemd[1]: Reloading finished in 185 ms. May 13 12:50:56.462438 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 13 12:50:56.463874 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 13 12:50:56.481659 systemd[1]: Starting ensure-sysext.service... May 13 12:50:56.483390 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 13 12:50:56.493540 systemd[1]: Reload requested from client PID 1290 ('systemctl') (unit ensure-sysext.service)... May 13 12:50:56.493558 systemd[1]: Reloading... May 13 12:50:56.504655 systemd-tmpfiles[1291]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 13 12:50:56.504686 systemd-tmpfiles[1291]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 13 12:50:56.504920 systemd-tmpfiles[1291]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 13 12:50:56.505097 systemd-tmpfiles[1291]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 13 12:50:56.505703 systemd-tmpfiles[1291]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 13 12:50:56.505917 systemd-tmpfiles[1291]: ACLs are not supported, ignoring. May 13 12:50:56.505964 systemd-tmpfiles[1291]: ACLs are not supported, ignoring. May 13 12:50:56.508532 systemd-tmpfiles[1291]: Detected autofs mount point /boot during canonicalization of boot. May 13 12:50:56.508543 systemd-tmpfiles[1291]: Skipping /boot May 13 12:50:56.516741 systemd-tmpfiles[1291]: Detected autofs mount point /boot during canonicalization of boot. May 13 12:50:56.516756 systemd-tmpfiles[1291]: Skipping /boot May 13 12:50:56.541457 zram_generator::config[1318]: No configuration found. May 13 12:50:56.605620 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 12:50:56.667361 systemd[1]: Reloading finished in 173 ms. May 13 12:50:56.679445 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 13 12:50:56.685003 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 12:50:56.695520 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 12:50:56.697836 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 13 12:50:56.699999 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 13 12:50:56.704127 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 13 12:50:56.707329 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 12:50:56.710533 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 13 12:50:56.718045 systemd[1]: Starting systemd-userdbd.service - User Database Manager... 
May 13 12:50:56.723420 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 12:50:56.724522 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 12:50:56.726653 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 12:50:56.730727 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 12:50:56.731806 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 12:50:56.731984 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 12:50:56.738529 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 13 12:50:56.743106 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 12:50:56.743262 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 12:50:56.743713 systemd-udevd[1364]: Using default interface naming scheme 'v255'. May 13 12:50:56.747797 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 12:50:56.747962 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 12:50:56.750687 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 13 12:50:56.752729 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 12:50:56.752889 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 12:50:56.760565 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 13 12:50:56.763082 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 12:50:56.764706 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 12:50:56.765902 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 12:50:56.766015 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 12:50:56.766136 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 12:50:56.767257 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 13 12:50:56.770052 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 13 12:50:56.770682 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 12:50:56.773439 augenrules[1391]: No rules May 13 12:50:56.775503 systemd[1]: audit-rules.service: Deactivated successfully. May 13 12:50:56.777569 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 12:50:56.779182 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 12:50:56.779310 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
May 13 12:50:56.787537 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 13 12:50:56.794286 systemd[1]: Finished ensure-sysext.service. May 13 12:50:56.804192 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 12:50:56.806576 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 12:50:56.807515 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 12:50:56.813897 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 13 12:50:56.820579 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 12:50:56.824240 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 12:50:56.825368 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 12:50:56.825424 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 12:50:56.828644 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 13 12:50:56.837514 augenrules[1429]: /sbin/augenrules: No change May 13 12:50:56.838208 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 13 12:50:56.840535 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 13 12:50:56.841187 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 13 12:50:56.845495 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 12:50:56.845659 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 12:50:56.847397 augenrules[1454]: No rules May 13 12:50:56.849791 systemd[1]: audit-rules.service: Deactivated successfully. May 13 12:50:56.851449 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 12:50:56.852682 systemd[1]: modprobe@drm.service: Deactivated successfully. May 13 12:50:56.852825 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 12:50:56.855063 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 12:50:56.856445 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 12:50:56.858575 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 12:50:56.858889 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 12:50:56.877907 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. May 13 12:50:56.882519 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 13 12:50:56.885563 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 13 12:50:56.886744 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 12:50:56.886790 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. 
May 13 12:50:56.917449 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 13 12:50:56.975913 systemd-networkd[1444]: lo: Link UP May 13 12:50:56.975920 systemd-networkd[1444]: lo: Gained carrier May 13 12:50:56.976425 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 13 12:50:56.978326 systemd[1]: Reached target time-set.target - System Time Set. May 13 12:50:56.979245 systemd-networkd[1444]: Enumeration completed May 13 12:50:56.979671 systemd-networkd[1444]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 12:50:56.979678 systemd-networkd[1444]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 12:50:56.980605 systemd[1]: Started systemd-networkd.service - Network Configuration. May 13 12:50:56.981756 systemd-networkd[1444]: eth0: Link UP May 13 12:50:56.981879 systemd-networkd[1444]: eth0: Gained carrier May 13 12:50:56.981899 systemd-networkd[1444]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 12:50:56.982943 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 13 12:50:56.985623 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 13 12:50:56.988107 systemd-resolved[1358]: Positive Trust Anchors: May 13 12:50:56.988124 systemd-resolved[1358]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 13 12:50:56.988155 systemd-resolved[1358]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 13 12:50:56.994789 systemd-resolved[1358]: Defaulting to hostname 'linux'. May 13 12:50:56.995451 systemd-networkd[1444]: eth0: DHCPv4 address 10.0.0.111/16, gateway 10.0.0.1 acquired from 10.0.0.1 May 13 12:50:56.996488 systemd-timesyncd[1447]: Network configuration changed, trying to establish connection. May 13 12:50:56.997385 systemd-timesyncd[1447]: Contacted time server 10.0.0.1:123 (10.0.0.1). May 13 12:50:56.997524 systemd-timesyncd[1447]: Initial clock synchronization to Tue 2025-05-13 12:50:57.211696 UTC. May 13 12:50:57.000215 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 13 12:50:57.001539 systemd[1]: Reached target network.target - Network. May 13 12:50:57.002722 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 13 12:50:57.003965 systemd[1]: Reached target sysinit.target - System Initialization. May 13 12:50:57.005863 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 13 12:50:57.007207 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 13 12:50:57.008615 systemd[1]: Started logrotate.timer - Daily rotation of log files. 
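The networkd entries above show eth0 coming up with a DHCPv4 lease of 10.0.0.111/16 and gateway 10.0.0.1, which timesyncd then uses as its NTP server. Purely as a worked illustration of what that lease implies, Python's ipaddress module can derive the surrounding network:

    import ipaddress

    # Values taken from the DHCPv4 lease logged above.
    iface = ipaddress.ip_interface("10.0.0.111/16")
    gateway = ipaddress.ip_address("10.0.0.1")

    print(iface.network)                     # 10.0.0.0/16
    print(iface.network.netmask)             # 255.255.0.0
    print(iface.network.broadcast_address)   # 10.0.255.255
    print(gateway in iface.network)          # True: the gateway is on-link
    print(iface.network.num_addresses - 2)   # 65534 usable host addresses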
May 13 12:50:57.010558 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 13 12:50:57.011799 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 13 12:50:57.013043 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 13 12:50:57.013082 systemd[1]: Reached target paths.target - Path Units. May 13 12:50:57.014670 systemd[1]: Reached target timers.target - Timer Units. May 13 12:50:57.016648 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 13 12:50:57.019690 systemd[1]: Starting docker.socket - Docker Socket for the API... May 13 12:50:57.023790 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 13 12:50:57.025305 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 13 12:50:57.026859 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 13 12:50:57.030134 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 13 12:50:57.031913 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 13 12:50:57.034556 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 13 12:50:57.036589 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 13 12:50:57.039561 systemd[1]: Reached target sockets.target - Socket Units. May 13 12:50:57.040641 systemd[1]: Reached target basic.target - Basic System. May 13 12:50:57.041981 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 13 12:50:57.042010 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 13 12:50:57.043056 systemd[1]: Starting containerd.service - containerd container runtime... May 13 12:50:57.045646 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 13 12:50:57.049564 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 13 12:50:57.051654 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 13 12:50:57.053707 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 13 12:50:57.054721 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 13 12:50:57.055694 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 13 12:50:57.058309 jq[1500]: false May 13 12:50:57.058649 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 13 12:50:57.060605 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 13 12:50:57.062614 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 13 12:50:57.067210 systemd[1]: Starting systemd-logind.service - User Login Management... May 13 12:50:57.069137 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 13 12:50:57.069643 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. 
May 13 12:50:57.071564 systemd[1]: Starting update-engine.service - Update Engine... May 13 12:50:57.073562 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 13 12:50:57.079674 extend-filesystems[1501]: Found loop3 May 13 12:50:57.080596 extend-filesystems[1501]: Found loop4 May 13 12:50:57.080596 extend-filesystems[1501]: Found loop5 May 13 12:50:57.080596 extend-filesystems[1501]: Found vda May 13 12:50:57.080596 extend-filesystems[1501]: Found vda1 May 13 12:50:57.080596 extend-filesystems[1501]: Found vda2 May 13 12:50:57.080596 extend-filesystems[1501]: Found vda3 May 13 12:50:57.080596 extend-filesystems[1501]: Found usr May 13 12:50:57.080596 extend-filesystems[1501]: Found vda4 May 13 12:50:57.080596 extend-filesystems[1501]: Found vda6 May 13 12:50:57.080596 extend-filesystems[1501]: Found vda7 May 13 12:50:57.080596 extend-filesystems[1501]: Found vda9 May 13 12:50:57.080596 extend-filesystems[1501]: Checking size of /dev/vda9 May 13 12:50:57.086851 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 13 12:50:57.088840 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 13 12:50:57.089099 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 13 12:50:57.095819 jq[1516]: true May 13 12:50:57.089599 systemd[1]: motdgen.service: Deactivated successfully. May 13 12:50:57.089776 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 13 12:50:57.096126 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 13 12:50:57.096582 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 13 12:50:57.098169 extend-filesystems[1501]: Resized partition /dev/vda9 May 13 12:50:57.110456 extend-filesystems[1524]: resize2fs 1.47.2 (1-Jan-2025) May 13 12:50:57.122872 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks May 13 12:50:57.119065 (ntainerd)[1526]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 13 12:50:57.123270 jq[1525]: true May 13 12:50:57.125417 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 12:50:57.138191 kernel: EXT4-fs (vda9): resized filesystem to 1864699 May 13 12:50:57.143053 tar[1521]: linux-arm64/helm May 13 12:50:57.151757 extend-filesystems[1524]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required May 13 12:50:57.151757 extend-filesystems[1524]: old_desc_blocks = 1, new_desc_blocks = 1 May 13 12:50:57.151757 extend-filesystems[1524]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. May 13 12:50:57.157518 extend-filesystems[1501]: Resized filesystem in /dev/vda9 May 13 12:50:57.154214 systemd[1]: extend-filesystems.service: Deactivated successfully. May 13 12:50:57.157979 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 13 12:50:57.183546 update_engine[1511]: I20250513 12:50:57.182263 1511 main.cc:92] Flatcar Update Engine starting May 13 12:50:57.193534 systemd-logind[1509]: Watching system buttons on /dev/input/event0 (Power Button) May 13 12:50:57.197173 dbus-daemon[1498]: [system] SELinux support is enabled May 13 12:50:57.197898 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 13 12:50:57.200503 systemd-logind[1509]: New seat seat0. May 13 12:50:57.204189 systemd[1]: Started systemd-logind.service - User Login Management. 
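The extend-filesystems output above shows resize2fs growing the root filesystem on /dev/vda9 from 553472 to 1864699 blocks of 4 KiB. The corresponding sizes are easy to check:

    # Block counts reported by resize2fs in the log above (4 KiB blocks).
    BLOCK = 4096
    old_blocks, new_blocks = 553_472, 1_864_699

    old_bytes = old_blocks * BLOCK
    new_bytes = new_blocks * BLOCK

    gib = 1024 ** 3
    print(f"before: {old_bytes / gib:.2f} GiB")              # ~2.11 GiB
    print(f"after:  {new_bytes / gib:.2f} GiB")              # ~7.11 GiB
    print(f"grown by {(new_bytes - old_bytes) / gib:.2f} GiB")  # ~5.00 GiB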
May 13 12:50:57.209421 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 13 12:50:57.209471 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 13 12:50:57.210705 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 13 12:50:57.210730 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 13 12:50:57.214352 systemd[1]: Started update-engine.service - Update Engine. May 13 12:50:57.214756 dbus-daemon[1498]: [system] Successfully activated service 'org.freedesktop.systemd1' May 13 12:50:57.217056 update_engine[1511]: I20250513 12:50:57.214262 1511 update_check_scheduler.cc:74] Next update check in 4m2s May 13 12:50:57.217319 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 13 12:50:57.225945 bash[1558]: Updated "/home/core/.ssh/authorized_keys" May 13 12:50:57.230485 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 13 12:50:57.232378 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. May 13 12:50:57.248514 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 12:50:57.288001 locksmithd[1559]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 13 12:50:57.338982 sshd_keygen[1518]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 13 12:50:57.358360 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 13 12:50:57.361116 systemd[1]: Starting issuegen.service - Generate /run/issue... May 13 12:50:57.374648 containerd[1526]: time="2025-05-13T12:50:57Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 13 12:50:57.377447 containerd[1526]: time="2025-05-13T12:50:57.377170989Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 13 12:50:57.381537 systemd[1]: issuegen.service: Deactivated successfully. May 13 12:50:57.381756 systemd[1]: Finished issuegen.service - Generate /run/issue. May 13 12:50:57.384405 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
May 13 12:50:57.388473 containerd[1526]: time="2025-05-13T12:50:57.388009185Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.022µs" May 13 12:50:57.388473 containerd[1526]: time="2025-05-13T12:50:57.388043112Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 13 12:50:57.388473 containerd[1526]: time="2025-05-13T12:50:57.388062211Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 13 12:50:57.388473 containerd[1526]: time="2025-05-13T12:50:57.388210116Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 13 12:50:57.388473 containerd[1526]: time="2025-05-13T12:50:57.388227654Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 13 12:50:57.388473 containerd[1526]: time="2025-05-13T12:50:57.388250203Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 12:50:57.388473 containerd[1526]: time="2025-05-13T12:50:57.388298998Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 12:50:57.388473 containerd[1526]: time="2025-05-13T12:50:57.388310170Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 12:50:57.388666 containerd[1526]: time="2025-05-13T12:50:57.388548600Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 12:50:57.388666 containerd[1526]: time="2025-05-13T12:50:57.388565769Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 12:50:57.388666 containerd[1526]: time="2025-05-13T12:50:57.388576530Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 12:50:57.388666 containerd[1526]: time="2025-05-13T12:50:57.388584991Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 13 12:50:57.388734 containerd[1526]: time="2025-05-13T12:50:57.388665248Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 13 12:50:57.388900 containerd[1526]: time="2025-05-13T12:50:57.388854390Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 12:50:57.388900 containerd[1526]: time="2025-05-13T12:50:57.388896449Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 12:50:57.389998 containerd[1526]: time="2025-05-13T12:50:57.389812957Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 13 12:50:57.390628 containerd[1526]: time="2025-05-13T12:50:57.390595114Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 13 12:50:57.390927 containerd[1526]: 
time="2025-05-13T12:50:57.390899097Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 13 12:50:57.391009 containerd[1526]: time="2025-05-13T12:50:57.390987651Z" level=info msg="metadata content store policy set" policy=shared May 13 12:50:57.396677 containerd[1526]: time="2025-05-13T12:50:57.396630290Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 13 12:50:57.396677 containerd[1526]: time="2025-05-13T12:50:57.396690339Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 13 12:50:57.396849 containerd[1526]: time="2025-05-13T12:50:57.396705782Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 13 12:50:57.396849 containerd[1526]: time="2025-05-13T12:50:57.396718474Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 13 12:50:57.396849 containerd[1526]: time="2025-05-13T12:50:57.396731412Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 13 12:50:57.396849 containerd[1526]: time="2025-05-13T12:50:57.396742625Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 13 12:50:57.396849 containerd[1526]: time="2025-05-13T12:50:57.396754742Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 13 12:50:57.396849 containerd[1526]: time="2025-05-13T12:50:57.396766653Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 13 12:50:57.396849 containerd[1526]: time="2025-05-13T12:50:57.396779262Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 13 12:50:57.396849 containerd[1526]: time="2025-05-13T12:50:57.396789983Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 13 12:50:57.396849 containerd[1526]: time="2025-05-13T12:50:57.396799758Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 13 12:50:57.396849 containerd[1526]: time="2025-05-13T12:50:57.396811998Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 13 12:50:57.397014 containerd[1526]: time="2025-05-13T12:50:57.396941173Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 13 12:50:57.397014 containerd[1526]: time="2025-05-13T12:50:57.396961504Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 13 12:50:57.397014 containerd[1526]: time="2025-05-13T12:50:57.396976003Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 13 12:50:57.397014 containerd[1526]: time="2025-05-13T12:50:57.396988079Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 13 12:50:57.397014 containerd[1526]: time="2025-05-13T12:50:57.396998593Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 13 12:50:57.397014 containerd[1526]: time="2025-05-13T12:50:57.397009231Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 13 12:50:57.397112 containerd[1526]: 
time="2025-05-13T12:50:57.397019910Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 13 12:50:57.397112 containerd[1526]: time="2025-05-13T12:50:57.397036792Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 13 12:50:57.397112 containerd[1526]: time="2025-05-13T12:50:57.397048949Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 13 12:50:57.397112 containerd[1526]: time="2025-05-13T12:50:57.397059053Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 13 12:50:57.397112 containerd[1526]: time="2025-05-13T12:50:57.397069322Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 13 12:50:57.397403 containerd[1526]: time="2025-05-13T12:50:57.397367267Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 13 12:50:57.397403 containerd[1526]: time="2025-05-13T12:50:57.397391993Z" level=info msg="Start snapshots syncer" May 13 12:50:57.397471 containerd[1526]: time="2025-05-13T12:50:57.397416103Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 13 12:50:57.397683 containerd[1526]: time="2025-05-13T12:50:57.397643361Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 13 12:50:57.397834 containerd[1526]: time="2025-05-13T12:50:57.397700083Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 13 12:50:57.397834 containerd[1526]: time="2025-05-13T12:50:57.397783092Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 
May 13 12:50:57.397911 containerd[1526]: time="2025-05-13T12:50:57.397885570Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 13 12:50:57.397911 containerd[1526]: time="2025-05-13T12:50:57.397907831Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 13 12:50:57.397946 containerd[1526]: time="2025-05-13T12:50:57.397918469Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 13 12:50:57.397946 containerd[1526]: time="2025-05-13T12:50:57.397928984Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 13 12:50:57.397946 containerd[1526]: time="2025-05-13T12:50:57.397940115Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 13 12:50:57.397995 containerd[1526]: time="2025-05-13T12:50:57.397950547Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 13 12:50:57.397995 containerd[1526]: time="2025-05-13T12:50:57.397961678Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 13 12:50:57.397995 containerd[1526]: time="2025-05-13T12:50:57.397986076Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 13 12:50:57.398046 containerd[1526]: time="2025-05-13T12:50:57.397996508Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 13 12:50:57.398046 containerd[1526]: time="2025-05-13T12:50:57.398006859Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 13 12:50:57.398046 containerd[1526]: time="2025-05-13T12:50:57.398036924Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 12:50:57.398094 containerd[1526]: time="2025-05-13T12:50:57.398049739Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 12:50:57.398094 containerd[1526]: time="2025-05-13T12:50:57.398057954Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 12:50:57.398094 containerd[1526]: time="2025-05-13T12:50:57.398067072Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 12:50:57.398094 containerd[1526]: time="2025-05-13T12:50:57.398079723Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 13 12:50:57.398094 containerd[1526]: time="2025-05-13T12:50:57.398092373Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 13 12:50:57.398181 containerd[1526]: time="2025-05-13T12:50:57.398104531Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 13 12:50:57.398199 containerd[1526]: time="2025-05-13T12:50:57.398180188Z" level=info msg="runtime interface created" May 13 12:50:57.398199 containerd[1526]: time="2025-05-13T12:50:57.398185938Z" level=info msg="created NRI interface" May 13 12:50:57.398199 containerd[1526]: time="2025-05-13T12:50:57.398194029Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 
May 13 12:50:57.398259 containerd[1526]: time="2025-05-13T12:50:57.398204134Z" level=info msg="Connect containerd service" May 13 12:50:57.398259 containerd[1526]: time="2025-05-13T12:50:57.398228777Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 13 12:50:57.398909 containerd[1526]: time="2025-05-13T12:50:57.398870793Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 12:50:57.403556 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 13 12:50:57.406676 systemd[1]: Started getty@tty1.service - Getty on tty1. May 13 12:50:57.409703 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. May 13 12:50:57.411055 systemd[1]: Reached target getty.target - Login Prompts. May 13 12:50:57.522080 containerd[1526]: time="2025-05-13T12:50:57.521960907Z" level=info msg="Start subscribing containerd event" May 13 12:50:57.522080 containerd[1526]: time="2025-05-13T12:50:57.522028144Z" level=info msg="Start recovering state" May 13 12:50:57.522235 containerd[1526]: time="2025-05-13T12:50:57.522213508Z" level=info msg="Start event monitor" May 13 12:50:57.522235 containerd[1526]: time="2025-05-13T12:50:57.522233223Z" level=info msg="Start cni network conf syncer for default" May 13 12:50:57.522354 containerd[1526]: time="2025-05-13T12:50:57.522243820Z" level=info msg="Start streaming server" May 13 12:50:57.522354 containerd[1526]: time="2025-05-13T12:50:57.522252774Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 13 12:50:57.522354 containerd[1526]: time="2025-05-13T12:50:57.522259510Z" level=info msg="runtime interface starting up..." May 13 12:50:57.522354 containerd[1526]: time="2025-05-13T12:50:57.522265055Z" level=info msg="starting plugins..." May 13 12:50:57.522354 containerd[1526]: time="2025-05-13T12:50:57.522277253Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 13 12:50:57.522354 containerd[1526]: time="2025-05-13T12:50:57.522308674Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 13 12:50:57.522611 containerd[1526]: time="2025-05-13T12:50:57.522473583Z" level=info msg=serving... address=/run/containerd/containerd.sock May 13 12:50:57.525445 containerd[1526]: time="2025-05-13T12:50:57.525387071Z" level=info msg="containerd successfully booted in 0.151090s" May 13 12:50:57.525511 systemd[1]: Started containerd.service - containerd container runtime. May 13 12:50:57.551188 tar[1521]: linux-arm64/LICENSE May 13 12:50:57.551312 tar[1521]: linux-arm64/README.md May 13 12:50:57.569296 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 13 12:50:58.920535 systemd-networkd[1444]: eth0: Gained IPv6LL May 13 12:50:58.922817 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 13 12:50:58.924580 systemd[1]: Reached target network-online.target - Network is Online. May 13 12:50:58.927030 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... May 13 12:50:58.929484 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:50:58.937234 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 13 12:50:58.958495 systemd[1]: coreos-metadata.service: Deactivated successfully. 
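The "failed to load cni during init" message earlier in this block is expected at this point in boot: the CRI plugin looked in /etc/cni/net.d, found no network config, and will keep retrying until a CNI add-on installs one. A minimal sketch of the same directory check, using only the Go standard library and the confDir path reported by containerd (the file-extension filter is an illustrative assumption, not containerd's exact loader logic):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // Mirrors the startup check behind the CNI warning: look for any
    // config file under the confDir that the containerd log reported.
    func main() {
        confDir := "/etc/cni/net.d" // path taken from the containerd error above
        entries, err := os.ReadDir(confDir)
        if err != nil {
            fmt.Printf("cannot read %s: %v\n", confDir, err)
            return
        }
        found := false
        for _, e := range entries {
            switch filepath.Ext(e.Name()) { // assumed extensions for illustration
            case ".conf", ".conflist", ".json":
                fmt.Println("found CNI config:", filepath.Join(confDir, e.Name()))
                found = true
            }
        }
        if !found {
            fmt.Println("no network config found in", confDir, "- pod networking not ready yet")
        }
    }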
May 13 12:50:58.958726 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. May 13 12:50:58.961528 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 13 12:50:58.962953 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 13 12:50:59.423859 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:50:59.425346 systemd[1]: Reached target multi-user.target - Multi-User System. May 13 12:50:59.427061 (kubelet)[1633]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 12:50:59.430538 systemd[1]: Startup finished in 2.118s (kernel) + 4.850s (initrd) + 4.005s (userspace) = 10.974s. May 13 12:50:59.836028 kubelet[1633]: E0513 12:50:59.835916 1633 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 12:50:59.838636 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 12:50:59.838782 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 12:50:59.839100 systemd[1]: kubelet.service: Consumed 759ms CPU time, 232.6M memory peak. May 13 12:51:03.165950 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 13 12:51:03.167109 systemd[1]: Started sshd@0-10.0.0.111:22-10.0.0.1:49188.service - OpenSSH per-connection server daemon (10.0.0.1:49188). May 13 12:51:03.242662 sshd[1646]: Accepted publickey for core from 10.0.0.1 port 49188 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:51:03.244149 sshd-session[1646]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:51:03.249775 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 13 12:51:03.250656 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 13 12:51:03.256811 systemd-logind[1509]: New session 1 of user core. May 13 12:51:03.271787 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 13 12:51:03.273999 systemd[1]: Starting user@500.service - User Manager for UID 500... May 13 12:51:03.290254 (systemd)[1650]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 13 12:51:03.292240 systemd-logind[1509]: New session c1 of user core. May 13 12:51:03.388073 systemd[1650]: Queued start job for default target default.target. May 13 12:51:03.397256 systemd[1650]: Created slice app.slice - User Application Slice. May 13 12:51:03.397284 systemd[1650]: Reached target paths.target - Paths. May 13 12:51:03.397318 systemd[1650]: Reached target timers.target - Timers. May 13 12:51:03.399573 systemd[1650]: Starting dbus.socket - D-Bus User Message Bus Socket... May 13 12:51:03.406473 systemd[1650]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 13 12:51:03.406532 systemd[1650]: Reached target sockets.target - Sockets. May 13 12:51:03.406569 systemd[1650]: Reached target basic.target - Basic System. May 13 12:51:03.406601 systemd[1650]: Reached target default.target - Main User Target. May 13 12:51:03.406627 systemd[1650]: Startup finished in 109ms. 
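The kubelet failure above comes down to one missing file: /var/lib/kubelet/config.yaml is only written when the node is actually bootstrapped into a cluster (on a kubeadm-style setup by kubeadm init or kubeadm join), so systemd will restart the unit and the same error repeats later in the log. A minimal sketch of that pre-flight check, assuming only the path quoted in the error message:

    package main

    import (
        "fmt"
        "os"
    )

    // Checks for the file the kubelet complained about. Its absence here
    // simply means the node has not been bootstrapped yet (an inference
    // from the error text, not from anything else in this log).
    func main() {
        path := "/var/lib/kubelet/config.yaml" // path from the kubelet error above
        if _, err := os.Stat(path); err != nil {
            fmt.Println("kubelet will keep exiting until this exists:", err)
            return
        }
        fmt.Println("kubelet config present:", path)
    }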
May 13 12:51:03.406727 systemd[1]: Started user@500.service - User Manager for UID 500. May 13 12:51:03.407998 systemd[1]: Started session-1.scope - Session 1 of User core. May 13 12:51:03.470608 systemd[1]: Started sshd@1-10.0.0.111:22-10.0.0.1:49198.service - OpenSSH per-connection server daemon (10.0.0.1:49198). May 13 12:51:03.533965 sshd[1661]: Accepted publickey for core from 10.0.0.1 port 49198 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:51:03.535087 sshd-session[1661]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:51:03.539507 systemd-logind[1509]: New session 2 of user core. May 13 12:51:03.553636 systemd[1]: Started session-2.scope - Session 2 of User core. May 13 12:51:03.604343 sshd[1663]: Connection closed by 10.0.0.1 port 49198 May 13 12:51:03.604635 sshd-session[1661]: pam_unix(sshd:session): session closed for user core May 13 12:51:03.621226 systemd[1]: sshd@1-10.0.0.111:22-10.0.0.1:49198.service: Deactivated successfully. May 13 12:51:03.623651 systemd[1]: session-2.scope: Deactivated successfully. May 13 12:51:03.624204 systemd-logind[1509]: Session 2 logged out. Waiting for processes to exit. May 13 12:51:03.626248 systemd[1]: Started sshd@2-10.0.0.111:22-10.0.0.1:49212.service - OpenSSH per-connection server daemon (10.0.0.1:49212). May 13 12:51:03.627033 systemd-logind[1509]: Removed session 2. May 13 12:51:03.678043 sshd[1669]: Accepted publickey for core from 10.0.0.1 port 49212 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:51:03.679055 sshd-session[1669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:51:03.682375 systemd-logind[1509]: New session 3 of user core. May 13 12:51:03.695545 systemd[1]: Started session-3.scope - Session 3 of User core. May 13 12:51:03.742055 sshd[1672]: Connection closed by 10.0.0.1 port 49212 May 13 12:51:03.742404 sshd-session[1669]: pam_unix(sshd:session): session closed for user core May 13 12:51:03.752184 systemd[1]: sshd@2-10.0.0.111:22-10.0.0.1:49212.service: Deactivated successfully. May 13 12:51:03.754642 systemd[1]: session-3.scope: Deactivated successfully. May 13 12:51:03.755923 systemd-logind[1509]: Session 3 logged out. Waiting for processes to exit. May 13 12:51:03.757437 systemd-logind[1509]: Removed session 3. May 13 12:51:03.759087 systemd[1]: Started sshd@3-10.0.0.111:22-10.0.0.1:49226.service - OpenSSH per-connection server daemon (10.0.0.1:49226). May 13 12:51:03.809589 sshd[1678]: Accepted publickey for core from 10.0.0.1 port 49226 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:51:03.810802 sshd-session[1678]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:51:03.814948 systemd-logind[1509]: New session 4 of user core. May 13 12:51:03.826587 systemd[1]: Started session-4.scope - Session 4 of User core. May 13 12:51:03.877361 sshd[1680]: Connection closed by 10.0.0.1 port 49226 May 13 12:51:03.877761 sshd-session[1678]: pam_unix(sshd:session): session closed for user core May 13 12:51:03.887382 systemd[1]: sshd@3-10.0.0.111:22-10.0.0.1:49226.service: Deactivated successfully. May 13 12:51:03.889214 systemd[1]: session-4.scope: Deactivated successfully. May 13 12:51:03.891212 systemd-logind[1509]: Session 4 logged out. Waiting for processes to exit. May 13 12:51:03.894251 systemd[1]: Started sshd@4-10.0.0.111:22-10.0.0.1:49234.service - OpenSSH per-connection server daemon (10.0.0.1:49234). 
May 13 12:51:03.894961 systemd-logind[1509]: Removed session 4. May 13 12:51:03.952365 sshd[1686]: Accepted publickey for core from 10.0.0.1 port 49234 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:51:03.953458 sshd-session[1686]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:51:03.958093 systemd-logind[1509]: New session 5 of user core. May 13 12:51:03.965570 systemd[1]: Started session-5.scope - Session 5 of User core. May 13 12:51:04.029753 sudo[1689]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 13 12:51:04.030505 sudo[1689]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 12:51:04.047017 sudo[1689]: pam_unix(sudo:session): session closed for user root May 13 12:51:04.048755 sshd[1688]: Connection closed by 10.0.0.1 port 49234 May 13 12:51:04.049461 sshd-session[1686]: pam_unix(sshd:session): session closed for user core May 13 12:51:04.058997 systemd[1]: sshd@4-10.0.0.111:22-10.0.0.1:49234.service: Deactivated successfully. May 13 12:51:04.060473 systemd[1]: session-5.scope: Deactivated successfully. May 13 12:51:04.061287 systemd-logind[1509]: Session 5 logged out. Waiting for processes to exit. May 13 12:51:04.064133 systemd[1]: Started sshd@5-10.0.0.111:22-10.0.0.1:49246.service - OpenSSH per-connection server daemon (10.0.0.1:49246). May 13 12:51:04.064848 systemd-logind[1509]: Removed session 5. May 13 12:51:04.127844 sshd[1695]: Accepted publickey for core from 10.0.0.1 port 49246 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:51:04.128932 sshd-session[1695]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:51:04.132470 systemd-logind[1509]: New session 6 of user core. May 13 12:51:04.144618 systemd[1]: Started session-6.scope - Session 6 of User core. May 13 12:51:04.194066 sudo[1699]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 13 12:51:04.194322 sudo[1699]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 12:51:04.198344 sudo[1699]: pam_unix(sudo:session): session closed for user root May 13 12:51:04.202455 sudo[1698]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 13 12:51:04.202686 sudo[1698]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 12:51:04.210581 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 12:51:04.243188 augenrules[1721]: No rules May 13 12:51:04.244283 systemd[1]: audit-rules.service: Deactivated successfully. May 13 12:51:04.246469 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 12:51:04.247626 sudo[1698]: pam_unix(sudo:session): session closed for user root May 13 12:51:04.248633 sshd[1697]: Connection closed by 10.0.0.1 port 49246 May 13 12:51:04.249056 sshd-session[1695]: pam_unix(sshd:session): session closed for user core May 13 12:51:04.258326 systemd[1]: sshd@5-10.0.0.111:22-10.0.0.1:49246.service: Deactivated successfully. May 13 12:51:04.260680 systemd[1]: session-6.scope: Deactivated successfully. May 13 12:51:04.262144 systemd-logind[1509]: Session 6 logged out. Waiting for processes to exit. May 13 12:51:04.263262 systemd[1]: Started sshd@6-10.0.0.111:22-10.0.0.1:49252.service - OpenSSH per-connection server daemon (10.0.0.1:49252). May 13 12:51:04.263994 systemd-logind[1509]: Removed session 6. 
May 13 12:51:04.318526 sshd[1730]: Accepted publickey for core from 10.0.0.1 port 49252 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:51:04.319583 sshd-session[1730]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:51:04.323385 systemd-logind[1509]: New session 7 of user core. May 13 12:51:04.337590 systemd[1]: Started session-7.scope - Session 7 of User core. May 13 12:51:04.387913 sudo[1733]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 13 12:51:04.388187 sudo[1733]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 12:51:04.744643 systemd[1]: Starting docker.service - Docker Application Container Engine... May 13 12:51:04.759833 (dockerd)[1755]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 13 12:51:05.025337 dockerd[1755]: time="2025-05-13T12:51:05.024997723Z" level=info msg="Starting up" May 13 12:51:05.027009 dockerd[1755]: time="2025-05-13T12:51:05.026971070Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 13 12:51:05.111649 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1933530066-merged.mount: Deactivated successfully. May 13 12:51:05.130006 dockerd[1755]: time="2025-05-13T12:51:05.129815189Z" level=info msg="Loading containers: start." May 13 12:51:05.137441 kernel: Initializing XFRM netlink socket May 13 12:51:05.328348 systemd-networkd[1444]: docker0: Link UP May 13 12:51:05.331870 dockerd[1755]: time="2025-05-13T12:51:05.331832816Z" level=info msg="Loading containers: done." May 13 12:51:05.343759 dockerd[1755]: time="2025-05-13T12:51:05.343711931Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 13 12:51:05.343868 dockerd[1755]: time="2025-05-13T12:51:05.343793274Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 13 12:51:05.343899 dockerd[1755]: time="2025-05-13T12:51:05.343886566Z" level=info msg="Initializing buildkit" May 13 12:51:05.366849 dockerd[1755]: time="2025-05-13T12:51:05.366817641Z" level=info msg="Completed buildkit initialization" May 13 12:51:05.371649 dockerd[1755]: time="2025-05-13T12:51:05.371614783Z" level=info msg="Daemon has completed initialization" May 13 12:51:05.371699 dockerd[1755]: time="2025-05-13T12:51:05.371662378Z" level=info msg="API listen on /run/docker.sock" May 13 12:51:05.371900 systemd[1]: Started docker.service - Docker Application Container Engine. May 13 12:51:06.066682 containerd[1526]: time="2025-05-13T12:51:06.066651157Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\"" May 13 12:51:06.109735 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3851291689-merged.mount: Deactivated successfully. May 13 12:51:06.691152 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3715630425.mount: Deactivated successfully. 
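Once dockerd logs "API listen on /run/docker.sock", the daemon can be reached over that unix socket with plain HTTP. A small sketch using only the Go standard library; /_ping is a Docker Engine API endpoint, but treat the Api-Version response header as an assumption if your daemon version differs:

    package main

    import (
        "context"
        "fmt"
        "net"
        "net/http"
        "time"
    )

    // Talks to the socket named in the "API listen on /run/docker.sock" line
    // above by pinning every HTTP dial to that unix socket.
    func main() {
        httpc := &http.Client{
            Timeout: 5 * time.Second,
            Transport: &http.Transport{
                DialContext: func(ctx context.Context, _, _ string) (net.Conn, error) {
                    return (&net.Dialer{}).DialContext(ctx, "unix", "/run/docker.sock")
                },
            },
        }
        // The host in the URL is a placeholder; the dialer ignores it.
        resp, err := httpc.Get("http://docker/_ping")
        if err != nil {
            fmt.Println("docker daemon not reachable:", err)
            return
        }
        defer resp.Body.Close()
        fmt.Println("docker /_ping:", resp.Status, "api version:", resp.Header.Get("Api-Version"))
    }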
May 13 12:51:07.710370 containerd[1526]: time="2025-05-13T12:51:07.710323662Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:07.710863 containerd[1526]: time="2025-05-13T12:51:07.710828078Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.8: active requests=0, bytes read=25554610" May 13 12:51:07.711738 containerd[1526]: time="2025-05-13T12:51:07.711708408Z" level=info msg="ImageCreate event name:\"sha256:ef8fb1ea7c9599dbedea6f9d5589975ebc5bf4ec72f6be6acaaec59a723a09b3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:07.714162 containerd[1526]: time="2025-05-13T12:51:07.714125187Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:07.715229 containerd[1526]: time="2025-05-13T12:51:07.715191622Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.8\" with image id \"sha256:ef8fb1ea7c9599dbedea6f9d5589975ebc5bf4ec72f6be6acaaec59a723a09b3\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\", size \"25551408\" in 1.648505071s" May 13 12:51:07.715229 containerd[1526]: time="2025-05-13T12:51:07.715229849Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\" returns image reference \"sha256:ef8fb1ea7c9599dbedea6f9d5589975ebc5bf4ec72f6be6acaaec59a723a09b3\"" May 13 12:51:07.715862 containerd[1526]: time="2025-05-13T12:51:07.715834085Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\"" May 13 12:51:08.886656 containerd[1526]: time="2025-05-13T12:51:08.886597677Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:08.887377 containerd[1526]: time="2025-05-13T12:51:08.887317334Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.8: active requests=0, bytes read=22458980" May 13 12:51:08.887963 containerd[1526]: time="2025-05-13T12:51:08.887937179Z" level=info msg="ImageCreate event name:\"sha256:ea6e6085feca75547d0422ab0536fe0d18c9ff5831de7a9d6a707c968027bb6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:08.893467 containerd[1526]: time="2025-05-13T12:51:08.893436858Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:08.894665 containerd[1526]: time="2025-05-13T12:51:08.894636260Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.8\" with image id \"sha256:ea6e6085feca75547d0422ab0536fe0d18c9ff5831de7a9d6a707c968027bb6a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\", size \"23900539\" in 1.178769111s" May 13 12:51:08.894700 containerd[1526]: time="2025-05-13T12:51:08.894666767Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\" returns image reference \"sha256:ea6e6085feca75547d0422ab0536fe0d18c9ff5831de7a9d6a707c968027bb6a\"" May 13 12:51:08.895347 
containerd[1526]: time="2025-05-13T12:51:08.895146955Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\"" May 13 12:51:09.897516 containerd[1526]: time="2025-05-13T12:51:09.897456053Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:09.898133 containerd[1526]: time="2025-05-13T12:51:09.898102407Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.8: active requests=0, bytes read=17125815" May 13 12:51:09.899168 containerd[1526]: time="2025-05-13T12:51:09.899126552Z" level=info msg="ImageCreate event name:\"sha256:1d2db6ef0dd2f3e08bdfcd46afde7b755b05192841f563d8df54b807daaa7d8d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:09.901370 containerd[1526]: time="2025-05-13T12:51:09.901316724Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:09.902982 containerd[1526]: time="2025-05-13T12:51:09.902886683Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.8\" with image id \"sha256:1d2db6ef0dd2f3e08bdfcd46afde7b755b05192841f563d8df54b807daaa7d8d\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\", size \"18567392\" in 1.007706128s" May 13 12:51:09.902982 containerd[1526]: time="2025-05-13T12:51:09.902919540Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\" returns image reference \"sha256:1d2db6ef0dd2f3e08bdfcd46afde7b755b05192841f563d8df54b807daaa7d8d\"" May 13 12:51:09.903388 containerd[1526]: time="2025-05-13T12:51:09.903350697Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\"" May 13 12:51:10.089096 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 13 12:51:10.090608 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:51:10.218921 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:51:10.222298 (kubelet)[2037]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 12:51:10.275736 kubelet[2037]: E0513 12:51:10.275681 2037 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 12:51:10.278679 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 12:51:10.278903 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 12:51:10.279232 systemd[1]: kubelet.service: Consumed 131ms CPU time, 97.2M memory peak. May 13 12:51:10.907518 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount453163572.mount: Deactivated successfully. 
May 13 12:51:11.250962 containerd[1526]: time="2025-05-13T12:51:11.250737425Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:11.251739 containerd[1526]: time="2025-05-13T12:51:11.251698706Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.8: active requests=0, bytes read=26871919" May 13 12:51:11.252509 containerd[1526]: time="2025-05-13T12:51:11.252440443Z" level=info msg="ImageCreate event name:\"sha256:c5361ece77e80334cd5fb082c0b678cb3244f5834ecacea1719ae6b38b465581\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:11.254752 containerd[1526]: time="2025-05-13T12:51:11.254706339Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:11.255812 containerd[1526]: time="2025-05-13T12:51:11.255772091Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.8\" with image id \"sha256:c5361ece77e80334cd5fb082c0b678cb3244f5834ecacea1719ae6b38b465581\", repo tag \"registry.k8s.io/kube-proxy:v1.31.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\", size \"26870936\" in 1.352377649s" May 13 12:51:11.255812 containerd[1526]: time="2025-05-13T12:51:11.255805428Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\" returns image reference \"sha256:c5361ece77e80334cd5fb082c0b678cb3244f5834ecacea1719ae6b38b465581\"" May 13 12:51:11.256275 containerd[1526]: time="2025-05-13T12:51:11.256242710Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" May 13 12:51:11.848648 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2266711184.mount: Deactivated successfully. 
May 13 12:51:12.490204 containerd[1526]: time="2025-05-13T12:51:12.490022533Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:12.491011 containerd[1526]: time="2025-05-13T12:51:12.490756661Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485383" May 13 12:51:12.491841 containerd[1526]: time="2025-05-13T12:51:12.491800346Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:12.494916 containerd[1526]: time="2025-05-13T12:51:12.494877647Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:12.496251 containerd[1526]: time="2025-05-13T12:51:12.496215392Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.239928946s" May 13 12:51:12.496294 containerd[1526]: time="2025-05-13T12:51:12.496253650Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\"" May 13 12:51:12.496774 containerd[1526]: time="2025-05-13T12:51:12.496750001Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 13 12:51:12.933556 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3541434563.mount: Deactivated successfully. 
May 13 12:51:12.937345 containerd[1526]: time="2025-05-13T12:51:12.937108136Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 12:51:12.937911 containerd[1526]: time="2025-05-13T12:51:12.937822955Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705" May 13 12:51:12.938620 containerd[1526]: time="2025-05-13T12:51:12.938574265Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 12:51:12.940669 containerd[1526]: time="2025-05-13T12:51:12.940612337Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 12:51:12.941839 containerd[1526]: time="2025-05-13T12:51:12.941803113Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 445.024449ms" May 13 12:51:12.941839 containerd[1526]: time="2025-05-13T12:51:12.941836473Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" May 13 12:51:12.942289 containerd[1526]: time="2025-05-13T12:51:12.942245348Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" May 13 12:51:13.425981 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1889237020.mount: Deactivated successfully. 
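The PullImage/Pulled pairs above are issued by the CRI plugin, but the same pulls can be driven directly through containerd's Go client against the socket and the "k8s.io" namespace shown earlier in the log. A rough sketch, assuming the v1 client module github.com/containerd/containerd (containerd 2.x ships an equivalent client under a /v2 import path, so the imports may need adjusting):

    package main

    import (
        "context"
        "fmt"
        "log"

        "github.com/containerd/containerd"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        // Same socket and namespace that appear in the containerd log above.
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
        img, err := client.Pull(ctx, "registry.k8s.io/pause:3.10", containerd.WithPullUnpack)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println("pulled", img.Name())
    }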
May 13 12:51:15.058948 containerd[1526]: time="2025-05-13T12:51:15.058890442Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:15.059366 containerd[1526]: time="2025-05-13T12:51:15.059327740Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66406467" May 13 12:51:15.060374 containerd[1526]: time="2025-05-13T12:51:15.060340230Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:15.063345 containerd[1526]: time="2025-05-13T12:51:15.063308452Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:15.064583 containerd[1526]: time="2025-05-13T12:51:15.064477400Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.122153343s" May 13 12:51:15.064583 containerd[1526]: time="2025-05-13T12:51:15.064513327Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\"" May 13 12:51:19.324201 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:51:19.324386 systemd[1]: kubelet.service: Consumed 131ms CPU time, 97.2M memory peak. May 13 12:51:19.327485 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:51:19.347085 systemd[1]: Reload requested from client PID 2183 ('systemctl') (unit session-7.scope)... May 13 12:51:19.347113 systemd[1]: Reloading... May 13 12:51:19.410428 zram_generator::config[2226]: No configuration found. May 13 12:51:19.487712 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 12:51:19.573514 systemd[1]: Reloading finished in 226 ms. May 13 12:51:19.633889 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 13 12:51:19.633964 systemd[1]: kubelet.service: Failed with result 'signal'. May 13 12:51:19.634194 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:51:19.634240 systemd[1]: kubelet.service: Consumed 75ms CPU time, 82.3M memory peak. May 13 12:51:19.635647 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:51:19.740731 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:51:19.744119 (kubelet)[2272]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 13 12:51:19.778021 kubelet[2272]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 12:51:19.778021 kubelet[2272]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. 
Image garbage collector will get sandbox image information from CRI. May 13 12:51:19.778021 kubelet[2272]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 12:51:19.778370 kubelet[2272]: I0513 12:51:19.778323 2272 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 12:51:20.266357 kubelet[2272]: I0513 12:51:20.266316 2272 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 13 12:51:20.266357 kubelet[2272]: I0513 12:51:20.266348 2272 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 12:51:20.266627 kubelet[2272]: I0513 12:51:20.266598 2272 server.go:929] "Client rotation is on, will bootstrap in background" May 13 12:51:20.297314 kubelet[2272]: E0513 12:51:20.297269 2272 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.111:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.111:6443: connect: connection refused" logger="UnhandledError" May 13 12:51:20.298169 kubelet[2272]: I0513 12:51:20.298052 2272 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 12:51:20.311424 kubelet[2272]: I0513 12:51:20.311363 2272 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 13 12:51:20.314627 kubelet[2272]: I0513 12:51:20.314576 2272 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 13 12:51:20.315453 kubelet[2272]: I0513 12:51:20.315425 2272 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 13 12:51:20.315617 kubelet[2272]: I0513 12:51:20.315585 2272 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 12:51:20.315766 kubelet[2272]: I0513 12:51:20.315610 2272 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 13 12:51:20.315903 kubelet[2272]: I0513 12:51:20.315892 2272 topology_manager.go:138] "Creating topology manager with none policy" May 13 12:51:20.315903 kubelet[2272]: I0513 12:51:20.315904 2272 container_manager_linux.go:300] "Creating device plugin manager" May 13 12:51:20.316078 kubelet[2272]: I0513 12:51:20.316058 2272 state_mem.go:36] "Initialized new in-memory state store" May 13 12:51:20.317765 kubelet[2272]: I0513 12:51:20.317737 2272 kubelet.go:408] "Attempting to sync node with API server" May 13 12:51:20.317812 kubelet[2272]: I0513 12:51:20.317769 2272 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 12:51:20.317865 kubelet[2272]: I0513 12:51:20.317854 2272 kubelet.go:314] "Adding apiserver pod source" May 13 12:51:20.317891 kubelet[2272]: I0513 12:51:20.317867 2272 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 12:51:20.322577 kubelet[2272]: I0513 12:51:20.322322 2272 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 13 12:51:20.324354 kubelet[2272]: W0513 12:51:20.324299 2272 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.111:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.111:6443: connect: connection refused May 13 12:51:20.324419 kubelet[2272]: E0513 12:51:20.324355 2272 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.111:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.111:6443: connect: connection refused" logger="UnhandledError" May 13 12:51:20.324585 kubelet[2272]: W0513 12:51:20.324518 2272 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.111:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.111:6443: connect: connection refused May 13 12:51:20.324585 kubelet[2272]: E0513 12:51:20.324564 2272 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.111:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.111:6443: connect: connection refused" logger="UnhandledError" May 13 12:51:20.327940 kubelet[2272]: I0513 12:51:20.327917 2272 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 12:51:20.328584 kubelet[2272]: W0513 12:51:20.328560 2272 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 13 12:51:20.329200 kubelet[2272]: I0513 12:51:20.329183 2272 server.go:1269] "Started kubelet" May 13 12:51:20.329450 kubelet[2272]: I0513 12:51:20.329422 2272 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 13 12:51:20.330455 kubelet[2272]: I0513 12:51:20.329492 2272 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 12:51:20.330710 kubelet[2272]: I0513 12:51:20.330692 2272 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 12:51:20.331389 kubelet[2272]: I0513 12:51:20.331266 2272 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 12:51:20.331389 kubelet[2272]: I0513 12:51:20.331290 2272 server.go:460] "Adding debug handlers to kubelet server" May 13 12:51:20.332457 kubelet[2272]: I0513 12:51:20.331951 2272 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 13 12:51:20.332457 kubelet[2272]: E0513 12:51:20.332367 2272 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:51:20.333158 kubelet[2272]: I0513 12:51:20.332535 2272 volume_manager.go:289] "Starting Kubelet Volume Manager" May 13 12:51:20.333158 kubelet[2272]: I0513 12:51:20.332684 2272 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 13 12:51:20.333158 kubelet[2272]: I0513 12:51:20.332803 2272 reconciler.go:26] "Reconciler: start to sync state" May 13 12:51:20.333158 kubelet[2272]: W0513 12:51:20.333046 2272 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.111:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.111:6443: connect: connection refused May 13 12:51:20.333158 kubelet[2272]: E0513 12:51:20.333090 2272 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://10.0.0.111:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.111:6443: connect: connection refused" logger="UnhandledError" May 13 12:51:20.333334 kubelet[2272]: I0513 12:51:20.333312 2272 factory.go:221] Registration of the systemd container factory successfully May 13 12:51:20.333396 kubelet[2272]: I0513 12:51:20.333375 2272 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 12:51:20.333783 kubelet[2272]: E0513 12:51:20.333751 2272 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.111:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.111:6443: connect: connection refused" interval="200ms" May 13 12:51:20.334460 kubelet[2272]: E0513 12:51:20.334435 2272 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 12:51:20.334785 kubelet[2272]: E0513 12:51:20.332283 2272 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.111:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.111:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.183f173127a7587b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-13 12:51:20.329156731 +0000 UTC m=+0.582228269,LastTimestamp:2025-05-13 12:51:20.329156731 +0000 UTC m=+0.582228269,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" May 13 12:51:20.335473 kubelet[2272]: I0513 12:51:20.335452 2272 factory.go:221] Registration of the containerd container factory successfully May 13 12:51:20.346312 kubelet[2272]: I0513 12:51:20.346245 2272 cpu_manager.go:214] "Starting CPU manager" policy="none" May 13 12:51:20.346384 kubelet[2272]: I0513 12:51:20.346316 2272 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 13 12:51:20.346384 kubelet[2272]: I0513 12:51:20.346332 2272 state_mem.go:36] "Initialized new in-memory state store" May 13 12:51:20.347714 kubelet[2272]: I0513 12:51:20.347645 2272 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 12:51:20.348602 kubelet[2272]: I0513 12:51:20.348577 2272 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 13 12:51:20.348602 kubelet[2272]: I0513 12:51:20.348603 2272 status_manager.go:217] "Starting to sync pod status with apiserver" May 13 12:51:20.348709 kubelet[2272]: I0513 12:51:20.348619 2272 kubelet.go:2321] "Starting kubelet main sync loop" May 13 12:51:20.348709 kubelet[2272]: E0513 12:51:20.348654 2272 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 12:51:20.349080 kubelet[2272]: W0513 12:51:20.349001 2272 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.111:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.111:6443: connect: connection refused May 13 12:51:20.349080 kubelet[2272]: E0513 12:51:20.349040 2272 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.111:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.111:6443: connect: connection refused" logger="UnhandledError" May 13 12:51:20.349387 kubelet[2272]: I0513 12:51:20.349364 2272 policy_none.go:49] "None policy: Start" May 13 12:51:20.349996 kubelet[2272]: I0513 12:51:20.349973 2272 memory_manager.go:170] "Starting memorymanager" policy="None" May 13 12:51:20.349996 kubelet[2272]: I0513 12:51:20.350000 2272 state_mem.go:35] "Initializing new in-memory state store" May 13 12:51:20.355801 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 13 12:51:20.372990 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 13 12:51:20.375897 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 13 12:51:20.389161 kubelet[2272]: I0513 12:51:20.389132 2272 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 12:51:20.389457 kubelet[2272]: I0513 12:51:20.389437 2272 eviction_manager.go:189] "Eviction manager: starting control loop" May 13 12:51:20.389709 kubelet[2272]: I0513 12:51:20.389547 2272 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 12:51:20.389935 kubelet[2272]: I0513 12:51:20.389913 2272 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 12:51:20.391958 kubelet[2272]: E0513 12:51:20.391934 2272 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" May 13 12:51:20.455830 systemd[1]: Created slice kubepods-burstable-podd8db02311f1b62f9c5098a01217e9253.slice - libcontainer container kubepods-burstable-podd8db02311f1b62f9c5098a01217e9253.slice. May 13 12:51:20.485522 systemd[1]: Created slice kubepods-burstable-podd4a6b755cb4739fbca401212ebb82b6d.slice - libcontainer container kubepods-burstable-podd4a6b755cb4739fbca401212ebb82b6d.slice. 
May 13 12:51:20.491525 kubelet[2272]: I0513 12:51:20.491492 2272 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 12:51:20.491952 kubelet[2272]: E0513 12:51:20.491916 2272 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.111:6443/api/v1/nodes\": dial tcp 10.0.0.111:6443: connect: connection refused" node="localhost" May 13 12:51:20.508977 systemd[1]: Created slice kubepods-burstable-pod0613557c150e4f35d1f3f822b5f32ff1.slice - libcontainer container kubepods-burstable-pod0613557c150e4f35d1f3f822b5f32ff1.slice. May 13 12:51:20.534949 kubelet[2272]: E0513 12:51:20.534851 2272 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.111:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.111:6443: connect: connection refused" interval="400ms" May 13 12:51:20.634240 kubelet[2272]: I0513 12:51:20.634198 2272 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d8db02311f1b62f9c5098a01217e9253-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"d8db02311f1b62f9c5098a01217e9253\") " pod="kube-system/kube-apiserver-localhost" May 13 12:51:20.634290 kubelet[2272]: I0513 12:51:20.634239 2272 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:51:20.634290 kubelet[2272]: I0513 12:51:20.634260 2272 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:51:20.634290 kubelet[2272]: I0513 12:51:20.634275 2272 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:51:20.634359 kubelet[2272]: I0513 12:51:20.634297 2272 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0613557c150e4f35d1f3f822b5f32ff1-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"0613557c150e4f35d1f3f822b5f32ff1\") " pod="kube-system/kube-scheduler-localhost" May 13 12:51:20.634359 kubelet[2272]: I0513 12:51:20.634312 2272 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d8db02311f1b62f9c5098a01217e9253-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"d8db02311f1b62f9c5098a01217e9253\") " pod="kube-system/kube-apiserver-localhost" May 13 12:51:20.634400 kubelet[2272]: I0513 12:51:20.634357 2272 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-ca-certs\") 
pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:51:20.634400 kubelet[2272]: I0513 12:51:20.634393 2272 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:51:20.634471 kubelet[2272]: I0513 12:51:20.634435 2272 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d8db02311f1b62f9c5098a01217e9253-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"d8db02311f1b62f9c5098a01217e9253\") " pod="kube-system/kube-apiserver-localhost" May 13 12:51:20.693424 kubelet[2272]: I0513 12:51:20.693389 2272 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 12:51:20.693790 kubelet[2272]: E0513 12:51:20.693745 2272 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.111:6443/api/v1/nodes\": dial tcp 10.0.0.111:6443: connect: connection refused" node="localhost" May 13 12:51:20.784787 containerd[1526]: time="2025-05-13T12:51:20.784745250Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:d8db02311f1b62f9c5098a01217e9253,Namespace:kube-system,Attempt:0,}" May 13 12:51:20.807487 containerd[1526]: time="2025-05-13T12:51:20.807357665Z" level=info msg="connecting to shim af6d3127c6d4c0b179e5e22d0c47f3fef195ad9891ba468acdae97bb5d43238b" address="unix:///run/containerd/s/280b1420749e61ed845f95cf3545f515eaf7a153d132c1bac7438e82f718f621" namespace=k8s.io protocol=ttrpc version=3 May 13 12:51:20.808058 containerd[1526]: time="2025-05-13T12:51:20.807973430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:d4a6b755cb4739fbca401212ebb82b6d,Namespace:kube-system,Attempt:0,}" May 13 12:51:20.811383 containerd[1526]: time="2025-05-13T12:51:20.811336328Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:0613557c150e4f35d1f3f822b5f32ff1,Namespace:kube-system,Attempt:0,}" May 13 12:51:20.833625 systemd[1]: Started cri-containerd-af6d3127c6d4c0b179e5e22d0c47f3fef195ad9891ba468acdae97bb5d43238b.scope - libcontainer container af6d3127c6d4c0b179e5e22d0c47f3fef195ad9891ba468acdae97bb5d43238b. May 13 12:51:20.846375 containerd[1526]: time="2025-05-13T12:51:20.846306656Z" level=info msg="connecting to shim d8eb7879dd2de6f26d583dfce1d602f66803b8839376c97d91b7d9b38716c71c" address="unix:///run/containerd/s/5946549aad9c2d7ae5c1d5f8414be6261d9b8bc86b42d76ee966a927fa9dd976" namespace=k8s.io protocol=ttrpc version=3 May 13 12:51:20.854457 containerd[1526]: time="2025-05-13T12:51:20.854386615Z" level=info msg="connecting to shim 3fe633783337961fb5aabc04de2f9f9f01594b291a818b37bfbbe365ec1eea87" address="unix:///run/containerd/s/6e67b836690c44407ab2fbcd5faf1b2811b5be20bd6a013e723216f913bd0c5e" namespace=k8s.io protocol=ttrpc version=3 May 13 12:51:20.873733 systemd[1]: Started cri-containerd-d8eb7879dd2de6f26d583dfce1d602f66803b8839376c97d91b7d9b38716c71c.scope - libcontainer container d8eb7879dd2de6f26d583dfce1d602f66803b8839376c97d91b7d9b38716c71c. 
May 13 12:51:20.879835 systemd[1]: Started cri-containerd-3fe633783337961fb5aabc04de2f9f9f01594b291a818b37bfbbe365ec1eea87.scope - libcontainer container 3fe633783337961fb5aabc04de2f9f9f01594b291a818b37bfbbe365ec1eea87. May 13 12:51:20.882678 containerd[1526]: time="2025-05-13T12:51:20.882623017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:d8db02311f1b62f9c5098a01217e9253,Namespace:kube-system,Attempt:0,} returns sandbox id \"af6d3127c6d4c0b179e5e22d0c47f3fef195ad9891ba468acdae97bb5d43238b\"" May 13 12:51:20.887317 containerd[1526]: time="2025-05-13T12:51:20.887261981Z" level=info msg="CreateContainer within sandbox \"af6d3127c6d4c0b179e5e22d0c47f3fef195ad9891ba468acdae97bb5d43238b\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 13 12:51:20.898683 containerd[1526]: time="2025-05-13T12:51:20.898642200Z" level=info msg="Container 11d5037919ecad98f457d719f5e6b3bd6cc2c9128978b7e00f0f38b318a972ac: CDI devices from CRI Config.CDIDevices: []" May 13 12:51:20.908028 containerd[1526]: time="2025-05-13T12:51:20.907977879Z" level=info msg="CreateContainer within sandbox \"af6d3127c6d4c0b179e5e22d0c47f3fef195ad9891ba468acdae97bb5d43238b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"11d5037919ecad98f457d719f5e6b3bd6cc2c9128978b7e00f0f38b318a972ac\"" May 13 12:51:20.909131 containerd[1526]: time="2025-05-13T12:51:20.909092384Z" level=info msg="StartContainer for \"11d5037919ecad98f457d719f5e6b3bd6cc2c9128978b7e00f0f38b318a972ac\"" May 13 12:51:20.911310 containerd[1526]: time="2025-05-13T12:51:20.911271731Z" level=info msg="connecting to shim 11d5037919ecad98f457d719f5e6b3bd6cc2c9128978b7e00f0f38b318a972ac" address="unix:///run/containerd/s/280b1420749e61ed845f95cf3545f515eaf7a153d132c1bac7438e82f718f621" protocol=ttrpc version=3 May 13 12:51:20.911773 containerd[1526]: time="2025-05-13T12:51:20.911725936Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:d4a6b755cb4739fbca401212ebb82b6d,Namespace:kube-system,Attempt:0,} returns sandbox id \"d8eb7879dd2de6f26d583dfce1d602f66803b8839376c97d91b7d9b38716c71c\"" May 13 12:51:20.915126 containerd[1526]: time="2025-05-13T12:51:20.915083988Z" level=info msg="CreateContainer within sandbox \"d8eb7879dd2de6f26d583dfce1d602f66803b8839376c97d91b7d9b38716c71c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 13 12:51:20.918109 containerd[1526]: time="2025-05-13T12:51:20.917466908Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:0613557c150e4f35d1f3f822b5f32ff1,Namespace:kube-system,Attempt:0,} returns sandbox id \"3fe633783337961fb5aabc04de2f9f9f01594b291a818b37bfbbe365ec1eea87\"" May 13 12:51:20.920980 containerd[1526]: time="2025-05-13T12:51:20.920932895Z" level=info msg="CreateContainer within sandbox \"3fe633783337961fb5aabc04de2f9f9f01594b291a818b37bfbbe365ec1eea87\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 13 12:51:20.923983 containerd[1526]: time="2025-05-13T12:51:20.923754560Z" level=info msg="Container b2606d7c01ced14fbb593dafe9b48c93df88d9031a861173446d090ac0eb6fb4: CDI devices from CRI Config.CDIDevices: []" May 13 12:51:20.931654 containerd[1526]: time="2025-05-13T12:51:20.931616889Z" level=info msg="Container 4df84088d72fb27e530d0698a3266f9aba73a9de511a85e4fd322440db4dc32e: CDI devices from CRI Config.CDIDevices: []" May 13 12:51:20.932389 containerd[1526]: time="2025-05-13T12:51:20.932345754Z" level=info 
msg="CreateContainer within sandbox \"d8eb7879dd2de6f26d583dfce1d602f66803b8839376c97d91b7d9b38716c71c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b2606d7c01ced14fbb593dafe9b48c93df88d9031a861173446d090ac0eb6fb4\"" May 13 12:51:20.933002 containerd[1526]: time="2025-05-13T12:51:20.932969930Z" level=info msg="StartContainer for \"b2606d7c01ced14fbb593dafe9b48c93df88d9031a861173446d090ac0eb6fb4\"" May 13 12:51:20.934447 containerd[1526]: time="2025-05-13T12:51:20.934394340Z" level=info msg="connecting to shim b2606d7c01ced14fbb593dafe9b48c93df88d9031a861173446d090ac0eb6fb4" address="unix:///run/containerd/s/5946549aad9c2d7ae5c1d5f8414be6261d9b8bc86b42d76ee966a927fa9dd976" protocol=ttrpc version=3 May 13 12:51:20.935660 kubelet[2272]: E0513 12:51:20.935607 2272 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.111:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.111:6443: connect: connection refused" interval="800ms" May 13 12:51:20.936617 systemd[1]: Started cri-containerd-11d5037919ecad98f457d719f5e6b3bd6cc2c9128978b7e00f0f38b318a972ac.scope - libcontainer container 11d5037919ecad98f457d719f5e6b3bd6cc2c9128978b7e00f0f38b318a972ac. May 13 12:51:20.938348 containerd[1526]: time="2025-05-13T12:51:20.937998498Z" level=info msg="CreateContainer within sandbox \"3fe633783337961fb5aabc04de2f9f9f01594b291a818b37bfbbe365ec1eea87\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"4df84088d72fb27e530d0698a3266f9aba73a9de511a85e4fd322440db4dc32e\"" May 13 12:51:20.938627 containerd[1526]: time="2025-05-13T12:51:20.938602848Z" level=info msg="StartContainer for \"4df84088d72fb27e530d0698a3266f9aba73a9de511a85e4fd322440db4dc32e\"" May 13 12:51:20.940057 containerd[1526]: time="2025-05-13T12:51:20.940028740Z" level=info msg="connecting to shim 4df84088d72fb27e530d0698a3266f9aba73a9de511a85e4fd322440db4dc32e" address="unix:///run/containerd/s/6e67b836690c44407ab2fbcd5faf1b2811b5be20bd6a013e723216f913bd0c5e" protocol=ttrpc version=3 May 13 12:51:20.957593 systemd[1]: Started cri-containerd-b2606d7c01ced14fbb593dafe9b48c93df88d9031a861173446d090ac0eb6fb4.scope - libcontainer container b2606d7c01ced14fbb593dafe9b48c93df88d9031a861173446d090ac0eb6fb4. May 13 12:51:20.960495 systemd[1]: Started cri-containerd-4df84088d72fb27e530d0698a3266f9aba73a9de511a85e4fd322440db4dc32e.scope - libcontainer container 4df84088d72fb27e530d0698a3266f9aba73a9de511a85e4fd322440db4dc32e. 
May 13 12:51:20.981950 containerd[1526]: time="2025-05-13T12:51:20.981346314Z" level=info msg="StartContainer for \"11d5037919ecad98f457d719f5e6b3bd6cc2c9128978b7e00f0f38b318a972ac\" returns successfully" May 13 12:51:21.025449 containerd[1526]: time="2025-05-13T12:51:21.025283328Z" level=info msg="StartContainer for \"b2606d7c01ced14fbb593dafe9b48c93df88d9031a861173446d090ac0eb6fb4\" returns successfully" May 13 12:51:21.054494 containerd[1526]: time="2025-05-13T12:51:21.054139143Z" level=info msg="StartContainer for \"4df84088d72fb27e530d0698a3266f9aba73a9de511a85e4fd322440db4dc32e\" returns successfully" May 13 12:51:21.096146 kubelet[2272]: I0513 12:51:21.096025 2272 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 12:51:21.096558 kubelet[2272]: E0513 12:51:21.096528 2272 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.111:6443/api/v1/nodes\": dial tcp 10.0.0.111:6443: connect: connection refused" node="localhost" May 13 12:51:21.898089 kubelet[2272]: I0513 12:51:21.898044 2272 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 12:51:22.855798 kubelet[2272]: E0513 12:51:22.855703 2272 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" May 13 12:51:23.012703 kubelet[2272]: I0513 12:51:23.012649 2272 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 13 12:51:23.319811 kubelet[2272]: I0513 12:51:23.319688 2272 apiserver.go:52] "Watching apiserver" May 13 12:51:23.333160 kubelet[2272]: I0513 12:51:23.333102 2272 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 13 12:51:24.693345 systemd[1]: Reload requested from client PID 2546 ('systemctl') (unit session-7.scope)... May 13 12:51:24.693362 systemd[1]: Reloading... May 13 12:51:24.757456 zram_generator::config[2589]: No configuration found. May 13 12:51:24.833037 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 12:51:24.930671 systemd[1]: Reloading finished in 236 ms. May 13 12:51:24.958100 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:51:24.973400 systemd[1]: kubelet.service: Deactivated successfully. May 13 12:51:24.973711 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:51:24.973768 systemd[1]: kubelet.service: Consumed 962ms CPU time, 117.7M memory peak. May 13 12:51:24.975614 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:51:25.116635 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:51:25.121010 (kubelet)[2631]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 13 12:51:25.165487 kubelet[2631]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 12:51:25.166472 kubelet[2631]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
May 13 12:51:25.166472 kubelet[2631]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 12:51:25.166472 kubelet[2631]: I0513 12:51:25.165942 2631 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 12:51:25.172720 kubelet[2631]: I0513 12:51:25.172668 2631 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 13 12:51:25.172720 kubelet[2631]: I0513 12:51:25.172699 2631 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 12:51:25.172948 kubelet[2631]: I0513 12:51:25.172924 2631 server.go:929] "Client rotation is on, will bootstrap in background" May 13 12:51:25.174301 kubelet[2631]: I0513 12:51:25.174276 2631 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 13 12:51:25.176453 kubelet[2631]: I0513 12:51:25.176403 2631 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 12:51:25.180402 kubelet[2631]: I0513 12:51:25.180373 2631 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 13 12:51:25.183435 kubelet[2631]: I0513 12:51:25.182888 2631 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 13 12:51:25.183435 kubelet[2631]: I0513 12:51:25.183007 2631 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 13 12:51:25.183435 kubelet[2631]: I0513 12:51:25.183092 2631 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 12:51:25.183435 kubelet[2631]: I0513 12:51:25.183117 2631 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 13 12:51:25.183658 kubelet[2631]: I0513 12:51:25.183369 2631 topology_manager.go:138] "Creating topology manager with none policy" May 13 12:51:25.183658 kubelet[2631]: I0513 12:51:25.183378 2631 container_manager_linux.go:300] "Creating device plugin manager" May 13 12:51:25.183658 kubelet[2631]: I0513 12:51:25.183429 2631 state_mem.go:36] "Initialized new in-memory state store" May 13 12:51:25.183658 kubelet[2631]: I0513 12:51:25.183531 2631 kubelet.go:408] "Attempting to sync node with API server" May 13 12:51:25.183658 kubelet[2631]: I0513 12:51:25.183542 2631 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 12:51:25.183658 kubelet[2631]: I0513 12:51:25.183564 2631 kubelet.go:314] "Adding apiserver pod source" May 13 12:51:25.183658 kubelet[2631]: I0513 12:51:25.183583 2631 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 12:51:25.184560 kubelet[2631]: I0513 12:51:25.184535 2631 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 13 12:51:25.185082 kubelet[2631]: I0513 12:51:25.185056 2631 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 12:51:25.185950 kubelet[2631]: I0513 12:51:25.185930 2631 server.go:1269] "Started kubelet" May 13 12:51:25.186333 kubelet[2631]: I0513 12:51:25.186266 2631 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 12:51:25.186499 kubelet[2631]: I0513 12:51:25.186482 2631 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 12:51:25.188427 kubelet[2631]: I0513 12:51:25.188382 2631 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 12:51:25.188755 kubelet[2631]: I0513 12:51:25.186100 2631 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 13 12:51:25.188854 kubelet[2631]: I0513 12:51:25.188837 2631 
dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 13 12:51:25.189755 kubelet[2631]: E0513 12:51:25.189725 2631 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 12:51:25.189899 kubelet[2631]: I0513 12:51:25.189883 2631 volume_manager.go:289] "Starting Kubelet Volume Manager" May 13 12:51:25.190000 kubelet[2631]: I0513 12:51:25.189986 2631 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 13 12:51:25.190137 kubelet[2631]: I0513 12:51:25.190121 2631 reconciler.go:26] "Reconciler: start to sync state" May 13 12:51:25.190764 kubelet[2631]: E0513 12:51:25.190727 2631 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:51:25.196440 kubelet[2631]: I0513 12:51:25.194770 2631 server.go:460] "Adding debug handlers to kubelet server" May 13 12:51:25.208337 kubelet[2631]: I0513 12:51:25.208228 2631 factory.go:221] Registration of the systemd container factory successfully May 13 12:51:25.209297 kubelet[2631]: I0513 12:51:25.208342 2631 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 12:51:25.210039 kubelet[2631]: I0513 12:51:25.209861 2631 factory.go:221] Registration of the containerd container factory successfully May 13 12:51:25.217671 kubelet[2631]: I0513 12:51:25.217632 2631 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 12:51:25.219910 kubelet[2631]: I0513 12:51:25.219882 2631 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 13 12:51:25.220034 kubelet[2631]: I0513 12:51:25.220024 2631 status_manager.go:217] "Starting to sync pod status with apiserver" May 13 12:51:25.220114 kubelet[2631]: I0513 12:51:25.220104 2631 kubelet.go:2321] "Starting kubelet main sync loop" May 13 12:51:25.220436 kubelet[2631]: E0513 12:51:25.220373 2631 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 12:51:25.244671 kubelet[2631]: I0513 12:51:25.244643 2631 cpu_manager.go:214] "Starting CPU manager" policy="none" May 13 12:51:25.244671 kubelet[2631]: I0513 12:51:25.244664 2631 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 13 12:51:25.244778 kubelet[2631]: I0513 12:51:25.244685 2631 state_mem.go:36] "Initialized new in-memory state store" May 13 12:51:25.244851 kubelet[2631]: I0513 12:51:25.244834 2631 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 13 12:51:25.244880 kubelet[2631]: I0513 12:51:25.244850 2631 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 13 12:51:25.244880 kubelet[2631]: I0513 12:51:25.244869 2631 policy_none.go:49] "None policy: Start" May 13 12:51:25.245589 kubelet[2631]: I0513 12:51:25.245569 2631 memory_manager.go:170] "Starting memorymanager" policy="None" May 13 12:51:25.245636 kubelet[2631]: I0513 12:51:25.245597 2631 state_mem.go:35] "Initializing new in-memory state store" May 13 12:51:25.245815 kubelet[2631]: I0513 12:51:25.245796 2631 state_mem.go:75] "Updated machine memory state" May 13 12:51:25.250330 kubelet[2631]: I0513 12:51:25.250305 2631 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 12:51:25.250520 kubelet[2631]: I0513 12:51:25.250498 2631 eviction_manager.go:189] "Eviction manager: starting control loop" May 13 12:51:25.250553 kubelet[2631]: I0513 12:51:25.250516 2631 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 12:51:25.250804 kubelet[2631]: I0513 12:51:25.250787 2631 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 12:51:25.352730 kubelet[2631]: I0513 12:51:25.352698 2631 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 12:51:25.359579 kubelet[2631]: I0513 12:51:25.359488 2631 kubelet_node_status.go:111] "Node was previously registered" node="localhost" May 13 12:51:25.359741 kubelet[2631]: I0513 12:51:25.359612 2631 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 13 12:51:25.391358 kubelet[2631]: I0513 12:51:25.391322 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d8db02311f1b62f9c5098a01217e9253-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"d8db02311f1b62f9c5098a01217e9253\") " pod="kube-system/kube-apiserver-localhost" May 13 12:51:25.391358 kubelet[2631]: I0513 12:51:25.391357 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:51:25.391535 kubelet[2631]: I0513 12:51:25.391385 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:51:25.391535 kubelet[2631]: I0513 12:51:25.391420 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0613557c150e4f35d1f3f822b5f32ff1-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"0613557c150e4f35d1f3f822b5f32ff1\") " pod="kube-system/kube-scheduler-localhost" May 13 12:51:25.391535 kubelet[2631]: I0513 12:51:25.391446 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d8db02311f1b62f9c5098a01217e9253-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"d8db02311f1b62f9c5098a01217e9253\") " pod="kube-system/kube-apiserver-localhost" May 13 12:51:25.391535 kubelet[2631]: I0513 12:51:25.391502 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d8db02311f1b62f9c5098a01217e9253-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"d8db02311f1b62f9c5098a01217e9253\") " pod="kube-system/kube-apiserver-localhost" May 13 12:51:25.391617 kubelet[2631]: I0513 12:51:25.391561 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:51:25.391617 kubelet[2631]: I0513 12:51:25.391588 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:51:25.391617 kubelet[2631]: I0513 12:51:25.391604 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:51:26.184647 kubelet[2631]: I0513 12:51:26.184599 2631 apiserver.go:52] "Watching apiserver" May 13 12:51:26.190456 kubelet[2631]: I0513 12:51:26.190398 2631 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 13 12:51:26.242898 kubelet[2631]: E0513 12:51:26.242856 2631 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" May 13 12:51:26.246757 kubelet[2631]: E0513 12:51:26.246724 2631 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" May 13 12:51:26.268390 kubelet[2631]: I0513 12:51:26.268163 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" 
podStartSLOduration=1.268131417 podStartE2EDuration="1.268131417s" podCreationTimestamp="2025-05-13 12:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:51:26.260897981 +0000 UTC m=+1.136732639" watchObservedRunningTime="2025-05-13 12:51:26.268131417 +0000 UTC m=+1.143966075" May 13 12:51:26.276231 kubelet[2631]: I0513 12:51:26.276141 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.276125638 podStartE2EDuration="1.276125638s" podCreationTimestamp="2025-05-13 12:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:51:26.268492979 +0000 UTC m=+1.144327637" watchObservedRunningTime="2025-05-13 12:51:26.276125638 +0000 UTC m=+1.151960256" May 13 12:51:30.066581 sudo[1733]: pam_unix(sudo:session): session closed for user root May 13 12:51:30.067825 sshd[1732]: Connection closed by 10.0.0.1 port 49252 May 13 12:51:30.068329 sshd-session[1730]: pam_unix(sshd:session): session closed for user core May 13 12:51:30.071973 systemd[1]: sshd@6-10.0.0.111:22-10.0.0.1:49252.service: Deactivated successfully. May 13 12:51:30.073905 systemd[1]: session-7.scope: Deactivated successfully. May 13 12:51:30.074098 systemd[1]: session-7.scope: Consumed 6.141s CPU time, 226.9M memory peak. May 13 12:51:30.075122 systemd-logind[1509]: Session 7 logged out. Waiting for processes to exit. May 13 12:51:30.076159 systemd-logind[1509]: Removed session 7. May 13 12:51:30.690978 kubelet[2631]: I0513 12:51:30.690938 2631 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 13 12:51:30.691344 containerd[1526]: time="2025-05-13T12:51:30.691230715Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 13 12:51:30.691554 kubelet[2631]: I0513 12:51:30.691417 2631 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 13 12:51:31.666137 kubelet[2631]: I0513 12:51:31.665903 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=6.665867328 podStartE2EDuration="6.665867328s" podCreationTimestamp="2025-05-13 12:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:51:26.276483478 +0000 UTC m=+1.152318136" watchObservedRunningTime="2025-05-13 12:51:31.665867328 +0000 UTC m=+6.541701986" May 13 12:51:31.674112 systemd[1]: Created slice kubepods-besteffort-pod95bc2942_352d_4dbb_a651_a0123b8f4a05.slice - libcontainer container kubepods-besteffort-pod95bc2942_352d_4dbb_a651_a0123b8f4a05.slice. 
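For the static pods above (no image pulls), the reported podStartSLOduration lines up with watchObservedRunningTime minus podCreationTimestamp. A quick check with the kube-controller-manager-localhost timestamps copied from the log; this only verifies the arithmetic, it is not the kubelet's latency tracker:

```go
// Verify 2025-05-13 12:51:26.268131417 - 2025-05-13 12:51:25 = 1.268131417s,
// the podStartSLOduration reported for kube-controller-manager-localhost.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-05-13 12:51:25 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-05-13 12:51:26.268131417 +0000 UTC")
	if err != nil {
		panic(err)
	}
	fmt.Println(observed.Sub(created)) // 1.268131417s
}
```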
May 13 12:51:31.830149 kubelet[2631]: I0513 12:51:31.830070 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/95bc2942-352d-4dbb-a651-a0123b8f4a05-kube-proxy\") pod \"kube-proxy-w2fm2\" (UID: \"95bc2942-352d-4dbb-a651-a0123b8f4a05\") " pod="kube-system/kube-proxy-w2fm2" May 13 12:51:31.830149 kubelet[2631]: I0513 12:51:31.830107 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/95bc2942-352d-4dbb-a651-a0123b8f4a05-xtables-lock\") pod \"kube-proxy-w2fm2\" (UID: \"95bc2942-352d-4dbb-a651-a0123b8f4a05\") " pod="kube-system/kube-proxy-w2fm2" May 13 12:51:31.830149 kubelet[2631]: I0513 12:51:31.830123 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95bc2942-352d-4dbb-a651-a0123b8f4a05-lib-modules\") pod \"kube-proxy-w2fm2\" (UID: \"95bc2942-352d-4dbb-a651-a0123b8f4a05\") " pod="kube-system/kube-proxy-w2fm2" May 13 12:51:31.831358 kubelet[2631]: I0513 12:51:31.831171 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmsz5\" (UniqueName: \"kubernetes.io/projected/95bc2942-352d-4dbb-a651-a0123b8f4a05-kube-api-access-mmsz5\") pod \"kube-proxy-w2fm2\" (UID: \"95bc2942-352d-4dbb-a651-a0123b8f4a05\") " pod="kube-system/kube-proxy-w2fm2" May 13 12:51:31.836660 systemd[1]: Created slice kubepods-besteffort-poddce32633_b036_4cab_a4ea_758e5345eea1.slice - libcontainer container kubepods-besteffort-poddce32633_b036_4cab_a4ea_758e5345eea1.slice. May 13 12:51:31.931616 kubelet[2631]: I0513 12:51:31.931498 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/dce32633-b036-4cab-a4ea-758e5345eea1-var-lib-calico\") pod \"tigera-operator-6f6897fdc5-92mds\" (UID: \"dce32633-b036-4cab-a4ea-758e5345eea1\") " pod="tigera-operator/tigera-operator-6f6897fdc5-92mds" May 13 12:51:31.931616 kubelet[2631]: I0513 12:51:31.931550 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9kvw\" (UniqueName: \"kubernetes.io/projected/dce32633-b036-4cab-a4ea-758e5345eea1-kube-api-access-r9kvw\") pod \"tigera-operator-6f6897fdc5-92mds\" (UID: \"dce32633-b036-4cab-a4ea-758e5345eea1\") " pod="tigera-operator/tigera-operator-6f6897fdc5-92mds" May 13 12:51:31.989398 containerd[1526]: time="2025-05-13T12:51:31.989360395Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-w2fm2,Uid:95bc2942-352d-4dbb-a651-a0123b8f4a05,Namespace:kube-system,Attempt:0,}" May 13 12:51:32.003948 containerd[1526]: time="2025-05-13T12:51:32.003912769Z" level=info msg="connecting to shim fcd1ce48820f143679230af6942b8cee51ea49d59e7f301dc84aaa244e9acf82" address="unix:///run/containerd/s/f14ce8eecb9a8ed75ec94d7efde00b97c7634461e354ed7433e227548091a02b" namespace=k8s.io protocol=ttrpc version=3 May 13 12:51:32.026550 systemd[1]: Started cri-containerd-fcd1ce48820f143679230af6942b8cee51ea49d59e7f301dc84aaa244e9acf82.scope - libcontainer container fcd1ce48820f143679230af6942b8cee51ea49d59e7f301dc84aaa244e9acf82. 
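The kube-proxy pod above has UID 95bc2942-352d-4dbb-a651-a0123b8f4a05, and the slice created for it a few records earlier is kubepods-besteffort-pod95bc2942_352d_4dbb_a651_a0123b8f4a05.slice: the naming pattern visible throughout this log is "kubepods-<qos>-pod<uid>.slice" with the dashes in the UID replaced by underscores. A sketch of that pattern as observed here, not the kubelet's own code:

```go
// Reproduce the systemd slice names seen in the log from QoS class + pod UID.
package main

import (
	"fmt"
	"strings"
)

func podSliceName(qosClass, podUID string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, strings.ReplaceAll(podUID, "-", "_"))
}

func main() {
	fmt.Println(podSliceName("besteffort", "95bc2942-352d-4dbb-a651-a0123b8f4a05"))
	// kubepods-besteffort-pod95bc2942_352d_4dbb_a651_a0123b8f4a05.slice
}
```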
May 13 12:51:32.052853 containerd[1526]: time="2025-05-13T12:51:32.052808585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-w2fm2,Uid:95bc2942-352d-4dbb-a651-a0123b8f4a05,Namespace:kube-system,Attempt:0,} returns sandbox id \"fcd1ce48820f143679230af6942b8cee51ea49d59e7f301dc84aaa244e9acf82\"" May 13 12:51:32.056472 containerd[1526]: time="2025-05-13T12:51:32.056366610Z" level=info msg="CreateContainer within sandbox \"fcd1ce48820f143679230af6942b8cee51ea49d59e7f301dc84aaa244e9acf82\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 13 12:51:32.066271 containerd[1526]: time="2025-05-13T12:51:32.066206860Z" level=info msg="Container f4a901bc29a872cb6603101ce76844214497471ef40eee027dba569ef4440913: CDI devices from CRI Config.CDIDevices: []" May 13 12:51:32.078620 containerd[1526]: time="2025-05-13T12:51:32.078552749Z" level=info msg="CreateContainer within sandbox \"fcd1ce48820f143679230af6942b8cee51ea49d59e7f301dc84aaa244e9acf82\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f4a901bc29a872cb6603101ce76844214497471ef40eee027dba569ef4440913\"" May 13 12:51:32.079263 containerd[1526]: time="2025-05-13T12:51:32.079194504Z" level=info msg="StartContainer for \"f4a901bc29a872cb6603101ce76844214497471ef40eee027dba569ef4440913\"" May 13 12:51:32.080683 containerd[1526]: time="2025-05-13T12:51:32.080654920Z" level=info msg="connecting to shim f4a901bc29a872cb6603101ce76844214497471ef40eee027dba569ef4440913" address="unix:///run/containerd/s/f14ce8eecb9a8ed75ec94d7efde00b97c7634461e354ed7433e227548091a02b" protocol=ttrpc version=3 May 13 12:51:32.102694 systemd[1]: Started cri-containerd-f4a901bc29a872cb6603101ce76844214497471ef40eee027dba569ef4440913.scope - libcontainer container f4a901bc29a872cb6603101ce76844214497471ef40eee027dba569ef4440913. May 13 12:51:32.136471 containerd[1526]: time="2025-05-13T12:51:32.136402769Z" level=info msg="StartContainer for \"f4a901bc29a872cb6603101ce76844214497471ef40eee027dba569ef4440913\" returns successfully" May 13 12:51:32.140072 containerd[1526]: time="2025-05-13T12:51:32.140035462Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-92mds,Uid:dce32633-b036-4cab-a4ea-758e5345eea1,Namespace:tigera-operator,Attempt:0,}" May 13 12:51:32.170242 containerd[1526]: time="2025-05-13T12:51:32.170197206Z" level=info msg="connecting to shim a5e24143a84b5c83095965ff73cba3dc3bfec6d2b2a9574e69020a664a454f03" address="unix:///run/containerd/s/0edb4e285f587060666a607c302d831e67435a31d753ddb516dab6dea614240e" namespace=k8s.io protocol=ttrpc version=3 May 13 12:51:32.194623 systemd[1]: Started cri-containerd-a5e24143a84b5c83095965ff73cba3dc3bfec6d2b2a9574e69020a664a454f03.scope - libcontainer container a5e24143a84b5c83095965ff73cba3dc3bfec6d2b2a9574e69020a664a454f03. 
May 13 12:51:32.230985 containerd[1526]: time="2025-05-13T12:51:32.230871942Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-92mds,Uid:dce32633-b036-4cab-a4ea-758e5345eea1,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"a5e24143a84b5c83095965ff73cba3dc3bfec6d2b2a9574e69020a664a454f03\"" May 13 12:51:32.233921 containerd[1526]: time="2025-05-13T12:51:32.233883447Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 13 12:51:32.259283 kubelet[2631]: I0513 12:51:32.258679 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-w2fm2" podStartSLOduration=1.258660215 podStartE2EDuration="1.258660215s" podCreationTimestamp="2025-05-13 12:51:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:51:32.258089446 +0000 UTC m=+7.133924104" watchObservedRunningTime="2025-05-13 12:51:32.258660215 +0000 UTC m=+7.134494873" May 13 12:51:33.574321 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1959089569.mount: Deactivated successfully. May 13 12:51:33.816209 containerd[1526]: time="2025-05-13T12:51:33.816003031Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:33.816840 containerd[1526]: time="2025-05-13T12:51:33.816801628Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=19323084" May 13 12:51:33.817593 containerd[1526]: time="2025-05-13T12:51:33.817564613Z" level=info msg="ImageCreate event name:\"sha256:27f7c2cfac802523e44ecd16453a4cc992f6c7d610c13054f2715a7cb4370565\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:33.820004 containerd[1526]: time="2025-05-13T12:51:33.819970127Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:33.820799 containerd[1526]: time="2025-05-13T12:51:33.820734832Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:27f7c2cfac802523e44ecd16453a4cc992f6c7d610c13054f2715a7cb4370565\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"19319079\" in 1.58681057s" May 13 12:51:33.820938 containerd[1526]: time="2025-05-13T12:51:33.820765043Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:27f7c2cfac802523e44ecd16453a4cc992f6c7d610c13054f2715a7cb4370565\"" May 13 12:51:33.837967 containerd[1526]: time="2025-05-13T12:51:33.837741609Z" level=info msg="CreateContainer within sandbox \"a5e24143a84b5c83095965ff73cba3dc3bfec6d2b2a9574e69020a664a454f03\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 13 12:51:33.843670 containerd[1526]: time="2025-05-13T12:51:33.843642896Z" level=info msg="Container c31a8fb4e166fc7dc8a125121d057c819f102e74c07d521fbfc2f4d18dc15490: CDI devices from CRI Config.CDIDevices: []" May 13 12:51:33.845878 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount270173983.mount: Deactivated successfully. 
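The tigera/operator pull above reports bytes read=19323084 over a wall time of 1.58681057s, i.e. roughly 12 MB/s of effective pull throughput. The arithmetic, with both numbers copied from the log records above:

```go
// Effective pull rate for quay.io/tigera/operator:v1.36.7 as logged above.
package main

import "fmt"

func main() {
	const bytesRead = 19323084 // "active requests=0, bytes read=19323084"
	const seconds = 1.58681057 // "... in 1.58681057s"
	bps := bytesRead / seconds
	fmt.Printf("%.1f MB/s (%.1f MiB/s)\n", bps/1e6, bps/(1<<20))
}
```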
May 13 12:51:33.849441 containerd[1526]: time="2025-05-13T12:51:33.849397091Z" level=info msg="CreateContainer within sandbox \"a5e24143a84b5c83095965ff73cba3dc3bfec6d2b2a9574e69020a664a454f03\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"c31a8fb4e166fc7dc8a125121d057c819f102e74c07d521fbfc2f4d18dc15490\"" May 13 12:51:33.849797 containerd[1526]: time="2025-05-13T12:51:33.849771781Z" level=info msg="StartContainer for \"c31a8fb4e166fc7dc8a125121d057c819f102e74c07d521fbfc2f4d18dc15490\"" May 13 12:51:33.850512 containerd[1526]: time="2025-05-13T12:51:33.850487349Z" level=info msg="connecting to shim c31a8fb4e166fc7dc8a125121d057c819f102e74c07d521fbfc2f4d18dc15490" address="unix:///run/containerd/s/0edb4e285f587060666a607c302d831e67435a31d753ddb516dab6dea614240e" protocol=ttrpc version=3 May 13 12:51:33.866577 systemd[1]: Started cri-containerd-c31a8fb4e166fc7dc8a125121d057c819f102e74c07d521fbfc2f4d18dc15490.scope - libcontainer container c31a8fb4e166fc7dc8a125121d057c819f102e74c07d521fbfc2f4d18dc15490. May 13 12:51:33.901532 containerd[1526]: time="2025-05-13T12:51:33.901498918Z" level=info msg="StartContainer for \"c31a8fb4e166fc7dc8a125121d057c819f102e74c07d521fbfc2f4d18dc15490\" returns successfully" May 13 12:51:35.748598 kubelet[2631]: I0513 12:51:35.748526 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6f6897fdc5-92mds" podStartSLOduration=3.144004766 podStartE2EDuration="4.748511871s" podCreationTimestamp="2025-05-13 12:51:31 +0000 UTC" firstStartedPulling="2025-05-13 12:51:32.23198239 +0000 UTC m=+7.107817048" lastFinishedPulling="2025-05-13 12:51:33.836489535 +0000 UTC m=+8.712324153" observedRunningTime="2025-05-13 12:51:34.263220854 +0000 UTC m=+9.139055552" watchObservedRunningTime="2025-05-13 12:51:35.748511871 +0000 UTC m=+10.624346529" May 13 12:51:38.195806 systemd[1]: Created slice kubepods-besteffort-pod12f6d267_b24b_4de8_bc46_b496047e25bc.slice - libcontainer container kubepods-besteffort-pod12f6d267_b24b_4de8_bc46_b496047e25bc.slice. May 13 12:51:38.370938 kubelet[2631]: I0513 12:51:38.370892 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hns9p\" (UniqueName: \"kubernetes.io/projected/12f6d267-b24b-4de8-bc46-b496047e25bc-kube-api-access-hns9p\") pod \"calico-typha-6b65fcc5c5-rz2qj\" (UID: \"12f6d267-b24b-4de8-bc46-b496047e25bc\") " pod="calico-system/calico-typha-6b65fcc5c5-rz2qj" May 13 12:51:38.370938 kubelet[2631]: I0513 12:51:38.370935 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12f6d267-b24b-4de8-bc46-b496047e25bc-tigera-ca-bundle\") pod \"calico-typha-6b65fcc5c5-rz2qj\" (UID: \"12f6d267-b24b-4de8-bc46-b496047e25bc\") " pod="calico-system/calico-typha-6b65fcc5c5-rz2qj" May 13 12:51:38.371762 kubelet[2631]: I0513 12:51:38.370951 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/12f6d267-b24b-4de8-bc46-b496047e25bc-typha-certs\") pod \"calico-typha-6b65fcc5c5-rz2qj\" (UID: \"12f6d267-b24b-4de8-bc46-b496047e25bc\") " pod="calico-system/calico-typha-6b65fcc5c5-rz2qj" May 13 12:51:38.375260 systemd[1]: Created slice kubepods-besteffort-podb4db8ca5_7e24_4262_8869_ee737a72c1b5.slice - libcontainer container kubepods-besteffort-podb4db8ca5_7e24_4262_8869_ee737a72c1b5.slice. 
May 13 12:51:38.471175 kubelet[2631]: I0513 12:51:38.471076 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnvs7\" (UniqueName: \"kubernetes.io/projected/b4db8ca5-7e24-4262-8869-ee737a72c1b5-kube-api-access-jnvs7\") pod \"calico-node-nttjr\" (UID: \"b4db8ca5-7e24-4262-8869-ee737a72c1b5\") " pod="calico-system/calico-node-nttjr" May 13 12:51:38.471614 kubelet[2631]: I0513 12:51:38.471594 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b4db8ca5-7e24-4262-8869-ee737a72c1b5-policysync\") pod \"calico-node-nttjr\" (UID: \"b4db8ca5-7e24-4262-8869-ee737a72c1b5\") " pod="calico-system/calico-node-nttjr" May 13 12:51:38.471697 kubelet[2631]: I0513 12:51:38.471683 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b4db8ca5-7e24-4262-8869-ee737a72c1b5-flexvol-driver-host\") pod \"calico-node-nttjr\" (UID: \"b4db8ca5-7e24-4262-8869-ee737a72c1b5\") " pod="calico-system/calico-node-nttjr" May 13 12:51:38.472359 kubelet[2631]: I0513 12:51:38.471754 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4db8ca5-7e24-4262-8869-ee737a72c1b5-lib-modules\") pod \"calico-node-nttjr\" (UID: \"b4db8ca5-7e24-4262-8869-ee737a72c1b5\") " pod="calico-system/calico-node-nttjr" May 13 12:51:38.472359 kubelet[2631]: I0513 12:51:38.471787 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4db8ca5-7e24-4262-8869-ee737a72c1b5-tigera-ca-bundle\") pod \"calico-node-nttjr\" (UID: \"b4db8ca5-7e24-4262-8869-ee737a72c1b5\") " pod="calico-system/calico-node-nttjr" May 13 12:51:38.472359 kubelet[2631]: I0513 12:51:38.471805 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b4db8ca5-7e24-4262-8869-ee737a72c1b5-cni-log-dir\") pod \"calico-node-nttjr\" (UID: \"b4db8ca5-7e24-4262-8869-ee737a72c1b5\") " pod="calico-system/calico-node-nttjr" May 13 12:51:38.472359 kubelet[2631]: I0513 12:51:38.471819 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b4db8ca5-7e24-4262-8869-ee737a72c1b5-xtables-lock\") pod \"calico-node-nttjr\" (UID: \"b4db8ca5-7e24-4262-8869-ee737a72c1b5\") " pod="calico-system/calico-node-nttjr" May 13 12:51:38.472359 kubelet[2631]: I0513 12:51:38.471847 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b4db8ca5-7e24-4262-8869-ee737a72c1b5-var-lib-calico\") pod \"calico-node-nttjr\" (UID: \"b4db8ca5-7e24-4262-8869-ee737a72c1b5\") " pod="calico-system/calico-node-nttjr" May 13 12:51:38.472644 kubelet[2631]: I0513 12:51:38.471874 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b4db8ca5-7e24-4262-8869-ee737a72c1b5-cni-bin-dir\") pod \"calico-node-nttjr\" (UID: \"b4db8ca5-7e24-4262-8869-ee737a72c1b5\") " pod="calico-system/calico-node-nttjr" May 13 12:51:38.472644 kubelet[2631]: I0513 12:51:38.471889 2631 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b4db8ca5-7e24-4262-8869-ee737a72c1b5-cni-net-dir\") pod \"calico-node-nttjr\" (UID: \"b4db8ca5-7e24-4262-8869-ee737a72c1b5\") " pod="calico-system/calico-node-nttjr" May 13 12:51:38.472644 kubelet[2631]: I0513 12:51:38.471903 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b4db8ca5-7e24-4262-8869-ee737a72c1b5-node-certs\") pod \"calico-node-nttjr\" (UID: \"b4db8ca5-7e24-4262-8869-ee737a72c1b5\") " pod="calico-system/calico-node-nttjr" May 13 12:51:38.472644 kubelet[2631]: I0513 12:51:38.471918 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b4db8ca5-7e24-4262-8869-ee737a72c1b5-var-run-calico\") pod \"calico-node-nttjr\" (UID: \"b4db8ca5-7e24-4262-8869-ee737a72c1b5\") " pod="calico-system/calico-node-nttjr" May 13 12:51:38.499602 containerd[1526]: time="2025-05-13T12:51:38.499495379Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b65fcc5c5-rz2qj,Uid:12f6d267-b24b-4de8-bc46-b496047e25bc,Namespace:calico-system,Attempt:0,}" May 13 12:51:38.531398 containerd[1526]: time="2025-05-13T12:51:38.531334496Z" level=info msg="connecting to shim fbce42772f4636a594899ac117d8a0c559653ac9f1722e217c84852e860f9233" address="unix:///run/containerd/s/cf6400af7b515d5190121b3feebbde2455b00d17b386263582d3da702eb8f923" namespace=k8s.io protocol=ttrpc version=3 May 13 12:51:38.571437 kubelet[2631]: E0513 12:51:38.570984 2631 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-grt52" podUID="6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e" May 13 12:51:38.574527 kubelet[2631]: E0513 12:51:38.574493 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.574527 kubelet[2631]: W0513 12:51:38.574516 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.575474 kubelet[2631]: E0513 12:51:38.574639 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.575940 kubelet[2631]: E0513 12:51:38.575570 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.575940 kubelet[2631]: W0513 12:51:38.575589 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.575940 kubelet[2631]: E0513 12:51:38.575613 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.576377 kubelet[2631]: E0513 12:51:38.576226 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.576377 kubelet[2631]: W0513 12:51:38.576247 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.576377 kubelet[2631]: E0513 12:51:38.576279 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.576722 kubelet[2631]: E0513 12:51:38.576688 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.576796 kubelet[2631]: W0513 12:51:38.576783 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.576927 kubelet[2631]: E0513 12:51:38.576896 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.577032 kubelet[2631]: E0513 12:51:38.577019 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.577086 kubelet[2631]: W0513 12:51:38.577077 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.577193 kubelet[2631]: E0513 12:51:38.577173 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.577341 kubelet[2631]: E0513 12:51:38.577328 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.577418 kubelet[2631]: W0513 12:51:38.577395 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.577556 kubelet[2631]: E0513 12:51:38.577537 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.577724 kubelet[2631]: E0513 12:51:38.577711 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.577821 kubelet[2631]: W0513 12:51:38.577773 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.577821 kubelet[2631]: E0513 12:51:38.577814 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.579666 kubelet[2631]: E0513 12:51:38.579489 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.579666 kubelet[2631]: W0513 12:51:38.579509 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.580771 kubelet[2631]: E0513 12:51:38.579723 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.580771 kubelet[2631]: E0513 12:51:38.580033 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.580771 kubelet[2631]: W0513 12:51:38.580045 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.580771 kubelet[2631]: E0513 12:51:38.580129 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.580771 kubelet[2631]: E0513 12:51:38.580481 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.580771 kubelet[2631]: W0513 12:51:38.580494 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.580771 kubelet[2631]: E0513 12:51:38.580568 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.581362 kubelet[2631]: E0513 12:51:38.581008 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.581362 kubelet[2631]: W0513 12:51:38.581021 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.581362 kubelet[2631]: E0513 12:51:38.581059 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.581967 kubelet[2631]: E0513 12:51:38.581614 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.581967 kubelet[2631]: W0513 12:51:38.581630 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.581967 kubelet[2631]: E0513 12:51:38.581681 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.582617 kubelet[2631]: E0513 12:51:38.582109 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.582617 kubelet[2631]: W0513 12:51:38.582123 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.582617 kubelet[2631]: E0513 12:51:38.582182 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.582617 kubelet[2631]: E0513 12:51:38.582266 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.582617 kubelet[2631]: W0513 12:51:38.582273 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.582617 kubelet[2631]: E0513 12:51:38.582451 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.582617 kubelet[2631]: W0513 12:51:38.582460 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.582885 kubelet[2631]: E0513 12:51:38.582842 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.582885 kubelet[2631]: W0513 12:51:38.582856 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.583100 kubelet[2631]: E0513 12:51:38.583089 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.583337 kubelet[2631]: W0513 12:51:38.583193 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.583337 kubelet[2631]: E0513 12:51:38.583328 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.583469 kubelet[2631]: E0513 12:51:38.583375 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.583469 kubelet[2631]: E0513 12:51:38.583396 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.583469 kubelet[2631]: E0513 12:51:38.583435 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.583753 kubelet[2631]: E0513 12:51:38.583650 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.583753 kubelet[2631]: W0513 12:51:38.583666 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.583753 kubelet[2631]: E0513 12:51:38.583736 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.584356 kubelet[2631]: E0513 12:51:38.584058 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.584356 kubelet[2631]: W0513 12:51:38.584072 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.584356 kubelet[2631]: E0513 12:51:38.584112 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.584679 kubelet[2631]: E0513 12:51:38.584533 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.584679 kubelet[2631]: W0513 12:51:38.584549 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.584999 kubelet[2631]: E0513 12:51:38.584899 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.585290 kubelet[2631]: E0513 12:51:38.585275 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.585839 kubelet[2631]: W0513 12:51:38.585351 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.585839 kubelet[2631]: E0513 12:51:38.585453 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.585839 kubelet[2631]: E0513 12:51:38.585561 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.585839 kubelet[2631]: W0513 12:51:38.585570 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.585839 kubelet[2631]: E0513 12:51:38.585648 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.585839 kubelet[2631]: E0513 12:51:38.585754 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.585839 kubelet[2631]: W0513 12:51:38.585762 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.586008 kubelet[2631]: E0513 12:51:38.585958 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.587493 kubelet[2631]: E0513 12:51:38.587471 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.587598 kubelet[2631]: W0513 12:51:38.587583 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.587742 kubelet[2631]: E0513 12:51:38.587729 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.589552 kubelet[2631]: E0513 12:51:38.589440 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.589647 kubelet[2631]: W0513 12:51:38.589632 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.589813 kubelet[2631]: E0513 12:51:38.589782 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.589912 kubelet[2631]: E0513 12:51:38.589900 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.589975 kubelet[2631]: W0513 12:51:38.589964 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.590111 kubelet[2631]: E0513 12:51:38.590088 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.590209 kubelet[2631]: E0513 12:51:38.590198 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.590262 kubelet[2631]: W0513 12:51:38.590253 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.590365 kubelet[2631]: E0513 12:51:38.590347 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.590708 kubelet[2631]: E0513 12:51:38.590595 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.590708 kubelet[2631]: W0513 12:51:38.590610 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.590708 kubelet[2631]: E0513 12:51:38.590646 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.590880 kubelet[2631]: E0513 12:51:38.590867 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.590934 kubelet[2631]: W0513 12:51:38.590924 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.591049 kubelet[2631]: E0513 12:51:38.591026 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.591433 kubelet[2631]: E0513 12:51:38.591218 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.591433 kubelet[2631]: W0513 12:51:38.591229 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.591433 kubelet[2631]: E0513 12:51:38.591240 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.591607 kubelet[2631]: E0513 12:51:38.591593 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.591638 kubelet[2631]: W0513 12:51:38.591607 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.591638 kubelet[2631]: E0513 12:51:38.591619 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.594019 kubelet[2631]: E0513 12:51:38.593994 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.594019 kubelet[2631]: W0513 12:51:38.594012 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.594183 kubelet[2631]: E0513 12:51:38.594028 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.595130 kubelet[2631]: E0513 12:51:38.595111 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.595130 kubelet[2631]: W0513 12:51:38.595128 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.595280 kubelet[2631]: E0513 12:51:38.595141 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.595625 kubelet[2631]: E0513 12:51:38.595608 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.595625 kubelet[2631]: W0513 12:51:38.595625 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.595688 kubelet[2631]: E0513 12:51:38.595639 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.595991 kubelet[2631]: E0513 12:51:38.595976 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.595991 kubelet[2631]: W0513 12:51:38.595989 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.595991 kubelet[2631]: E0513 12:51:38.595999 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.596323 kubelet[2631]: E0513 12:51:38.596310 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.596352 kubelet[2631]: W0513 12:51:38.596324 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.596352 kubelet[2631]: E0513 12:51:38.596334 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.596769 kubelet[2631]: E0513 12:51:38.596731 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.596769 kubelet[2631]: W0513 12:51:38.596746 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.596769 kubelet[2631]: E0513 12:51:38.596756 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.597028 kubelet[2631]: E0513 12:51:38.597004 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.597028 kubelet[2631]: W0513 12:51:38.597018 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.597028 kubelet[2631]: E0513 12:51:38.597027 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.597320 kubelet[2631]: E0513 12:51:38.597302 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.597320 kubelet[2631]: W0513 12:51:38.597317 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.597382 kubelet[2631]: E0513 12:51:38.597327 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.597561 kubelet[2631]: E0513 12:51:38.597545 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.597561 kubelet[2631]: W0513 12:51:38.597558 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.597668 kubelet[2631]: E0513 12:51:38.597568 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.598159 kubelet[2631]: E0513 12:51:38.598142 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.598159 kubelet[2631]: W0513 12:51:38.598158 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.598221 kubelet[2631]: E0513 12:51:38.598169 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.598319 kubelet[2631]: E0513 12:51:38.598308 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.598319 kubelet[2631]: W0513 12:51:38.598319 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.598366 kubelet[2631]: E0513 12:51:38.598328 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.598705 kubelet[2631]: E0513 12:51:38.598682 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.598705 kubelet[2631]: W0513 12:51:38.598698 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.598774 kubelet[2631]: E0513 12:51:38.598718 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.599655 kubelet[2631]: E0513 12:51:38.599624 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.599734 kubelet[2631]: W0513 12:51:38.599642 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.599734 kubelet[2631]: E0513 12:51:38.599673 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.600043 kubelet[2631]: E0513 12:51:38.600027 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.600043 kubelet[2631]: W0513 12:51:38.600042 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.600102 kubelet[2631]: E0513 12:51:38.600058 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.600382 kubelet[2631]: E0513 12:51:38.600358 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.600382 kubelet[2631]: W0513 12:51:38.600380 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.600483 kubelet[2631]: E0513 12:51:38.600451 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.600777 kubelet[2631]: E0513 12:51:38.600762 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.600777 kubelet[2631]: W0513 12:51:38.600777 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.600840 kubelet[2631]: E0513 12:51:38.600788 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.609583 systemd[1]: Started cri-containerd-fbce42772f4636a594899ac117d8a0c559653ac9f1722e217c84852e860f9233.scope - libcontainer container fbce42772f4636a594899ac117d8a0c559653ac9f1722e217c84852e860f9233. May 13 12:51:38.659652 containerd[1526]: time="2025-05-13T12:51:38.659615809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b65fcc5c5-rz2qj,Uid:12f6d267-b24b-4de8-bc46-b496047e25bc,Namespace:calico-system,Attempt:0,} returns sandbox id \"fbce42772f4636a594899ac117d8a0c559653ac9f1722e217c84852e860f9233\"" May 13 12:51:38.662860 containerd[1526]: time="2025-05-13T12:51:38.662839859Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 13 12:51:38.673025 kubelet[2631]: E0513 12:51:38.672992 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.673195 kubelet[2631]: W0513 12:51:38.673107 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.673195 kubelet[2631]: E0513 12:51:38.673128 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.673195 kubelet[2631]: I0513 12:51:38.673157 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e-registration-dir\") pod \"csi-node-driver-grt52\" (UID: \"6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e\") " pod="calico-system/csi-node-driver-grt52" May 13 12:51:38.673389 kubelet[2631]: E0513 12:51:38.673373 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.673389 kubelet[2631]: W0513 12:51:38.673389 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.673476 kubelet[2631]: E0513 12:51:38.673427 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.673575 kubelet[2631]: E0513 12:51:38.673557 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.673575 kubelet[2631]: W0513 12:51:38.673567 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.673638 kubelet[2631]: E0513 12:51:38.673592 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.673777 kubelet[2631]: E0513 12:51:38.673763 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.673833 kubelet[2631]: W0513 12:51:38.673819 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.673894 kubelet[2631]: E0513 12:51:38.673837 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.673920 kubelet[2631]: I0513 12:51:38.673901 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzcms\" (UniqueName: \"kubernetes.io/projected/6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e-kube-api-access-rzcms\") pod \"csi-node-driver-grt52\" (UID: \"6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e\") " pod="calico-system/csi-node-driver-grt52" May 13 12:51:38.674118 kubelet[2631]: E0513 12:51:38.674103 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.674118 kubelet[2631]: W0513 12:51:38.674117 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.674170 kubelet[2631]: E0513 12:51:38.674132 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.674170 kubelet[2631]: I0513 12:51:38.674161 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e-socket-dir\") pod \"csi-node-driver-grt52\" (UID: \"6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e\") " pod="calico-system/csi-node-driver-grt52" May 13 12:51:38.674352 kubelet[2631]: E0513 12:51:38.674340 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.674352 kubelet[2631]: W0513 12:51:38.674352 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.674412 kubelet[2631]: E0513 12:51:38.674366 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.674412 kubelet[2631]: I0513 12:51:38.674385 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e-kubelet-dir\") pod \"csi-node-driver-grt52\" (UID: \"6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e\") " pod="calico-system/csi-node-driver-grt52" May 13 12:51:38.674550 kubelet[2631]: E0513 12:51:38.674538 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.674584 kubelet[2631]: W0513 12:51:38.674550 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.674584 kubelet[2631]: E0513 12:51:38.674562 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.675154 kubelet[2631]: I0513 12:51:38.675115 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e-varrun\") pod \"csi-node-driver-grt52\" (UID: \"6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e\") " pod="calico-system/csi-node-driver-grt52" May 13 12:51:38.675332 kubelet[2631]: E0513 12:51:38.675234 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.675332 kubelet[2631]: W0513 12:51:38.675253 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.675332 kubelet[2631]: E0513 12:51:38.675276 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.675749 kubelet[2631]: E0513 12:51:38.675690 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.675749 kubelet[2631]: W0513 12:51:38.675704 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.676039 kubelet[2631]: E0513 12:51:38.676011 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.676174 kubelet[2631]: E0513 12:51:38.676112 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.676174 kubelet[2631]: W0513 12:51:38.676120 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.676321 kubelet[2631]: E0513 12:51:38.676289 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.676473 kubelet[2631]: E0513 12:51:38.676420 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.676473 kubelet[2631]: W0513 12:51:38.676432 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.676473 kubelet[2631]: E0513 12:51:38.676457 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.676763 kubelet[2631]: E0513 12:51:38.676700 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.676763 kubelet[2631]: W0513 12:51:38.676713 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.676763 kubelet[2631]: E0513 12:51:38.676744 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.676976 kubelet[2631]: E0513 12:51:38.676965 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.677098 kubelet[2631]: W0513 12:51:38.677027 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.677098 kubelet[2631]: E0513 12:51:38.677044 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.677284 kubelet[2631]: E0513 12:51:38.677273 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.677338 kubelet[2631]: W0513 12:51:38.677328 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.677394 kubelet[2631]: E0513 12:51:38.677384 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.677623 kubelet[2631]: E0513 12:51:38.677611 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.677698 kubelet[2631]: W0513 12:51:38.677687 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.677770 kubelet[2631]: E0513 12:51:38.677758 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.679184 containerd[1526]: time="2025-05-13T12:51:38.679158043Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-nttjr,Uid:b4db8ca5-7e24-4262-8869-ee737a72c1b5,Namespace:calico-system,Attempt:0,}" May 13 12:51:38.714471 containerd[1526]: time="2025-05-13T12:51:38.714424504Z" level=info msg="connecting to shim e60821ccf6c9c7c2ef8a18f19deff781a0415ecf24e4d3234172580cd27a82b3" address="unix:///run/containerd/s/2d18e007be70b85cb122868f79379ab64aae7a6f2ba33e76e1f756c39b7434ce" namespace=k8s.io protocol=ttrpc version=3 May 13 12:51:38.739585 systemd[1]: Started cri-containerd-e60821ccf6c9c7c2ef8a18f19deff781a0415ecf24e4d3234172580cd27a82b3.scope - libcontainer container e60821ccf6c9c7c2ef8a18f19deff781a0415ecf24e4d3234172580cd27a82b3. May 13 12:51:38.766727 containerd[1526]: time="2025-05-13T12:51:38.766680926Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-nttjr,Uid:b4db8ca5-7e24-4262-8869-ee737a72c1b5,Namespace:calico-system,Attempt:0,} returns sandbox id \"e60821ccf6c9c7c2ef8a18f19deff781a0415ecf24e4d3234172580cd27a82b3\"" May 13 12:51:38.776318 kubelet[2631]: E0513 12:51:38.776288 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.776318 kubelet[2631]: W0513 12:51:38.776312 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.776541 kubelet[2631]: E0513 12:51:38.776330 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.776541 kubelet[2631]: E0513 12:51:38.776548 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.776612 kubelet[2631]: W0513 12:51:38.776558 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.776612 kubelet[2631]: E0513 12:51:38.776575 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.776752 kubelet[2631]: E0513 12:51:38.776732 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.776752 kubelet[2631]: W0513 12:51:38.776749 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.776857 kubelet[2631]: E0513 12:51:38.776766 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.776955 kubelet[2631]: E0513 12:51:38.776943 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.776955 kubelet[2631]: W0513 12:51:38.776955 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.777011 kubelet[2631]: E0513 12:51:38.776968 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.777142 kubelet[2631]: E0513 12:51:38.777129 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.777142 kubelet[2631]: W0513 12:51:38.777139 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.777209 kubelet[2631]: E0513 12:51:38.777148 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.777320 kubelet[2631]: E0513 12:51:38.777299 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.777320 kubelet[2631]: W0513 12:51:38.777310 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.777365 kubelet[2631]: E0513 12:51:38.777325 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.777476 kubelet[2631]: E0513 12:51:38.777459 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.777476 kubelet[2631]: W0513 12:51:38.777469 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.777535 kubelet[2631]: E0513 12:51:38.777483 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.777609 kubelet[2631]: E0513 12:51:38.777598 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.777609 kubelet[2631]: W0513 12:51:38.777608 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.777669 kubelet[2631]: E0513 12:51:38.777634 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.777772 kubelet[2631]: E0513 12:51:38.777752 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.777772 kubelet[2631]: W0513 12:51:38.777763 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.777835 kubelet[2631]: E0513 12:51:38.777783 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.777907 kubelet[2631]: E0513 12:51:38.777894 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.777907 kubelet[2631]: W0513 12:51:38.777905 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.777973 kubelet[2631]: E0513 12:51:38.777927 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.778033 kubelet[2631]: E0513 12:51:38.778021 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.778033 kubelet[2631]: W0513 12:51:38.778031 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.778102 kubelet[2631]: E0513 12:51:38.778049 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.778150 kubelet[2631]: E0513 12:51:38.778139 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.778150 kubelet[2631]: W0513 12:51:38.778149 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.778213 kubelet[2631]: E0513 12:51:38.778165 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.778311 kubelet[2631]: E0513 12:51:38.778301 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.778311 kubelet[2631]: W0513 12:51:38.778311 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.778386 kubelet[2631]: E0513 12:51:38.778323 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.778461 kubelet[2631]: E0513 12:51:38.778451 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.778461 kubelet[2631]: W0513 12:51:38.778460 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.778510 kubelet[2631]: E0513 12:51:38.778472 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.778635 kubelet[2631]: E0513 12:51:38.778624 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.778635 kubelet[2631]: W0513 12:51:38.778634 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.778701 kubelet[2631]: E0513 12:51:38.778646 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.779020 kubelet[2631]: E0513 12:51:38.779003 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.779093 kubelet[2631]: W0513 12:51:38.779081 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.779236 kubelet[2631]: E0513 12:51:38.779168 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.779402 kubelet[2631]: E0513 12:51:38.779390 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.779484 kubelet[2631]: W0513 12:51:38.779472 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.779572 kubelet[2631]: E0513 12:51:38.779559 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.779808 kubelet[2631]: E0513 12:51:38.779790 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.779808 kubelet[2631]: W0513 12:51:38.779806 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.779924 kubelet[2631]: E0513 12:51:38.779822 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.779978 kubelet[2631]: E0513 12:51:38.779966 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.780079 kubelet[2631]: W0513 12:51:38.779978 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.780079 kubelet[2631]: E0513 12:51:38.780005 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.780130 kubelet[2631]: E0513 12:51:38.780115 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.780130 kubelet[2631]: W0513 12:51:38.780122 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.780210 kubelet[2631]: E0513 12:51:38.780184 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.780278 kubelet[2631]: E0513 12:51:38.780266 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.780278 kubelet[2631]: W0513 12:51:38.780274 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.780328 kubelet[2631]: E0513 12:51:38.780286 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.780466 kubelet[2631]: E0513 12:51:38.780454 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.780466 kubelet[2631]: W0513 12:51:38.780464 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.780517 kubelet[2631]: E0513 12:51:38.780473 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.780655 kubelet[2631]: E0513 12:51:38.780643 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.780655 kubelet[2631]: W0513 12:51:38.780654 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.780715 kubelet[2631]: E0513 12:51:38.780667 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:38.780843 kubelet[2631]: E0513 12:51:38.780830 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.780843 kubelet[2631]: W0513 12:51:38.780841 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.780889 kubelet[2631]: E0513 12:51:38.780849 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.781017 kubelet[2631]: E0513 12:51:38.781005 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.781017 kubelet[2631]: W0513 12:51:38.781015 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.781075 kubelet[2631]: E0513 12:51:38.781023 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:38.788607 kubelet[2631]: E0513 12:51:38.788567 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:38.788607 kubelet[2631]: W0513 12:51:38.788585 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:38.788607 kubelet[2631]: E0513 12:51:38.788598 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:40.146662 containerd[1526]: time="2025-05-13T12:51:40.146619728Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:40.147343 containerd[1526]: time="2025-05-13T12:51:40.147305091Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=28370571" May 13 12:51:40.147966 containerd[1526]: time="2025-05-13T12:51:40.147939641Z" level=info msg="ImageCreate event name:\"sha256:26e730979a07ea7452715da6ac48076016018bc982c06ebd32d5e095f42d3d54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:40.150040 containerd[1526]: time="2025-05-13T12:51:40.149806444Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:40.150564 containerd[1526]: time="2025-05-13T12:51:40.150536778Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:26e730979a07ea7452715da6ac48076016018bc982c06ebd32d5e095f42d3d54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"29739745\" in 1.487496226s" May 13 12:51:40.150714 containerd[1526]: time="2025-05-13T12:51:40.150690134Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:26e730979a07ea7452715da6ac48076016018bc982c06ebd32d5e095f42d3d54\"" May 13 12:51:40.152980 containerd[1526]: time="2025-05-13T12:51:40.152955072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 13 12:51:40.172149 containerd[1526]: time="2025-05-13T12:51:40.172105097Z" level=info msg="CreateContainer within sandbox \"fbce42772f4636a594899ac117d8a0c559653ac9f1722e217c84852e860f9233\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 13 12:51:40.178997 containerd[1526]: time="2025-05-13T12:51:40.178566110Z" level=info msg="Container 4c9d4e77fd95efd5ef4047d3ed5a9087730d1a12ae9d1846db3560cce7c17fe0: CDI devices from CRI Config.CDIDevices: []" May 13 12:51:40.185049 containerd[1526]: time="2025-05-13T12:51:40.185014961Z" level=info msg="CreateContainer within sandbox \"fbce42772f4636a594899ac117d8a0c559653ac9f1722e217c84852e860f9233\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"4c9d4e77fd95efd5ef4047d3ed5a9087730d1a12ae9d1846db3560cce7c17fe0\"" May 13 12:51:40.187105 containerd[1526]: time="2025-05-13T12:51:40.187043562Z" level=info msg="StartContainer for \"4c9d4e77fd95efd5ef4047d3ed5a9087730d1a12ae9d1846db3560cce7c17fe0\"" May 13 12:51:40.194507 containerd[1526]: time="2025-05-13T12:51:40.194471125Z" level=info msg="connecting to shim 4c9d4e77fd95efd5ef4047d3ed5a9087730d1a12ae9d1846db3560cce7c17fe0" address="unix:///run/containerd/s/cf6400af7b515d5190121b3feebbde2455b00d17b386263582d3da702eb8f923" protocol=ttrpc version=3 May 13 12:51:40.215579 systemd[1]: Started cri-containerd-4c9d4e77fd95efd5ef4047d3ed5a9087730d1a12ae9d1846db3560cce7c17fe0.scope - libcontainer container 4c9d4e77fd95efd5ef4047d3ed5a9087730d1a12ae9d1846db3560cce7c17fe0. 
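The kubelet messages repeated above come from its FlexVolume plugin prober: on each probe it scans /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, finds the nodeagent~uds directory, and executes the driver binary uds with the single argument init, expecting a JSON status object on stdout. That binary is not on the node yet (the pod2daemon-flexvol image whose pull starts above is the Calico component that installs it), so the call produces empty output and the kubelet reports "unexpected end of JSON input". The Go sketch below is illustrative only, not the Calico uds driver; it shows the minimal init response a FlexVolume driver is expected to print.

// flexvol_init_sketch.go: illustrative FlexVolume driver stub, not the Calico uds driver.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the JSON object a FlexVolume driver prints on stdout.
type driverStatus struct {
	Status       string          `json:"status"`                 // "Success", "Failure" or "Not supported"
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"` // reported in response to "init"
}

func main() {
	if len(os.Args) < 2 {
		fmt.Println(`{"status":"Failure","message":"no command given"}`)
		os.Exit(1)
	}
	switch os.Args[1] {
	case "init":
		// Printing nothing here is exactly what the kubelet logs as "unexpected end of JSON input".
		out, _ := json.Marshal(driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}})
		fmt.Println(string(out))
	default:
		out, _ := json.Marshal(driverStatus{Status: "Not supported"})
		fmt.Println(string(out))
	}
}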
May 13 12:51:40.225583 kubelet[2631]: E0513 12:51:40.225209 2631 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-grt52" podUID="6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e" May 13 12:51:40.263022 containerd[1526]: time="2025-05-13T12:51:40.262975944Z" level=info msg="StartContainer for \"4c9d4e77fd95efd5ef4047d3ed5a9087730d1a12ae9d1846db3560cce7c17fe0\" returns successfully" May 13 12:51:40.294477 kubelet[2631]: I0513 12:51:40.294400 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6b65fcc5c5-rz2qj" podStartSLOduration=0.803706313 podStartE2EDuration="2.294380877s" podCreationTimestamp="2025-05-13 12:51:38 +0000 UTC" firstStartedPulling="2025-05-13 12:51:38.662150117 +0000 UTC m=+13.537984775" lastFinishedPulling="2025-05-13 12:51:40.152824681 +0000 UTC m=+15.028659339" observedRunningTime="2025-05-13 12:51:40.293321266 +0000 UTC m=+15.169155924" watchObservedRunningTime="2025-05-13 12:51:40.294380877 +0000 UTC m=+15.170215535" May 13 12:51:40.313846 kubelet[2631]: E0513 12:51:40.313729 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:40.313846 kubelet[2631]: W0513 12:51:40.313753 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:40.313846 kubelet[2631]: E0513 12:51:40.313772 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:40.314563 kubelet[2631]: E0513 12:51:40.314550 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:40.314803 kubelet[2631]: W0513 12:51:40.314658 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:40.314803 kubelet[2631]: E0513 12:51:40.314676 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:40.314944 kubelet[2631]: E0513 12:51:40.314912 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:40.315001 kubelet[2631]: W0513 12:51:40.314989 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:40.315061 kubelet[2631]: E0513 12:51:40.315050 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:41.283757 kubelet[2631]: I0513 12:51:41.283571 2631 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 12:51:41.326976 kubelet[2631]: E0513 12:51:41.326920 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.326976 kubelet[2631]: W0513 12:51:41.326941 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.326976 kubelet[2631]: E0513 12:51:41.326959 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:41.327471 kubelet[2631]: E0513 12:51:41.327116 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.327471 kubelet[2631]: W0513 12:51:41.327124 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.327471 kubelet[2631]: E0513 12:51:41.327132 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:41.327471 kubelet[2631]: E0513 12:51:41.327296 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.327471 kubelet[2631]: W0513 12:51:41.327304 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.327471 kubelet[2631]: E0513 12:51:41.327312 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:41.327772 kubelet[2631]: E0513 12:51:41.327561 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.327772 kubelet[2631]: W0513 12:51:41.327571 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.327772 kubelet[2631]: E0513 12:51:41.327580 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:41.328158 kubelet[2631]: E0513 12:51:41.327927 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.328158 kubelet[2631]: W0513 12:51:41.327940 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.328158 kubelet[2631]: E0513 12:51:41.327949 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:41.361560 containerd[1526]: time="2025-05-13T12:51:41.361505034Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:41.361998 containerd[1526]: time="2025-05-13T12:51:41.361953055Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5122903" May 13 12:51:41.363428 containerd[1526]: time="2025-05-13T12:51:41.363371855Z" level=info msg="ImageCreate event name:\"sha256:dd8e710a588cc6f5834c4d84f7e12458efae593d3dfe527ca9e757c89239ecb8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:41.365972 containerd[1526]: time="2025-05-13T12:51:41.365935552Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:41.366660 containerd[1526]: time="2025-05-13T12:51:41.366612665Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:dd8e710a588cc6f5834c4d84f7e12458efae593d3dfe527ca9e757c89239ecb8\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6492045\" in 1.213628546s" May 13 12:51:41.366660 containerd[1526]: time="2025-05-13T12:51:41.366643872Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:dd8e710a588cc6f5834c4d84f7e12458efae593d3dfe527ca9e757c89239ecb8\"" May 13 12:51:41.369567 containerd[1526]: time="2025-05-13T12:51:41.369455026Z" level=info msg="CreateContainer within sandbox \"e60821ccf6c9c7c2ef8a18f19deff781a0415ecf24e4d3234172580cd27a82b3\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 13 12:51:41.377316 containerd[1526]: time="2025-05-13T12:51:41.376674693Z" level=info msg="Container 2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61: CDI devices from CRI Config.CDIDevices: []" May 13 12:51:41.385733 containerd[1526]: time="2025-05-13T12:51:41.385690524Z" level=info msg="CreateContainer within sandbox \"e60821ccf6c9c7c2ef8a18f19deff781a0415ecf24e4d3234172580cd27a82b3\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61\"" May 13 12:51:41.386201 containerd[1526]: time="2025-05-13T12:51:41.386163831Z" level=info msg="StartContainer for \"2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61\"" May 13 12:51:41.388464 containerd[1526]: time="2025-05-13T12:51:41.388402256Z" level=info msg="connecting to shim 2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61" address="unix:///run/containerd/s/2d18e007be70b85cb122868f79379ab64aae7a6f2ba33e76e1f756c39b7434ce" protocol=ttrpc version=3 May 13 12:51:41.403260 kubelet[2631]: E0513 12:51:41.402991 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.403260 kubelet[2631]: W0513 12:51:41.403011 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.403260 
kubelet[2631]: E0513 12:51:41.403029 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:41.403260 kubelet[2631]: E0513 12:51:41.403217 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.403260 kubelet[2631]: W0513 12:51:41.403224 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.403260 kubelet[2631]: E0513 12:51:41.403238 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:41.403700 kubelet[2631]: E0513 12:51:41.403455 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.403700 kubelet[2631]: W0513 12:51:41.403464 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.403700 kubelet[2631]: E0513 12:51:41.403481 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:41.403700 kubelet[2631]: E0513 12:51:41.403669 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.403700 kubelet[2631]: W0513 12:51:41.403678 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.403700 kubelet[2631]: E0513 12:51:41.403693 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:41.404915 kubelet[2631]: E0513 12:51:41.403849 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.404915 kubelet[2631]: W0513 12:51:41.403856 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.404915 kubelet[2631]: E0513 12:51:41.403868 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:51:41.408352 kubelet[2631]: E0513 12:51:41.408342 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.408399 kubelet[2631]: W0513 12:51:41.408362 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.408399 kubelet[2631]: E0513 12:51:41.408377 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:41.408587 kubelet[2631]: E0513 12:51:41.408573 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.408624 kubelet[2631]: W0513 12:51:41.408587 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.408624 kubelet[2631]: E0513 12:51:41.408598 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:41.408761 kubelet[2631]: E0513 12:51:41.408739 2631 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:51:41.408794 kubelet[2631]: W0513 12:51:41.408761 2631 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:51:41.408794 kubelet[2631]: E0513 12:51:41.408771 2631 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:51:41.409596 systemd[1]: Started cri-containerd-2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61.scope - libcontainer container 2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61. May 13 12:51:41.438391 containerd[1526]: time="2025-05-13T12:51:41.438336669Z" level=info msg="StartContainer for \"2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61\" returns successfully" May 13 12:51:41.469667 systemd[1]: cri-containerd-2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61.scope: Deactivated successfully. May 13 12:51:41.470280 systemd[1]: cri-containerd-2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61.scope: Consumed 41ms CPU time, 7.8M memory peak, 6.2M written to disk. 
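The sequence logged above for the flexvol-driver container (PullImage, CreateContainer within the sandbox, connecting to the shim over ttrpc, StartContainer, then the transient systemd scope being deactivated and its 41ms of CPU accounted once the short-lived container exits) is the kubelet driving containerd through the CRI. As a rough standalone sketch only, not how the kubelet itself does it, and assuming the containerd 1.x Go client module path, the default /run/containerd/containerd.sock socket, and the k8s.io namespace used for CRI-managed images, an equivalent pull of the same image would look like this:

// pull_sketch.go: standalone illustration of pulling the image seen in the log above.
package main

import (
	"context"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Connect to the containerd socket on this host (default path assumed).
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed images live in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Pull and unpack the same Calico flexvol image referenced in the log.
	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("pulled %s (digest %s)", img.Name(), img.Target().Digest)
}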
May 13 12:51:41.486466 containerd[1526]: time="2025-05-13T12:51:41.486422506Z" level=info msg="received exit event container_id:\"2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61\" id:\"2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61\" pid:3349 exited_at:{seconds:1747140701 nanos:477007704}" May 13 12:51:41.487395 containerd[1526]: time="2025-05-13T12:51:41.487157031Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61\" id:\"2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61\" pid:3349 exited_at:{seconds:1747140701 nanos:477007704}" May 13 12:51:41.524590 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2c69ce705e4eee7632dbb26e8b25fb899ac8b85876194bf4a544210492896c61-rootfs.mount: Deactivated successfully. May 13 12:51:42.220898 kubelet[2631]: E0513 12:51:42.220856 2631 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-grt52" podUID="6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e" May 13 12:51:42.288644 containerd[1526]: time="2025-05-13T12:51:42.288371241Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 13 12:51:42.456500 update_engine[1511]: I20250513 12:51:42.456434 1511 update_attempter.cc:509] Updating boot flags... May 13 12:51:42.539528 kubelet[2631]: I0513 12:51:42.538192 2631 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 12:51:44.220999 kubelet[2631]: E0513 12:51:44.220940 2631 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-grt52" podUID="6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e" May 13 12:51:46.221422 kubelet[2631]: E0513 12:51:46.221337 2631 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-grt52" podUID="6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e" May 13 12:51:46.578855 containerd[1526]: time="2025-05-13T12:51:46.578809272Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:46.579362 containerd[1526]: time="2025-05-13T12:51:46.579315241Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=91256270" May 13 12:51:46.580142 containerd[1526]: time="2025-05-13T12:51:46.580111901Z" level=info msg="ImageCreate event name:\"sha256:add6372545fb406bb017769f222d84c50549ce13e3b19f1fbaee3d8a4aaef627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:46.582542 containerd[1526]: time="2025-05-13T12:51:46.582477397Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:46.584453 containerd[1526]: time="2025-05-13T12:51:46.584373930Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id 
\"sha256:add6372545fb406bb017769f222d84c50549ce13e3b19f1fbaee3d8a4aaef627\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"92625452\" in 4.295779801s" May 13 12:51:46.584573 containerd[1526]: time="2025-05-13T12:51:46.584454544Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:add6372545fb406bb017769f222d84c50549ce13e3b19f1fbaee3d8a4aaef627\"" May 13 12:51:46.586610 containerd[1526]: time="2025-05-13T12:51:46.586570076Z" level=info msg="CreateContainer within sandbox \"e60821ccf6c9c7c2ef8a18f19deff781a0415ecf24e4d3234172580cd27a82b3\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 13 12:51:46.595742 containerd[1526]: time="2025-05-13T12:51:46.595682758Z" level=info msg="Container b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c: CDI devices from CRI Config.CDIDevices: []" May 13 12:51:46.604053 containerd[1526]: time="2025-05-13T12:51:46.603977537Z" level=info msg="CreateContainer within sandbox \"e60821ccf6c9c7c2ef8a18f19deff781a0415ecf24e4d3234172580cd27a82b3\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c\"" May 13 12:51:46.610237 containerd[1526]: time="2025-05-13T12:51:46.610201191Z" level=info msg="StartContainer for \"b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c\"" May 13 12:51:46.612183 containerd[1526]: time="2025-05-13T12:51:46.612136051Z" level=info msg="connecting to shim b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c" address="unix:///run/containerd/s/2d18e007be70b85cb122868f79379ab64aae7a6f2ba33e76e1f756c39b7434ce" protocol=ttrpc version=3 May 13 12:51:46.634572 systemd[1]: Started cri-containerd-b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c.scope - libcontainer container b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c. May 13 12:51:46.666766 containerd[1526]: time="2025-05-13T12:51:46.666723048Z" level=info msg="StartContainer for \"b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c\" returns successfully" May 13 12:51:47.252419 systemd[1]: cri-containerd-b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c.scope: Deactivated successfully. May 13 12:51:47.252707 systemd[1]: cri-containerd-b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c.scope: Consumed 452ms CPU time, 159.4M memory peak, 4K read from disk, 150.3M written to disk. May 13 12:51:47.261370 containerd[1526]: time="2025-05-13T12:51:47.261329147Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c\" id:\"b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c\" pid:3428 exited_at:{seconds:1747140707 nanos:260790336}" May 13 12:51:47.277099 containerd[1526]: time="2025-05-13T12:51:47.277050383Z" level=info msg="received exit event container_id:\"b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c\" id:\"b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c\" pid:3428 exited_at:{seconds:1747140707 nanos:260790336}" May 13 12:51:47.293781 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b8be3c6cd1868a8bbd925b1ac4a81892e27a1018e902a0c65e8ff3aac0d1f22c-rootfs.mount: Deactivated successfully. 
May 13 12:51:47.321697 kubelet[2631]: I0513 12:51:47.321597 2631 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 13 12:51:47.364152 systemd[1]: Created slice kubepods-burstable-pod7f0ad085_6ec5_4d45_bfe4_5446ac10196b.slice - libcontainer container kubepods-burstable-pod7f0ad085_6ec5_4d45_bfe4_5446ac10196b.slice. May 13 12:51:47.377905 systemd[1]: Created slice kubepods-burstable-pod26e9b4bd_2d18_4eb3_aa85_2b3b1ab905bc.slice - libcontainer container kubepods-burstable-pod26e9b4bd_2d18_4eb3_aa85_2b3b1ab905bc.slice. May 13 12:51:47.388090 systemd[1]: Created slice kubepods-besteffort-pod24a16169_b295_46ad_9302_8b8b82c92f2d.slice - libcontainer container kubepods-besteffort-pod24a16169_b295_46ad_9302_8b8b82c92f2d.slice. May 13 12:51:47.401592 systemd[1]: Created slice kubepods-besteffort-pod7b29b44c_d00d_426c_bf21_d892e750d90b.slice - libcontainer container kubepods-besteffort-pod7b29b44c_d00d_426c_bf21_d892e750d90b.slice. May 13 12:51:47.408466 systemd[1]: Created slice kubepods-besteffort-pod36ff9343_a76a_4f7a_a113_216af1e8e6aa.slice - libcontainer container kubepods-besteffort-pod36ff9343_a76a_4f7a_a113_216af1e8e6aa.slice. May 13 12:51:47.446186 kubelet[2631]: I0513 12:51:47.446137 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/36ff9343-a76a-4f7a-a113-216af1e8e6aa-calico-apiserver-certs\") pod \"calico-apiserver-6f98d9df74-mm9jv\" (UID: \"36ff9343-a76a-4f7a-a113-216af1e8e6aa\") " pod="calico-apiserver/calico-apiserver-6f98d9df74-mm9jv" May 13 12:51:47.446186 kubelet[2631]: I0513 12:51:47.446184 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f0ad085-6ec5-4d45-bfe4-5446ac10196b-config-volume\") pod \"coredns-6f6b679f8f-fq9tz\" (UID: \"7f0ad085-6ec5-4d45-bfe4-5446ac10196b\") " pod="kube-system/coredns-6f6b679f8f-fq9tz" May 13 12:51:47.446344 kubelet[2631]: I0513 12:51:47.446205 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc-config-volume\") pod \"coredns-6f6b679f8f-f7zvz\" (UID: \"26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc\") " pod="kube-system/coredns-6f6b679f8f-f7zvz" May 13 12:51:47.446344 kubelet[2631]: I0513 12:51:47.446232 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc4vz\" (UniqueName: \"kubernetes.io/projected/36ff9343-a76a-4f7a-a113-216af1e8e6aa-kube-api-access-vc4vz\") pod \"calico-apiserver-6f98d9df74-mm9jv\" (UID: \"36ff9343-a76a-4f7a-a113-216af1e8e6aa\") " pod="calico-apiserver/calico-apiserver-6f98d9df74-mm9jv" May 13 12:51:47.446344 kubelet[2631]: I0513 12:51:47.446252 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm8wx\" (UniqueName: \"kubernetes.io/projected/26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc-kube-api-access-lm8wx\") pod \"coredns-6f6b679f8f-f7zvz\" (UID: \"26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc\") " pod="kube-system/coredns-6f6b679f8f-f7zvz" May 13 12:51:47.446344 kubelet[2631]: I0513 12:51:47.446271 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24a16169-b295-46ad-9302-8b8b82c92f2d-tigera-ca-bundle\") pod 
\"calico-kube-controllers-789c54bdb8-t9tw4\" (UID: \"24a16169-b295-46ad-9302-8b8b82c92f2d\") " pod="calico-system/calico-kube-controllers-789c54bdb8-t9tw4" May 13 12:51:47.446470 kubelet[2631]: I0513 12:51:47.446331 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czd95\" (UniqueName: \"kubernetes.io/projected/7b29b44c-d00d-426c-bf21-d892e750d90b-kube-api-access-czd95\") pod \"calico-apiserver-6f98d9df74-4dtwn\" (UID: \"7b29b44c-d00d-426c-bf21-d892e750d90b\") " pod="calico-apiserver/calico-apiserver-6f98d9df74-4dtwn" May 13 12:51:47.446470 kubelet[2631]: I0513 12:51:47.446377 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d4m5\" (UniqueName: \"kubernetes.io/projected/24a16169-b295-46ad-9302-8b8b82c92f2d-kube-api-access-5d4m5\") pod \"calico-kube-controllers-789c54bdb8-t9tw4\" (UID: \"24a16169-b295-46ad-9302-8b8b82c92f2d\") " pod="calico-system/calico-kube-controllers-789c54bdb8-t9tw4" May 13 12:51:47.446470 kubelet[2631]: I0513 12:51:47.446452 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xnvj\" (UniqueName: \"kubernetes.io/projected/7f0ad085-6ec5-4d45-bfe4-5446ac10196b-kube-api-access-9xnvj\") pod \"coredns-6f6b679f8f-fq9tz\" (UID: \"7f0ad085-6ec5-4d45-bfe4-5446ac10196b\") " pod="kube-system/coredns-6f6b679f8f-fq9tz" May 13 12:51:47.446540 kubelet[2631]: I0513 12:51:47.446491 2631 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7b29b44c-d00d-426c-bf21-d892e750d90b-calico-apiserver-certs\") pod \"calico-apiserver-6f98d9df74-4dtwn\" (UID: \"7b29b44c-d00d-426c-bf21-d892e750d90b\") " pod="calico-apiserver/calico-apiserver-6f98d9df74-4dtwn" May 13 12:51:47.674276 containerd[1526]: time="2025-05-13T12:51:47.674236664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-fq9tz,Uid:7f0ad085-6ec5-4d45-bfe4-5446ac10196b,Namespace:kube-system,Attempt:0,}" May 13 12:51:47.681968 containerd[1526]: time="2025-05-13T12:51:47.681936955Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-f7zvz,Uid:26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc,Namespace:kube-system,Attempt:0,}" May 13 12:51:47.706360 containerd[1526]: time="2025-05-13T12:51:47.701972835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-789c54bdb8-t9tw4,Uid:24a16169-b295-46ad-9302-8b8b82c92f2d,Namespace:calico-system,Attempt:0,}" May 13 12:51:47.710259 containerd[1526]: time="2025-05-13T12:51:47.707012800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f98d9df74-4dtwn,Uid:7b29b44c-d00d-426c-bf21-d892e750d90b,Namespace:calico-apiserver,Attempt:0,}" May 13 12:51:47.725337 containerd[1526]: time="2025-05-13T12:51:47.718773852Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f98d9df74-mm9jv,Uid:36ff9343-a76a-4f7a-a113-216af1e8e6aa,Namespace:calico-apiserver,Attempt:0,}" May 13 12:51:48.056468 containerd[1526]: time="2025-05-13T12:51:48.056421344Z" level=error msg="Failed to destroy network for sandbox \"c4299bcb34521abb4ad86741d998bf34a24a497b317ec23e88308af48e6560e9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.057999 
containerd[1526]: time="2025-05-13T12:51:48.057955550Z" level=error msg="Failed to destroy network for sandbox \"5cae5b672eef4f2b447f19739a6cd8225b3bb4eb2d862f26e045c2d48be3554d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.058717 containerd[1526]: time="2025-05-13T12:51:48.058655582Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-fq9tz,Uid:7f0ad085-6ec5-4d45-bfe4-5446ac10196b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4299bcb34521abb4ad86741d998bf34a24a497b317ec23e88308af48e6560e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.059816 containerd[1526]: time="2025-05-13T12:51:48.059722633Z" level=error msg="Failed to destroy network for sandbox \"52075e4196cf24331b43f26f197480ee676ed0327560b881c3fb9f10f7503295\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.062673 containerd[1526]: time="2025-05-13T12:51:48.062625137Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-f7zvz,Uid:26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cae5b672eef4f2b447f19739a6cd8225b3bb4eb2d862f26e045c2d48be3554d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.063012 containerd[1526]: time="2025-05-13T12:51:48.062985355Z" level=error msg="Failed to destroy network for sandbox \"5818f9479c654340f9accd46d3729db555776222f315bdf27873dc44b260a4e2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.063695 kubelet[2631]: E0513 12:51:48.063529 2631 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cae5b672eef4f2b447f19739a6cd8225b3bb4eb2d862f26e045c2d48be3554d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.063695 kubelet[2631]: E0513 12:51:48.063620 2631 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cae5b672eef4f2b447f19739a6cd8225b3bb4eb2d862f26e045c2d48be3554d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-f7zvz" May 13 12:51:48.063695 kubelet[2631]: E0513 12:51:48.063641 2631 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cae5b672eef4f2b447f19739a6cd8225b3bb4eb2d862f26e045c2d48be3554d\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-f7zvz" May 13 12:51:48.063864 kubelet[2631]: E0513 12:51:48.063787 2631 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-f7zvz_kube-system(26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-f7zvz_kube-system(26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5cae5b672eef4f2b447f19739a6cd8225b3bb4eb2d862f26e045c2d48be3554d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-f7zvz" podUID="26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc" May 13 12:51:48.063913 containerd[1526]: time="2025-05-13T12:51:48.063709991Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-789c54bdb8-t9tw4,Uid:24a16169-b295-46ad-9302-8b8b82c92f2d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"52075e4196cf24331b43f26f197480ee676ed0327560b881c3fb9f10f7503295\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.064008 kubelet[2631]: E0513 12:51:48.063961 2631 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4299bcb34521abb4ad86741d998bf34a24a497b317ec23e88308af48e6560e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.064042 kubelet[2631]: E0513 12:51:48.064021 2631 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4299bcb34521abb4ad86741d998bf34a24a497b317ec23e88308af48e6560e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-fq9tz" May 13 12:51:48.064067 kubelet[2631]: E0513 12:51:48.064039 2631 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4299bcb34521abb4ad86741d998bf34a24a497b317ec23e88308af48e6560e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-fq9tz" May 13 12:51:48.064088 kubelet[2631]: E0513 12:51:48.064067 2631 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-fq9tz_kube-system(7f0ad085-6ec5-4d45-bfe4-5446ac10196b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-fq9tz_kube-system(7f0ad085-6ec5-4d45-bfe4-5446ac10196b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c4299bcb34521abb4ad86741d998bf34a24a497b317ec23e88308af48e6560e9\\\": plugin type=\\\"calico\\\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-fq9tz" podUID="7f0ad085-6ec5-4d45-bfe4-5446ac10196b" May 13 12:51:48.065442 kubelet[2631]: E0513 12:51:48.064379 2631 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52075e4196cf24331b43f26f197480ee676ed0327560b881c3fb9f10f7503295\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.065442 kubelet[2631]: E0513 12:51:48.064458 2631 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52075e4196cf24331b43f26f197480ee676ed0327560b881c3fb9f10f7503295\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-789c54bdb8-t9tw4" May 13 12:51:48.065442 kubelet[2631]: E0513 12:51:48.064475 2631 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52075e4196cf24331b43f26f197480ee676ed0327560b881c3fb9f10f7503295\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-789c54bdb8-t9tw4" May 13 12:51:48.065609 containerd[1526]: time="2025-05-13T12:51:48.064905702Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f98d9df74-mm9jv,Uid:36ff9343-a76a-4f7a-a113-216af1e8e6aa,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5818f9479c654340f9accd46d3729db555776222f315bdf27873dc44b260a4e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.065658 kubelet[2631]: E0513 12:51:48.064524 2631 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-789c54bdb8-t9tw4_calico-system(24a16169-b295-46ad-9302-8b8b82c92f2d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-789c54bdb8-t9tw4_calico-system(24a16169-b295-46ad-9302-8b8b82c92f2d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"52075e4196cf24331b43f26f197480ee676ed0327560b881c3fb9f10f7503295\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-789c54bdb8-t9tw4" podUID="24a16169-b295-46ad-9302-8b8b82c92f2d" May 13 12:51:48.065658 kubelet[2631]: E0513 12:51:48.065034 2631 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5818f9479c654340f9accd46d3729db555776222f315bdf27873dc44b260a4e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" May 13 12:51:48.065658 kubelet[2631]: E0513 12:51:48.065066 2631 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5818f9479c654340f9accd46d3729db555776222f315bdf27873dc44b260a4e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f98d9df74-mm9jv" May 13 12:51:48.065743 kubelet[2631]: E0513 12:51:48.065080 2631 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5818f9479c654340f9accd46d3729db555776222f315bdf27873dc44b260a4e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f98d9df74-mm9jv" May 13 12:51:48.065743 kubelet[2631]: E0513 12:51:48.065113 2631 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6f98d9df74-mm9jv_calico-apiserver(36ff9343-a76a-4f7a-a113-216af1e8e6aa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6f98d9df74-mm9jv_calico-apiserver(36ff9343-a76a-4f7a-a113-216af1e8e6aa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5818f9479c654340f9accd46d3729db555776222f315bdf27873dc44b260a4e2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6f98d9df74-mm9jv" podUID="36ff9343-a76a-4f7a-a113-216af1e8e6aa" May 13 12:51:48.068872 containerd[1526]: time="2025-05-13T12:51:48.068837252Z" level=error msg="Failed to destroy network for sandbox \"5d85a20f0a30f50758012bfe8ad0f8a2eadb1a3d171121e5360d2fbff390e3f0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.070308 containerd[1526]: time="2025-05-13T12:51:48.070273402Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f98d9df74-4dtwn,Uid:7b29b44c-d00d-426c-bf21-d892e750d90b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d85a20f0a30f50758012bfe8ad0f8a2eadb1a3d171121e5360d2fbff390e3f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.070645 kubelet[2631]: E0513 12:51:48.070497 2631 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d85a20f0a30f50758012bfe8ad0f8a2eadb1a3d171121e5360d2fbff390e3f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.070645 kubelet[2631]: E0513 12:51:48.070541 2631 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"5d85a20f0a30f50758012bfe8ad0f8a2eadb1a3d171121e5360d2fbff390e3f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f98d9df74-4dtwn" May 13 12:51:48.070645 kubelet[2631]: E0513 12:51:48.070556 2631 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d85a20f0a30f50758012bfe8ad0f8a2eadb1a3d171121e5360d2fbff390e3f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f98d9df74-4dtwn" May 13 12:51:48.070745 kubelet[2631]: E0513 12:51:48.070607 2631 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6f98d9df74-4dtwn_calico-apiserver(7b29b44c-d00d-426c-bf21-d892e750d90b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6f98d9df74-4dtwn_calico-apiserver(7b29b44c-d00d-426c-bf21-d892e750d90b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5d85a20f0a30f50758012bfe8ad0f8a2eadb1a3d171121e5360d2fbff390e3f0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6f98d9df74-4dtwn" podUID="7b29b44c-d00d-426c-bf21-d892e750d90b" May 13 12:51:48.226533 systemd[1]: Created slice kubepods-besteffort-pod6b738dcd_3dcf_4d6f_8e7e_e0a90a81033e.slice - libcontainer container kubepods-besteffort-pod6b738dcd_3dcf_4d6f_8e7e_e0a90a81033e.slice. 
May 13 12:51:48.228729 containerd[1526]: time="2025-05-13T12:51:48.228696519Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-grt52,Uid:6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e,Namespace:calico-system,Attempt:0,}" May 13 12:51:48.269855 containerd[1526]: time="2025-05-13T12:51:48.269799537Z" level=error msg="Failed to destroy network for sandbox \"54dd763df12e9f837f3fdf3ffa04da37010682bfc5025cc692c9a0f8575d5dd5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.270912 containerd[1526]: time="2025-05-13T12:51:48.270877430Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-grt52,Uid:6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"54dd763df12e9f837f3fdf3ffa04da37010682bfc5025cc692c9a0f8575d5dd5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.271122 kubelet[2631]: E0513 12:51:48.271087 2631 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"54dd763df12e9f837f3fdf3ffa04da37010682bfc5025cc692c9a0f8575d5dd5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:51:48.271178 kubelet[2631]: E0513 12:51:48.271144 2631 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"54dd763df12e9f837f3fdf3ffa04da37010682bfc5025cc692c9a0f8575d5dd5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-grt52" May 13 12:51:48.271178 kubelet[2631]: E0513 12:51:48.271164 2631 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"54dd763df12e9f837f3fdf3ffa04da37010682bfc5025cc692c9a0f8575d5dd5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-grt52" May 13 12:51:48.271234 kubelet[2631]: E0513 12:51:48.271207 2631 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-grt52_calico-system(6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-grt52_calico-system(6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"54dd763df12e9f837f3fdf3ffa04da37010682bfc5025cc692c9a0f8575d5dd5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-grt52" podUID="6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e" May 13 12:51:48.308440 containerd[1526]: time="2025-05-13T12:51:48.307281017Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.29.3\"" May 13 12:51:48.595507 systemd[1]: run-netns-cni\x2d79e19c4f\x2d7c95\x2db798\x2d4051\x2d2133cc137811.mount: Deactivated successfully. May 13 12:51:48.595867 systemd[1]: run-netns-cni\x2d60e26524\x2da029\x2d195f\x2db07c\x2d8c6a159ecc92.mount: Deactivated successfully. May 13 12:51:48.596014 systemd[1]: run-netns-cni\x2d2df75b6d\x2d5c36\x2d98e3\x2d8ee3\x2d582539fbcd61.mount: Deactivated successfully. May 13 12:51:48.596140 systemd[1]: run-netns-cni\x2d107acf4c\x2d1e91\x2dec27\x2defb7\x2d30abd1fe2cde.mount: Deactivated successfully. May 13 12:51:52.059808 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount318846598.mount: Deactivated successfully. May 13 12:51:52.127397 containerd[1526]: time="2025-05-13T12:51:52.127324525Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=138981893" May 13 12:51:52.141548 containerd[1526]: time="2025-05-13T12:51:52.141505626Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:52.144457 containerd[1526]: time="2025-05-13T12:51:52.144394773Z" level=info msg="ImageCreate event name:\"sha256:cdcce3ec4624a24c28cdc07b0ee29ddf6703628edee7452a3f8a8b4816bfd057\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:52.145031 containerd[1526]: time="2025-05-13T12:51:52.144977571Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:51:52.145822 containerd[1526]: time="2025-05-13T12:51:52.145483359Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:cdcce3ec4624a24c28cdc07b0ee29ddf6703628edee7452a3f8a8b4816bfd057\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"138981755\" in 3.838166217s" May 13 12:51:52.145822 containerd[1526]: time="2025-05-13T12:51:52.145511563Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:cdcce3ec4624a24c28cdc07b0ee29ddf6703628edee7452a3f8a8b4816bfd057\"" May 13 12:51:52.156506 containerd[1526]: time="2025-05-13T12:51:52.156174192Z" level=info msg="CreateContainer within sandbox \"e60821ccf6c9c7c2ef8a18f19deff781a0415ecf24e4d3234172580cd27a82b3\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 13 12:51:52.178859 containerd[1526]: time="2025-05-13T12:51:52.178319280Z" level=info msg="Container 0f2c7eabb891183ac25bb31bda3240ae16fd4435964e9ba475895a73d0a209ff: CDI devices from CRI Config.CDIDevices: []" May 13 12:51:52.190163 containerd[1526]: time="2025-05-13T12:51:52.190107940Z" level=info msg="CreateContainer within sandbox \"e60821ccf6c9c7c2ef8a18f19deff781a0415ecf24e4d3234172580cd27a82b3\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"0f2c7eabb891183ac25bb31bda3240ae16fd4435964e9ba475895a73d0a209ff\"" May 13 12:51:52.190580 containerd[1526]: time="2025-05-13T12:51:52.190543279Z" level=info msg="StartContainer for \"0f2c7eabb891183ac25bb31bda3240ae16fd4435964e9ba475895a73d0a209ff\"" May 13 12:51:52.192291 containerd[1526]: time="2025-05-13T12:51:52.192248427Z" level=info msg="connecting to shim 0f2c7eabb891183ac25bb31bda3240ae16fd4435964e9ba475895a73d0a209ff" 
address="unix:///run/containerd/s/2d18e007be70b85cb122868f79379ab64aae7a6f2ba33e76e1f756c39b7434ce" protocol=ttrpc version=3 May 13 12:51:52.219596 systemd[1]: Started cri-containerd-0f2c7eabb891183ac25bb31bda3240ae16fd4435964e9ba475895a73d0a209ff.scope - libcontainer container 0f2c7eabb891183ac25bb31bda3240ae16fd4435964e9ba475895a73d0a209ff. May 13 12:51:52.263437 containerd[1526]: time="2025-05-13T12:51:52.263374241Z" level=info msg="StartContainer for \"0f2c7eabb891183ac25bb31bda3240ae16fd4435964e9ba475895a73d0a209ff\" returns successfully" May 13 12:51:52.337132 kubelet[2631]: I0513 12:51:52.336926 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-nttjr" podStartSLOduration=0.958511036 podStartE2EDuration="14.336910457s" podCreationTimestamp="2025-05-13 12:51:38 +0000 UTC" firstStartedPulling="2025-05-13 12:51:38.767903648 +0000 UTC m=+13.643738306" lastFinishedPulling="2025-05-13 12:51:52.146303029 +0000 UTC m=+27.022137727" observedRunningTime="2025-05-13 12:51:52.334901308 +0000 UTC m=+27.210735966" watchObservedRunningTime="2025-05-13 12:51:52.336910457 +0000 UTC m=+27.212745195" May 13 12:51:52.442707 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 13 12:51:52.442810 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 13 12:51:53.319143 kubelet[2631]: I0513 12:51:53.319060 2631 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 12:51:53.992791 systemd-networkd[1444]: vxlan.calico: Link UP May 13 12:51:53.992800 systemd-networkd[1444]: vxlan.calico: Gained carrier May 13 12:51:55.495583 systemd-networkd[1444]: vxlan.calico: Gained IPv6LL May 13 12:51:56.151647 systemd[1]: Started sshd@7-10.0.0.111:22-10.0.0.1:35556.service - OpenSSH per-connection server daemon (10.0.0.1:35556). May 13 12:51:56.213163 sshd[3962]: Accepted publickey for core from 10.0.0.1 port 35556 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:51:56.214479 sshd-session[3962]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:51:56.218882 systemd-logind[1509]: New session 8 of user core. May 13 12:51:56.223547 systemd[1]: Started session-8.scope - Session 8 of User core. May 13 12:51:56.348258 sshd[3964]: Connection closed by 10.0.0.1 port 35556 May 13 12:51:56.348753 sshd-session[3962]: pam_unix(sshd:session): session closed for user core May 13 12:51:56.352105 systemd[1]: sshd@7-10.0.0.111:22-10.0.0.1:35556.service: Deactivated successfully. May 13 12:51:56.353944 systemd[1]: session-8.scope: Deactivated successfully. May 13 12:51:56.354573 systemd-logind[1509]: Session 8 logged out. Waiting for processes to exit. May 13 12:51:56.356001 systemd-logind[1509]: Removed session 8. 
May 13 12:51:58.222218 containerd[1526]: time="2025-05-13T12:51:58.222031698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-f7zvz,Uid:26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc,Namespace:kube-system,Attempt:0,}" May 13 12:51:58.464760 systemd-networkd[1444]: cali15d33e38612: Link UP May 13 12:51:58.464999 systemd-networkd[1444]: cali15d33e38612: Gained carrier May 13 12:51:58.481643 containerd[1526]: 2025-05-13 12:51:58.309 [INFO][3979] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0 coredns-6f6b679f8f- kube-system 26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc 700 0 2025-05-13 12:51:31 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-f7zvz eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali15d33e38612 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" Namespace="kube-system" Pod="coredns-6f6b679f8f-f7zvz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--f7zvz-" May 13 12:51:58.481643 containerd[1526]: 2025-05-13 12:51:58.311 [INFO][3979] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" Namespace="kube-system" Pod="coredns-6f6b679f8f-f7zvz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0" May 13 12:51:58.481643 containerd[1526]: 2025-05-13 12:51:58.416 [INFO][3994] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" HandleID="k8s-pod-network.fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" Workload="localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0" May 13 12:51:58.482134 containerd[1526]: 2025-05-13 12:51:58.432 [INFO][3994] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" HandleID="k8s-pod-network.fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" Workload="localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003231f0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-f7zvz", "timestamp":"2025-05-13 12:51:58.416377676 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 12:51:58.482134 containerd[1526]: 2025-05-13 12:51:58.432 [INFO][3994] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:51:58.482134 containerd[1526]: 2025-05-13 12:51:58.432 [INFO][3994] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 12:51:58.482134 containerd[1526]: 2025-05-13 12:51:58.432 [INFO][3994] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 12:51:58.482134 containerd[1526]: 2025-05-13 12:51:58.434 [INFO][3994] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" host="localhost" May 13 12:51:58.482134 containerd[1526]: 2025-05-13 12:51:58.439 [INFO][3994] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 12:51:58.482134 containerd[1526]: 2025-05-13 12:51:58.443 [INFO][3994] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 12:51:58.482134 containerd[1526]: 2025-05-13 12:51:58.445 [INFO][3994] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 12:51:58.482134 containerd[1526]: 2025-05-13 12:51:58.446 [INFO][3994] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 12:51:58.482134 containerd[1526]: 2025-05-13 12:51:58.447 [INFO][3994] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" host="localhost" May 13 12:51:58.482358 containerd[1526]: 2025-05-13 12:51:58.448 [INFO][3994] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f May 13 12:51:58.482358 containerd[1526]: 2025-05-13 12:51:58.453 [INFO][3994] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" host="localhost" May 13 12:51:58.482358 containerd[1526]: 2025-05-13 12:51:58.457 [INFO][3994] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" host="localhost" May 13 12:51:58.482358 containerd[1526]: 2025-05-13 12:51:58.457 [INFO][3994] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" host="localhost" May 13 12:51:58.482358 containerd[1526]: 2025-05-13 12:51:58.457 [INFO][3994] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
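
The ipam.go lines above are Calico's per-host block allocator at work: under the host-wide IPAM lock it looks up the host's block affinities, confirms 192.168.88.128/26 is affine to "localhost", loads the block, claims the first free address (192.168.88.129) under a handle named after the pod network, and writes the block back. A condensed sketch of that loop; every type and helper in it is invented to mirror the logged steps and is not Calico's real API:

    package main

    import (
        "fmt"
        "net"
        "sync"
    )

    // block is a toy stand-in for a Calico IPAM allocation block such as
    // 192.168.88.128/26: a CIDR, the host it is affine to, and a used map.
    type block struct {
        cidr *net.IPNet
        host string
        used map[string]string // IP -> handle
    }

    var hostLock sync.Mutex // models the "host-wide IPAM lock" in the log

    // autoAssign walks the same steps the log records: check the host's
    // affine block, then hand out the next free address under a handle.
    func autoAssign(b *block, host, handle string) (net.IP, error) {
        hostLock.Lock()
        defer hostLock.Unlock()

        if b.host != host {
            return nil, fmt.Errorf("no affine block for host %q", host)
        }
        // Scan the block for the first unused address, skipping the network address.
        base := b.cidr.IP.Mask(b.cidr.Mask)
        for i := 1; i < 64; i++ {
            cand := make(net.IP, len(base))
            copy(cand, base)
            cand[len(cand)-1] += byte(i)
            if !b.cidr.Contains(cand) {
                break
            }
            if _, taken := b.used[cand.String()]; !taken {
                b.used[cand.String()] = handle // "Writing block in order to claim IPs"
                return cand, nil
            }
        }
        return nil, fmt.Errorf("block %s exhausted", b.cidr)
    }

    func main() {
        _, cidr, _ := net.ParseCIDR("192.168.88.128/26")
        b := &block{cidr: cidr, host: "localhost", used: map[string]string{}}
        ip, err := autoAssign(b, "localhost", "k8s-pod-network.fc47e2e3...")
        fmt.Println(ip, err) // 192.168.88.129 <nil>, matching the address claimed above
    }
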
May 13 12:51:58.482358 containerd[1526]: 2025-05-13 12:51:58.457 [INFO][3994] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" HandleID="k8s-pod-network.fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" Workload="localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0" May 13 12:51:58.482597 containerd[1526]: 2025-05-13 12:51:58.460 [INFO][3979] cni-plugin/k8s.go 386: Populated endpoint ContainerID="fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" Namespace="kube-system" Pod="coredns-6f6b679f8f-f7zvz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc", ResourceVersion:"700", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 51, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-f7zvz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali15d33e38612", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:51:58.482670 containerd[1526]: 2025-05-13 12:51:58.460 [INFO][3979] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" Namespace="kube-system" Pod="coredns-6f6b679f8f-f7zvz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0" May 13 12:51:58.482670 containerd[1526]: 2025-05-13 12:51:58.460 [INFO][3979] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali15d33e38612 ContainerID="fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" Namespace="kube-system" Pod="coredns-6f6b679f8f-f7zvz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0" May 13 12:51:58.482670 containerd[1526]: 2025-05-13 12:51:58.466 [INFO][3979] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" Namespace="kube-system" Pod="coredns-6f6b679f8f-f7zvz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0" May 13 12:51:58.482818 containerd[1526]: 2025-05-13 12:51:58.467 
[INFO][3979] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" Namespace="kube-system" Pod="coredns-6f6b679f8f-f7zvz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc", ResourceVersion:"700", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 51, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f", Pod:"coredns-6f6b679f8f-f7zvz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali15d33e38612", MAC:"f2:fb:0a:8b:b5:67", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:51:58.482818 containerd[1526]: 2025-05-13 12:51:58.478 [INFO][3979] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" Namespace="kube-system" Pod="coredns-6f6b679f8f-f7zvz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--f7zvz-eth0" May 13 12:51:58.538638 containerd[1526]: time="2025-05-13T12:51:58.538588870Z" level=info msg="connecting to shim fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f" address="unix:///run/containerd/s/1201c2de1e72aa805fe3bbf208ece550524b4fed785035feed20fd4dd959093a" namespace=k8s.io protocol=ttrpc version=3 May 13 12:51:58.563594 systemd[1]: Started cri-containerd-fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f.scope - libcontainer container fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f. 
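
One small reading aid for the WorkloadEndpoint dumps above: the port numbers are printed in Go's hex notation, so Port:0x35 is 53 (the dns and dns-tcp ports) and Port:0x23c1 is 9153 (the coredns metrics port), while the MAC f2:fb:0a:8b:b5:67 is the one assigned to the cali15d33e38612 host-side veth. A one-liner to confirm the conversion:

    package main

    import "fmt"

    func main() {
        // Ports from the WorkloadEndpoint dump, which Go prints in hex (%#v style).
        fmt.Println(0x35, 0x23c1) // 53 9153: coredns DNS and metrics ports
    }
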
May 13 12:51:58.575205 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 12:51:58.606659 containerd[1526]: time="2025-05-13T12:51:58.606606658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-f7zvz,Uid:26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc,Namespace:kube-system,Attempt:0,} returns sandbox id \"fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f\"" May 13 12:51:58.630691 containerd[1526]: time="2025-05-13T12:51:58.630633317Z" level=info msg="CreateContainer within sandbox \"fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 12:51:58.640124 containerd[1526]: time="2025-05-13T12:51:58.639720117Z" level=info msg="Container 80ad74444d8b7a105ebac55ccbddaab821bc315509ca58ffe3e13b54e51de467: CDI devices from CRI Config.CDIDevices: []" May 13 12:51:58.641232 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2225463486.mount: Deactivated successfully. May 13 12:51:58.645657 containerd[1526]: time="2025-05-13T12:51:58.645613140Z" level=info msg="CreateContainer within sandbox \"fc47e2e3c252ea9e60427d35e597dc04e9087457352c8608fce0234b35ffd67f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"80ad74444d8b7a105ebac55ccbddaab821bc315509ca58ffe3e13b54e51de467\"" May 13 12:51:58.646121 containerd[1526]: time="2025-05-13T12:51:58.646094151Z" level=info msg="StartContainer for \"80ad74444d8b7a105ebac55ccbddaab821bc315509ca58ffe3e13b54e51de467\"" May 13 12:51:58.647125 containerd[1526]: time="2025-05-13T12:51:58.647093816Z" level=info msg="connecting to shim 80ad74444d8b7a105ebac55ccbddaab821bc315509ca58ffe3e13b54e51de467" address="unix:///run/containerd/s/1201c2de1e72aa805fe3bbf208ece550524b4fed785035feed20fd4dd959093a" protocol=ttrpc version=3 May 13 12:51:58.666279 systemd[1]: Started cri-containerd-80ad74444d8b7a105ebac55ccbddaab821bc315509ca58ffe3e13b54e51de467.scope - libcontainer container 80ad74444d8b7a105ebac55ccbddaab821bc315509ca58ffe3e13b54e51de467. 
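
kubelet drives the sequence above over the CRI gRPC API on containerd's socket: RunPodSandbox returns the sandbox id fc47e2e3..., then CreateContainer and StartContainer reference it, which is exactly the ordering of the containerd messages. A stripped-down sketch of those three calls; the pod and container configs (including the image reference) are minimal placeholders, not what kubelet actually sends:

    package main

    import (
        "context"
        "log"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()
        rt := runtimeapi.NewRuntimeServiceClient(conn)

        ctx, cancel := context.WithTimeout(context.Background(), 2*time.Minute)
        defer cancel()

        // RunPodSandbox: containerd creates the sandbox and, via the CNI plugin,
        // its network, returning the sandbox id quoted in the log.
        sandbox, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{
            Config: &runtimeapi.PodSandboxConfig{
                Metadata: &runtimeapi.PodSandboxMetadata{
                    Name:      "coredns-6f6b679f8f-f7zvz",
                    Namespace: "kube-system",
                    Uid:       "26e9b4bd-2d18-4eb3-aa85-2b3b1ab905bc",
                },
            },
        })
        if err != nil {
            log.Fatal(err)
        }

        // CreateContainer / StartContainer reference that sandbox id.
        created, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
            PodSandboxId: sandbox.PodSandboxId,
            Config: &runtimeapi.ContainerConfig{
                Metadata: &runtimeapi.ContainerMetadata{Name: "coredns"},
                Image:    &runtimeapi.ImageSpec{Image: "registry.k8s.io/coredns/coredns:v1.11.1"}, // placeholder
            },
            SandboxConfig: &runtimeapi.PodSandboxConfig{},
        })
        if err != nil {
            log.Fatal(err)
        }
        if _, err := rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{ContainerId: created.ContainerId}); err != nil {
            log.Fatal(err)
        }
        log.Printf("started %s in sandbox %s", created.ContainerId, sandbox.PodSandboxId)
    }
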
May 13 12:51:58.694204 containerd[1526]: time="2025-05-13T12:51:58.694146869Z" level=info msg="StartContainer for \"80ad74444d8b7a105ebac55ccbddaab821bc315509ca58ffe3e13b54e51de467\" returns successfully" May 13 12:51:59.222376 containerd[1526]: time="2025-05-13T12:51:59.222272970Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-789c54bdb8-t9tw4,Uid:24a16169-b295-46ad-9302-8b8b82c92f2d,Namespace:calico-system,Attempt:0,}" May 13 12:51:59.322191 systemd-networkd[1444]: cali05ad3b95710: Link UP May 13 12:51:59.322379 systemd-networkd[1444]: cali05ad3b95710: Gained carrier May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.255 [INFO][4104] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0 calico-kube-controllers-789c54bdb8- calico-system 24a16169-b295-46ad-9302-8b8b82c92f2d 702 0 2025-05-13 12:51:38 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:789c54bdb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-789c54bdb8-t9tw4 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali05ad3b95710 [] []}} ContainerID="6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" Namespace="calico-system" Pod="calico-kube-controllers-789c54bdb8-t9tw4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-" May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.255 [INFO][4104] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" Namespace="calico-system" Pod="calico-kube-controllers-789c54bdb8-t9tw4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0" May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.282 [INFO][4118] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" HandleID="k8s-pod-network.6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" Workload="localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0" May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.292 [INFO][4118] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" HandleID="k8s-pod-network.6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" Workload="localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002f3c70), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-789c54bdb8-t9tw4", "timestamp":"2025-05-13 12:51:59.282396618 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.292 [INFO][4118] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.292 [INFO][4118] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.292 [INFO][4118] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.294 [INFO][4118] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" host="localhost" May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.298 [INFO][4118] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.303 [INFO][4118] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.305 [INFO][4118] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.307 [INFO][4118] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.307 [INFO][4118] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" host="localhost" May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.309 [INFO][4118] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.312 [INFO][4118] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" host="localhost" May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.317 [INFO][4118] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" host="localhost" May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.317 [INFO][4118] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" host="localhost" May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.318 [INFO][4118] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 12:51:59.334981 containerd[1526]: 2025-05-13 12:51:59.318 [INFO][4118] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" HandleID="k8s-pod-network.6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" Workload="localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0" May 13 12:51:59.335616 containerd[1526]: 2025-05-13 12:51:59.320 [INFO][4104] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" Namespace="calico-system" Pod="calico-kube-controllers-789c54bdb8-t9tw4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0", GenerateName:"calico-kube-controllers-789c54bdb8-", Namespace:"calico-system", SelfLink:"", UID:"24a16169-b295-46ad-9302-8b8b82c92f2d", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 51, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"789c54bdb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-789c54bdb8-t9tw4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali05ad3b95710", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:51:59.335616 containerd[1526]: 2025-05-13 12:51:59.320 [INFO][4104] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" Namespace="calico-system" Pod="calico-kube-controllers-789c54bdb8-t9tw4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0" May 13 12:51:59.335616 containerd[1526]: 2025-05-13 12:51:59.320 [INFO][4104] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali05ad3b95710 ContainerID="6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" Namespace="calico-system" Pod="calico-kube-controllers-789c54bdb8-t9tw4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0" May 13 12:51:59.335616 containerd[1526]: 2025-05-13 12:51:59.321 [INFO][4104] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" Namespace="calico-system" Pod="calico-kube-controllers-789c54bdb8-t9tw4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0" May 13 12:51:59.335616 containerd[1526]: 2025-05-13 12:51:59.321 [INFO][4104] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID 
to endpoint ContainerID="6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" Namespace="calico-system" Pod="calico-kube-controllers-789c54bdb8-t9tw4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0", GenerateName:"calico-kube-controllers-789c54bdb8-", Namespace:"calico-system", SelfLink:"", UID:"24a16169-b295-46ad-9302-8b8b82c92f2d", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 51, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"789c54bdb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c", Pod:"calico-kube-controllers-789c54bdb8-t9tw4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali05ad3b95710", MAC:"46:56:40:87:c8:67", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:51:59.335616 containerd[1526]: 2025-05-13 12:51:59.333 [INFO][4104] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" Namespace="calico-system" Pod="calico-kube-controllers-789c54bdb8-t9tw4" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--789c54bdb8--t9tw4-eth0" May 13 12:51:59.361127 containerd[1526]: time="2025-05-13T12:51:59.360986068Z" level=info msg="connecting to shim 6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c" address="unix:///run/containerd/s/889c6d56471ccfe9f04932358a178e501bedf8efa0c8e811b757553c0219c65c" namespace=k8s.io protocol=ttrpc version=3 May 13 12:51:59.366874 kubelet[2631]: I0513 12:51:59.366530 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-f7zvz" podStartSLOduration=28.366518632000002 podStartE2EDuration="28.366518632s" podCreationTimestamp="2025-05-13 12:51:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:51:59.366197359 +0000 UTC m=+34.242032017" watchObservedRunningTime="2025-05-13 12:51:59.366518632 +0000 UTC m=+34.242353290" May 13 12:51:59.397564 systemd[1]: Started cri-containerd-6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c.scope - libcontainer container 6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c. 
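The records above trace one complete Calico CNI ADD for the calico-kube-controllers sandbox: the IPAM plugin takes the host-wide lock, confirms this node's affinity for the 192.168.88.128/26 block, claims 192.168.88.130/26, and the CNI plugin then writes the finished WorkloadEndpoint (veth cali05ad3b95710, MAC 46:56:40:87:c8:67) to the datastore before containerd connects to the sandbox shim. As an illustrative aid only (not a Calico or containerd tool; the file name and the regular expression are assumptions derived from the line format visible above), here is a short Go sketch that scans journal output like this and reports which IPv4 addresses the IPAM plugin said it assigned to which container:

```go
// extract_ipam_assignments.go — illustrative only: summarize which IPv4 addresses
// Calico IPAM reported assigning, by scanning journal lines in the format above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

// Matches the "ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses ..."
// records seen in this log; the exact layout is an assumption taken from those lines.
var assignedRe = regexp.MustCompile(
	`Calico CNI IPAM assigned addresses IPv4=\[([^\]]*)\] IPv6=\[[^\]]*\] ContainerID="([0-9a-f]+)"`)

func main() {
	byContainer := map[string]string{} // sandbox/container ID -> IPv4 CIDRs as logged

	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines here are very long
	for sc.Scan() {
		for _, m := range assignedRe.FindAllStringSubmatch(sc.Text(), -1) {
			byContainer[m[2]] = m[1]
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "reading input:", err)
		os.Exit(1)
	}

	ids := make([]string, 0, len(byContainer))
	for id := range byContainer {
		ids = append(ids, id)
	}
	sort.Strings(ids)
	for _, id := range ids {
		fmt.Printf("%.12s...  %s\n", id, byContainer[id])
	}
}
```

Fed this section's journal lines on stdin, it would print 6f3bb4f49e38...  192.168.88.130/26, with 192.168.88.131/26 through 192.168.88.134/26 appearing as the later sandboxes below are networked.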
May 13 12:51:59.415496 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 12:51:59.436392 containerd[1526]: time="2025-05-13T12:51:59.436357150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-789c54bdb8-t9tw4,Uid:24a16169-b295-46ad-9302-8b8b82c92f2d,Namespace:calico-system,Attempt:0,} returns sandbox id \"6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c\"" May 13 12:51:59.438152 containerd[1526]: time="2025-05-13T12:51:59.438112369Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 13 12:51:59.527688 systemd-networkd[1444]: cali15d33e38612: Gained IPv6LL May 13 12:52:00.221733 containerd[1526]: time="2025-05-13T12:52:00.221693426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f98d9df74-4dtwn,Uid:7b29b44c-d00d-426c-bf21-d892e750d90b,Namespace:calico-apiserver,Attempt:0,}" May 13 12:52:00.336104 systemd-networkd[1444]: cali12832e08699: Link UP May 13 12:52:00.336708 systemd-networkd[1444]: cali12832e08699: Gained carrier May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.259 [INFO][4192] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0 calico-apiserver-6f98d9df74- calico-apiserver 7b29b44c-d00d-426c-bf21-d892e750d90b 701 0 2025-05-13 12:51:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6f98d9df74 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6f98d9df74-4dtwn eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali12832e08699 [] []}} ContainerID="2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-4dtwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-" May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.259 [INFO][4192] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-4dtwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0" May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.290 [INFO][4209] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" HandleID="k8s-pod-network.2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" Workload="localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0" May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.302 [INFO][4209] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" HandleID="k8s-pod-network.2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" Workload="localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400027add0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6f98d9df74-4dtwn", "timestamp":"2025-05-13 12:52:00.290669334 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.302 [INFO][4209] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.302 [INFO][4209] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.302 [INFO][4209] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.304 [INFO][4209] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" host="localhost" May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.308 [INFO][4209] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.313 [INFO][4209] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.315 [INFO][4209] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.317 [INFO][4209] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.317 [INFO][4209] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" host="localhost" May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.319 [INFO][4209] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987 May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.325 [INFO][4209] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" host="localhost" May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.331 [INFO][4209] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" host="localhost" May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.331 [INFO][4209] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" host="localhost" May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.331 [INFO][4209] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 12:52:00.350888 containerd[1526]: 2025-05-13 12:52:00.331 [INFO][4209] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" HandleID="k8s-pod-network.2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" Workload="localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0" May 13 12:52:00.351627 containerd[1526]: 2025-05-13 12:52:00.333 [INFO][4192] cni-plugin/k8s.go 386: Populated endpoint ContainerID="2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-4dtwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0", GenerateName:"calico-apiserver-6f98d9df74-", Namespace:"calico-apiserver", SelfLink:"", UID:"7b29b44c-d00d-426c-bf21-d892e750d90b", ResourceVersion:"701", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 51, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f98d9df74", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6f98d9df74-4dtwn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali12832e08699", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:52:00.351627 containerd[1526]: 2025-05-13 12:52:00.333 [INFO][4192] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-4dtwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0" May 13 12:52:00.351627 containerd[1526]: 2025-05-13 12:52:00.333 [INFO][4192] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali12832e08699 ContainerID="2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-4dtwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0" May 13 12:52:00.351627 containerd[1526]: 2025-05-13 12:52:00.336 [INFO][4192] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-4dtwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0" May 13 12:52:00.351627 containerd[1526]: 2025-05-13 12:52:00.337 [INFO][4192] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-4dtwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0", GenerateName:"calico-apiserver-6f98d9df74-", Namespace:"calico-apiserver", SelfLink:"", UID:"7b29b44c-d00d-426c-bf21-d892e750d90b", ResourceVersion:"701", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 51, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f98d9df74", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987", Pod:"calico-apiserver-6f98d9df74-4dtwn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali12832e08699", MAC:"8a:e9:97:de:89:6d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:52:00.351627 containerd[1526]: 2025-05-13 12:52:00.347 [INFO][4192] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-4dtwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--4dtwn-eth0" May 13 12:52:00.375139 containerd[1526]: time="2025-05-13T12:52:00.375088682Z" level=info msg="connecting to shim 2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987" address="unix:///run/containerd/s/d6a746adb0cd4c0bf02e321af619452bde683c2772f59528b38714bbc5aac8f6" namespace=k8s.io protocol=ttrpc version=3 May 13 12:52:00.401640 systemd[1]: Started cri-containerd-2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987.scope - libcontainer container 2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987. 
May 13 12:52:00.416104 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 12:52:00.423575 systemd-networkd[1444]: cali05ad3b95710: Gained IPv6LL May 13 12:52:00.443174 containerd[1526]: time="2025-05-13T12:52:00.443112856Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f98d9df74-4dtwn,Uid:7b29b44c-d00d-426c-bf21-d892e750d90b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987\"" May 13 12:52:01.045395 containerd[1526]: time="2025-05-13T12:52:01.045351978Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=32554116" May 13 12:52:01.047919 containerd[1526]: time="2025-05-13T12:52:01.047860377Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:01.049073 containerd[1526]: time="2025-05-13T12:52:01.048935839Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:ec7c64189a2fd01b24b044fea1840d441e9884a0df32c2e9d6982cfbbea1f814\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"33923266\" in 1.610785906s" May 13 12:52:01.049073 containerd[1526]: time="2025-05-13T12:52:01.048970922Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:ec7c64189a2fd01b24b044fea1840d441e9884a0df32c2e9d6982cfbbea1f814\"" May 13 12:52:01.049073 containerd[1526]: time="2025-05-13T12:52:01.049053610Z" level=info msg="ImageCreate event name:\"sha256:ec7c64189a2fd01b24b044fea1840d441e9884a0df32c2e9d6982cfbbea1f814\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:01.049904 containerd[1526]: time="2025-05-13T12:52:01.049855927Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:01.050825 containerd[1526]: time="2025-05-13T12:52:01.050785135Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 12:52:01.058686 containerd[1526]: time="2025-05-13T12:52:01.058641122Z" level=info msg="CreateContainer within sandbox \"6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 13 12:52:01.066154 containerd[1526]: time="2025-05-13T12:52:01.066101672Z" level=info msg="Container c354e26a6afe38e295ac26f7d04dd262191b4d774f3565c0fbe72424094a4be7: CDI devices from CRI Config.CDIDevices: []" May 13 12:52:01.073038 containerd[1526]: time="2025-05-13T12:52:01.072938522Z" level=info msg="CreateContainer within sandbox \"6f3bb4f49e38c212bd3ffc02e195bd7f2ece2d4fc63dbfbaa76c9d4426423d2c\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"c354e26a6afe38e295ac26f7d04dd262191b4d774f3565c0fbe72424094a4be7\"" May 13 12:52:01.073660 containerd[1526]: time="2025-05-13T12:52:01.073567462Z" level=info msg="StartContainer for \"c354e26a6afe38e295ac26f7d04dd262191b4d774f3565c0fbe72424094a4be7\"" May 13 12:52:01.076033 containerd[1526]: time="2025-05-13T12:52:01.075993053Z" level=info msg="connecting 
to shim c354e26a6afe38e295ac26f7d04dd262191b4d774f3565c0fbe72424094a4be7" address="unix:///run/containerd/s/889c6d56471ccfe9f04932358a178e501bedf8efa0c8e811b757553c0219c65c" protocol=ttrpc version=3 May 13 12:52:01.093636 systemd[1]: Started cri-containerd-c354e26a6afe38e295ac26f7d04dd262191b4d774f3565c0fbe72424094a4be7.scope - libcontainer container c354e26a6afe38e295ac26f7d04dd262191b4d774f3565c0fbe72424094a4be7. May 13 12:52:01.143985 containerd[1526]: time="2025-05-13T12:52:01.143945716Z" level=info msg="StartContainer for \"c354e26a6afe38e295ac26f7d04dd262191b4d774f3565c0fbe72424094a4be7\" returns successfully" May 13 12:52:01.221635 containerd[1526]: time="2025-05-13T12:52:01.221570419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-grt52,Uid:6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e,Namespace:calico-system,Attempt:0,}" May 13 12:52:01.222966 containerd[1526]: time="2025-05-13T12:52:01.222930989Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f98d9df74-mm9jv,Uid:36ff9343-a76a-4f7a-a113-216af1e8e6aa,Namespace:calico-apiserver,Attempt:0,}" May 13 12:52:01.378842 systemd[1]: Started sshd@8-10.0.0.111:22-10.0.0.1:35560.service - OpenSSH per-connection server daemon (10.0.0.1:35560). May 13 12:52:01.389304 systemd-networkd[1444]: calie8baf67aee9: Link UP May 13 12:52:01.390604 systemd-networkd[1444]: calie8baf67aee9: Gained carrier May 13 12:52:01.398632 kubelet[2631]: I0513 12:52:01.398392 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-789c54bdb8-t9tw4" podStartSLOduration=21.78594477 podStartE2EDuration="23.398371076s" podCreationTimestamp="2025-05-13 12:51:38 +0000 UTC" firstStartedPulling="2025-05-13 12:51:59.437684565 +0000 UTC m=+34.313519223" lastFinishedPulling="2025-05-13 12:52:01.050110831 +0000 UTC m=+35.925945529" observedRunningTime="2025-05-13 12:52:01.395636376 +0000 UTC m=+36.271471034" watchObservedRunningTime="2025-05-13 12:52:01.398371076 +0000 UTC m=+36.274205734" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.272 [INFO][4312] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--grt52-eth0 csi-node-driver- calico-system 6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e 614 0 2025-05-13 12:51:38 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5bcd8f69 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-grt52 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calie8baf67aee9 [] []}} ContainerID="571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" Namespace="calico-system" Pod="csi-node-driver-grt52" WorkloadEndpoint="localhost-k8s-csi--node--driver--grt52-" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.273 [INFO][4312] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" Namespace="calico-system" Pod="csi-node-driver-grt52" WorkloadEndpoint="localhost-k8s-csi--node--driver--grt52-eth0" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.306 [INFO][4338] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" HandleID="k8s-pod-network.571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" Workload="localhost-k8s-csi--node--driver--grt52-eth0" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.321 [INFO][4338] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" HandleID="k8s-pod-network.571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" Workload="localhost-k8s-csi--node--driver--grt52-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003aa810), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-grt52", "timestamp":"2025-05-13 12:52:01.306127022 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.321 [INFO][4338] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.321 [INFO][4338] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.321 [INFO][4338] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.323 [INFO][4338] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" host="localhost" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.328 [INFO][4338] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.333 [INFO][4338] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.344 [INFO][4338] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.347 [INFO][4338] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.347 [INFO][4338] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" host="localhost" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.350 [INFO][4338] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.359 [INFO][4338] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" host="localhost" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.373 [INFO][4338] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" host="localhost" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.373 [INFO][4338] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] 
handle="k8s-pod-network.571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" host="localhost" May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.373 [INFO][4338] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 12:52:01.408770 containerd[1526]: 2025-05-13 12:52:01.373 [INFO][4338] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" HandleID="k8s-pod-network.571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" Workload="localhost-k8s-csi--node--driver--grt52-eth0" May 13 12:52:01.409505 containerd[1526]: 2025-05-13 12:52:01.383 [INFO][4312] cni-plugin/k8s.go 386: Populated endpoint ContainerID="571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" Namespace="calico-system" Pod="csi-node-driver-grt52" WorkloadEndpoint="localhost-k8s-csi--node--driver--grt52-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--grt52-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e", ResourceVersion:"614", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 51, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-grt52", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie8baf67aee9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:52:01.409505 containerd[1526]: 2025-05-13 12:52:01.383 [INFO][4312] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" Namespace="calico-system" Pod="csi-node-driver-grt52" WorkloadEndpoint="localhost-k8s-csi--node--driver--grt52-eth0" May 13 12:52:01.409505 containerd[1526]: 2025-05-13 12:52:01.383 [INFO][4312] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie8baf67aee9 ContainerID="571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" Namespace="calico-system" Pod="csi-node-driver-grt52" WorkloadEndpoint="localhost-k8s-csi--node--driver--grt52-eth0" May 13 12:52:01.409505 containerd[1526]: 2025-05-13 12:52:01.391 [INFO][4312] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" Namespace="calico-system" Pod="csi-node-driver-grt52" WorkloadEndpoint="localhost-k8s-csi--node--driver--grt52-eth0" May 13 12:52:01.409505 containerd[1526]: 2025-05-13 12:52:01.391 [INFO][4312] cni-plugin/k8s.go 414: Added Mac, interface 
name, and active container ID to endpoint ContainerID="571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" Namespace="calico-system" Pod="csi-node-driver-grt52" WorkloadEndpoint="localhost-k8s-csi--node--driver--grt52-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--grt52-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e", ResourceVersion:"614", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 51, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d", Pod:"csi-node-driver-grt52", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie8baf67aee9", MAC:"c2:12:b9:e2:95:54", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:52:01.409505 containerd[1526]: 2025-05-13 12:52:01.405 [INFO][4312] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" Namespace="calico-system" Pod="csi-node-driver-grt52" WorkloadEndpoint="localhost-k8s-csi--node--driver--grt52-eth0" May 13 12:52:01.438596 containerd[1526]: time="2025-05-13T12:52:01.438536736Z" level=info msg="connecting to shim 571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d" address="unix:///run/containerd/s/78c0f2f248e9fa5af7dc6f3d29cd20442b2312f9c576253a375d5e230d583096" namespace=k8s.io protocol=ttrpc version=3 May 13 12:52:01.455467 sshd[4358]: Accepted publickey for core from 10.0.0.1 port 35560 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:52:01.459073 sshd-session[4358]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:52:01.469650 systemd[1]: Started cri-containerd-571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d.scope - libcontainer container 571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d. May 13 12:52:01.470391 systemd-networkd[1444]: cali59e4bbca861: Link UP May 13 12:52:01.470560 systemd-networkd[1444]: cali59e4bbca861: Gained carrier May 13 12:52:01.475854 systemd-logind[1509]: New session 9 of user core. May 13 12:52:01.476678 systemd[1]: Started session-9.scope - Session 9 of User core. 
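A few records above, kubelet's pod_startup_latency_tracker reports calico-kube-controllers-789c54bdb8-t9tw4 with podStartSLOduration=21.78594477 and podStartE2EDuration="23.398371076s", together with firstStartedPulling/lastFinishedPulling timestamps. The gap between the two durations matches the image-pull window, consistent with the SLO figure excluding pull time. A minimal check of that arithmetic, with the values copied verbatim from that record (the layout string is the one Go's time.Time.String() produces, which is the format used there):

```go
// check_startup_latency.go — sanity-check the kubelet pod_startup_latency_tracker
// record above: the E2E/SLO gap should match the image-pull window.
package main

import (
	"fmt"
	"time"
)

// mustParse uses the layout that time.Time.String() produces, which is the
// timestamp format in the kubelet record above.
func mustParse(s string) time.Time {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the pod_startup_latency_tracker record above.
	first := mustParse("2025-05-13 12:51:59.437684565 +0000 UTC") // firstStartedPulling
	last := mustParse("2025-05-13 12:52:01.050110831 +0000 UTC")  // lastFinishedPulling
	e2e := 23.398371076 // podStartE2EDuration, seconds
	slo := 21.78594477  // podStartSLOduration, seconds

	pullWindow := last.Sub(first).Seconds()
	fmt.Printf("image-pull window: %.9fs\n", pullWindow) // ≈ 1.612426266
	fmt.Printf("E2E - SLO:         %.9fs\n", e2e-slo)    // ≈ 1.612426306
}
```

Both figures come out at roughly 1.6124263 s; the ~40 ns disagreement is just rounding in the logged values.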
May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.297 [INFO][4327] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0 calico-apiserver-6f98d9df74- calico-apiserver 36ff9343-a76a-4f7a-a113-216af1e8e6aa 703 0 2025-05-13 12:51:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6f98d9df74 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6f98d9df74-mm9jv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali59e4bbca861 [] []}} ContainerID="aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-mm9jv" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-" May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.297 [INFO][4327] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-mm9jv" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0" May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.330 [INFO][4348] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" HandleID="k8s-pod-network.aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" Workload="localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0" May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.420 [INFO][4348] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" HandleID="k8s-pod-network.aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" Workload="localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c0b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6f98d9df74-mm9jv", "timestamp":"2025-05-13 12:52:01.330824531 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.420 [INFO][4348] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.420 [INFO][4348] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.420 [INFO][4348] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.424 [INFO][4348] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" host="localhost" May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.431 [INFO][4348] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.438 [INFO][4348] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.441 [INFO][4348] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.447 [INFO][4348] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.447 [INFO][4348] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" host="localhost" May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.449 [INFO][4348] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.454 [INFO][4348] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" host="localhost" May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.463 [INFO][4348] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" host="localhost" May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.463 [INFO][4348] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" host="localhost" May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.463 [INFO][4348] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 12:52:01.487516 containerd[1526]: 2025-05-13 12:52:01.463 [INFO][4348] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" HandleID="k8s-pod-network.aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" Workload="localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0" May 13 12:52:01.488051 containerd[1526]: 2025-05-13 12:52:01.468 [INFO][4327] cni-plugin/k8s.go 386: Populated endpoint ContainerID="aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-mm9jv" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0", GenerateName:"calico-apiserver-6f98d9df74-", Namespace:"calico-apiserver", SelfLink:"", UID:"36ff9343-a76a-4f7a-a113-216af1e8e6aa", ResourceVersion:"703", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 51, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f98d9df74", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6f98d9df74-mm9jv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali59e4bbca861", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:52:01.488051 containerd[1526]: 2025-05-13 12:52:01.468 [INFO][4327] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-mm9jv" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0" May 13 12:52:01.488051 containerd[1526]: 2025-05-13 12:52:01.468 [INFO][4327] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali59e4bbca861 ContainerID="aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-mm9jv" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0" May 13 12:52:01.488051 containerd[1526]: 2025-05-13 12:52:01.470 [INFO][4327] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-mm9jv" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0" May 13 12:52:01.488051 containerd[1526]: 2025-05-13 12:52:01.470 [INFO][4327] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-mm9jv" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0", GenerateName:"calico-apiserver-6f98d9df74-", Namespace:"calico-apiserver", SelfLink:"", UID:"36ff9343-a76a-4f7a-a113-216af1e8e6aa", ResourceVersion:"703", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 51, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f98d9df74", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f", Pod:"calico-apiserver-6f98d9df74-mm9jv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali59e4bbca861", MAC:"d6:22:e7:fc:4b:b5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:52:01.488051 containerd[1526]: 2025-05-13 12:52:01.484 [INFO][4327] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" Namespace="calico-apiserver" Pod="calico-apiserver-6f98d9df74-mm9jv" WorkloadEndpoint="localhost-k8s-calico--apiserver--6f98d9df74--mm9jv-eth0" May 13 12:52:01.508282 containerd[1526]: time="2025-05-13T12:52:01.508237886Z" level=info msg="connecting to shim aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f" address="unix:///run/containerd/s/7c51eda4a04ee45ffdb1142dc51c43a5e0473beee8f76e0c651a3177ad044838" namespace=k8s.io protocol=ttrpc version=3 May 13 12:52:01.509299 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 12:52:01.525905 containerd[1526]: time="2025-05-13T12:52:01.525842080Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-grt52,Uid:6b738dcd-3dcf-4d6f-8e7e-e0a90a81033e,Namespace:calico-system,Attempt:0,} returns sandbox id \"571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d\"" May 13 12:52:01.544619 systemd[1]: Started cri-containerd-aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f.scope - libcontainer container aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f. 
May 13 12:52:01.569046 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 12:52:01.595422 containerd[1526]: time="2025-05-13T12:52:01.595369894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f98d9df74-mm9jv,Uid:36ff9343-a76a-4f7a-a113-216af1e8e6aa,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f\"" May 13 12:52:01.622070 sshd[4415]: Connection closed by 10.0.0.1 port 35560 May 13 12:52:01.622648 sshd-session[4358]: pam_unix(sshd:session): session closed for user core May 13 12:52:01.626553 systemd-logind[1509]: Session 9 logged out. Waiting for processes to exit. May 13 12:52:01.626866 systemd[1]: sshd@8-10.0.0.111:22-10.0.0.1:35560.service: Deactivated successfully. May 13 12:52:01.628741 systemd[1]: session-9.scope: Deactivated successfully. May 13 12:52:01.630998 systemd-logind[1509]: Removed session 9. May 13 12:52:02.215792 systemd-networkd[1444]: cali12832e08699: Gained IPv6LL May 13 12:52:02.385638 kubelet[2631]: I0513 12:52:02.385603 2631 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 12:52:02.863235 containerd[1526]: time="2025-05-13T12:52:02.863188898Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:02.864121 containerd[1526]: time="2025-05-13T12:52:02.864018374Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=40247603" May 13 12:52:02.864821 containerd[1526]: time="2025-05-13T12:52:02.864774924Z" level=info msg="ImageCreate event name:\"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:02.867232 containerd[1526]: time="2025-05-13T12:52:02.867171865Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:02.867884 containerd[1526]: time="2025-05-13T12:52:02.867847847Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"41616801\" in 1.817030189s" May 13 12:52:02.867884 containerd[1526]: time="2025-05-13T12:52:02.867879650Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\"" May 13 12:52:02.869022 containerd[1526]: time="2025-05-13T12:52:02.868996433Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 13 12:52:02.871009 containerd[1526]: time="2025-05-13T12:52:02.870520693Z" level=info msg="CreateContainer within sandbox \"2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 12:52:02.876462 containerd[1526]: time="2025-05-13T12:52:02.876418516Z" level=info msg="Container 7a1b673fb53def9d9c8cee4f32ef706257eef5503a087724902b91e24b6d7471: CDI devices from CRI Config.CDIDevices: []" May 13 12:52:02.884061 containerd[1526]: 
time="2025-05-13T12:52:02.883968770Z" level=info msg="CreateContainer within sandbox \"2ca7d377706f77561b5db94f4c5ab2f83cb08da4990cbb0ebe44d7d005ded987\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"7a1b673fb53def9d9c8cee4f32ef706257eef5503a087724902b91e24b6d7471\"" May 13 12:52:02.884703 containerd[1526]: time="2025-05-13T12:52:02.884538103Z" level=info msg="StartContainer for \"7a1b673fb53def9d9c8cee4f32ef706257eef5503a087724902b91e24b6d7471\"" May 13 12:52:02.885668 containerd[1526]: time="2025-05-13T12:52:02.885638524Z" level=info msg="connecting to shim 7a1b673fb53def9d9c8cee4f32ef706257eef5503a087724902b91e24b6d7471" address="unix:///run/containerd/s/d6a746adb0cd4c0bf02e321af619452bde683c2772f59528b38714bbc5aac8f6" protocol=ttrpc version=3 May 13 12:52:02.904554 systemd[1]: Started cri-containerd-7a1b673fb53def9d9c8cee4f32ef706257eef5503a087724902b91e24b6d7471.scope - libcontainer container 7a1b673fb53def9d9c8cee4f32ef706257eef5503a087724902b91e24b6d7471. May 13 12:52:02.941784 containerd[1526]: time="2025-05-13T12:52:02.941714845Z" level=info msg="StartContainer for \"7a1b673fb53def9d9c8cee4f32ef706257eef5503a087724902b91e24b6d7471\" returns successfully" May 13 12:52:03.175631 systemd-networkd[1444]: calie8baf67aee9: Gained IPv6LL May 13 12:52:03.221543 containerd[1526]: time="2025-05-13T12:52:03.221498752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-fq9tz,Uid:7f0ad085-6ec5-4d45-bfe4-5446ac10196b,Namespace:kube-system,Attempt:0,}" May 13 12:52:03.365530 systemd-networkd[1444]: calida97ecca980: Link UP May 13 12:52:03.365713 systemd-networkd[1444]: calida97ecca980: Gained carrier May 13 12:52:03.367859 systemd-networkd[1444]: cali59e4bbca861: Gained IPv6LL May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.281 [INFO][4540] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0 coredns-6f6b679f8f- kube-system 7f0ad085-6ec5-4d45-bfe4-5446ac10196b 698 0 2025-05-13 12:51:31 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-fq9tz eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calida97ecca980 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" Namespace="kube-system" Pod="coredns-6f6b679f8f-fq9tz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--fq9tz-" May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.281 [INFO][4540] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" Namespace="kube-system" Pod="coredns-6f6b679f8f-fq9tz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0" May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.312 [INFO][4555] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" HandleID="k8s-pod-network.50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" Workload="localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0" May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.327 [INFO][4555] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" HandleID="k8s-pod-network.50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" Workload="localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40005c1250), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-fq9tz", "timestamp":"2025-05-13 12:52:03.31292246 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.327 [INFO][4555] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.328 [INFO][4555] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.328 [INFO][4555] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.330 [INFO][4555] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" host="localhost" May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.334 [INFO][4555] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.339 [INFO][4555] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.341 [INFO][4555] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.343 [INFO][4555] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.344 [INFO][4555] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" host="localhost" May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.345 [INFO][4555] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28 May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.349 [INFO][4555] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" host="localhost" May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.357 [INFO][4555] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" host="localhost" May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.357 [INFO][4555] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" host="localhost" May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.357 [INFO][4555] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 12:52:03.381372 containerd[1526]: 2025-05-13 12:52:03.357 [INFO][4555] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" HandleID="k8s-pod-network.50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" Workload="localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0" May 13 12:52:03.381901 containerd[1526]: 2025-05-13 12:52:03.362 [INFO][4540] cni-plugin/k8s.go 386: Populated endpoint ContainerID="50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" Namespace="kube-system" Pod="coredns-6f6b679f8f-fq9tz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"7f0ad085-6ec5-4d45-bfe4-5446ac10196b", ResourceVersion:"698", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 51, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-fq9tz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calida97ecca980", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:52:03.381901 containerd[1526]: 2025-05-13 12:52:03.362 [INFO][4540] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" Namespace="kube-system" Pod="coredns-6f6b679f8f-fq9tz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0" May 13 12:52:03.381901 containerd[1526]: 2025-05-13 12:52:03.362 [INFO][4540] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calida97ecca980 ContainerID="50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" Namespace="kube-system" Pod="coredns-6f6b679f8f-fq9tz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0" May 13 12:52:03.381901 containerd[1526]: 2025-05-13 12:52:03.367 [INFO][4540] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" Namespace="kube-system" Pod="coredns-6f6b679f8f-fq9tz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0" May 13 12:52:03.381901 containerd[1526]: 2025-05-13 12:52:03.368 
[INFO][4540] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" Namespace="kube-system" Pod="coredns-6f6b679f8f-fq9tz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"7f0ad085-6ec5-4d45-bfe4-5446ac10196b", ResourceVersion:"698", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 51, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28", Pod:"coredns-6f6b679f8f-fq9tz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calida97ecca980", MAC:"82:f6:5d:9a:ac:22", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:52:03.381901 containerd[1526]: 2025-05-13 12:52:03.378 [INFO][4540] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" Namespace="kube-system" Pod="coredns-6f6b679f8f-fq9tz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--fq9tz-eth0" May 13 12:52:03.425586 containerd[1526]: time="2025-05-13T12:52:03.425533338Z" level=info msg="connecting to shim 50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28" address="unix:///run/containerd/s/2b8dcdd8ccf7c033c148b810fbb2c1830853752e4f4663508a429521ec8c3bae" namespace=k8s.io protocol=ttrpc version=3 May 13 12:52:03.464633 systemd[1]: Started cri-containerd-50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28.scope - libcontainer container 50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28. 
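The WorkloadEndpoint dump above lists the CoreDNS ports in Go hex notation: Port:0x35 for the dns and dns-tcp entries and Port:0x23c1 for metrics, i.e. 53/UDP, 53/TCP and 9153/TCP once decoded. A small illustrative stdlib sketch that decodes those values and the recorded veth MAC:

```go
package main

import (
	"fmt"
	"net"
)

func main() {
	// Port values as they appear in the WorkloadEndpoint dump above.
	ports := map[string]uint16{"dns": 0x35, "dns-tcp": 0x35, "metrics": 0x23c1}
	for name, p := range ports {
		fmt.Printf("%-8s -> %d\n", name, p) // 0x35 = 53, 0x23c1 = 9153
	}

	// The MAC recorded for interface calida97ecca980.
	hw, err := net.ParseMAC("82:f6:5d:9a:ac:22")
	if err != nil {
		panic(err)
	}
	// Bit 0x02 of the first octet marks a locally administered address,
	// which is expected for a generated veth MAC.
	fmt.Printf("MAC %s locally administered: %v\n", hw, hw[0]&0x02 != 0)
}
```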
May 13 12:52:03.478050 systemd-resolved[1358]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 12:52:03.511095 containerd[1526]: time="2025-05-13T12:52:03.510897546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-fq9tz,Uid:7f0ad085-6ec5-4d45-bfe4-5446ac10196b,Namespace:kube-system,Attempt:0,} returns sandbox id \"50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28\"" May 13 12:52:03.514697 containerd[1526]: time="2025-05-13T12:52:03.514657481Z" level=info msg="CreateContainer within sandbox \"50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 12:52:03.537350 containerd[1526]: time="2025-05-13T12:52:03.537302020Z" level=info msg="Container 901d9f7d60243188ba4241e67931429c30607fb3ba6d70794a0b7c2c7102af9b: CDI devices from CRI Config.CDIDevices: []" May 13 12:52:03.543416 containerd[1526]: time="2025-05-13T12:52:03.543359279Z" level=info msg="CreateContainer within sandbox \"50cb3ef7e34390fdfa2a2e84d0c752e2864c9279cce183713367fa4fe6bc9f28\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"901d9f7d60243188ba4241e67931429c30607fb3ba6d70794a0b7c2c7102af9b\"" May 13 12:52:03.544285 containerd[1526]: time="2025-05-13T12:52:03.544240358Z" level=info msg="StartContainer for \"901d9f7d60243188ba4241e67931429c30607fb3ba6d70794a0b7c2c7102af9b\"" May 13 12:52:03.545549 containerd[1526]: time="2025-05-13T12:52:03.545493270Z" level=info msg="connecting to shim 901d9f7d60243188ba4241e67931429c30607fb3ba6d70794a0b7c2c7102af9b" address="unix:///run/containerd/s/2b8dcdd8ccf7c033c148b810fbb2c1830853752e4f4663508a429521ec8c3bae" protocol=ttrpc version=3 May 13 12:52:03.561564 systemd[1]: Started cri-containerd-901d9f7d60243188ba4241e67931429c30607fb3ba6d70794a0b7c2c7102af9b.scope - libcontainer container 901d9f7d60243188ba4241e67931429c30607fb3ba6d70794a0b7c2c7102af9b. 
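Both the sandbox and the coredns container above are wired through the same containerd shim: the two "connecting to shim" entries carry the identical unix:///run/containerd/s/2b8dcdd8… address and differ only in container ID. As a hedged sketch (reachability check only, not the ttrpc handshake containerd actually performs), one could strip the unix:// scheme and probe such a shim socket like this:

```go
package main

import (
	"fmt"
	"net"
	"strings"
	"time"
)

// probeShim dials the unix socket behind a containerd shim address such as
// "unix:///run/containerd/s/<hash>". It only checks reachability; a real
// client would speak ttrpc over this connection.
func probeShim(address string) error {
	path := strings.TrimPrefix(address, "unix://")
	conn, err := net.DialTimeout("unix", path, 2*time.Second)
	if err != nil {
		return err
	}
	return conn.Close()
}

func main() {
	// Address copied from the log above; on another machine this path will differ.
	addr := "unix:///run/containerd/s/2b8dcdd8ccf7c033c148b810fbb2c1830853752e4f4663508a429521ec8c3bae"
	if err := probeShim(addr); err != nil {
		fmt.Println("shim not reachable:", err)
		return
	}
	fmt.Println("shim socket is accepting connections")
}
```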
May 13 12:52:03.602318 containerd[1526]: time="2025-05-13T12:52:03.602281691Z" level=info msg="StartContainer for \"901d9f7d60243188ba4241e67931429c30607fb3ba6d70794a0b7c2c7102af9b\" returns successfully" May 13 12:52:03.986423 containerd[1526]: time="2025-05-13T12:52:03.986269636Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:03.988001 containerd[1526]: time="2025-05-13T12:52:03.987963107Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7474935" May 13 12:52:03.989446 containerd[1526]: time="2025-05-13T12:52:03.989323589Z" level=info msg="ImageCreate event name:\"sha256:15faf29e8b518d846c91c15785ff89e783d356ea0f2b22826f47a556ea32645b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:03.992171 containerd[1526]: time="2025-05-13T12:52:03.992139040Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:03.993374 containerd[1526]: time="2025-05-13T12:52:03.993345827Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:15faf29e8b518d846c91c15785ff89e783d356ea0f2b22826f47a556ea32645b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"8844117\" in 1.124058527s" May 13 12:52:03.993456 containerd[1526]: time="2025-05-13T12:52:03.993375110Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:15faf29e8b518d846c91c15785ff89e783d356ea0f2b22826f47a556ea32645b\"" May 13 12:52:03.994201 containerd[1526]: time="2025-05-13T12:52:03.994179861Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 12:52:03.995322 containerd[1526]: time="2025-05-13T12:52:03.995291761Z" level=info msg="CreateContainer within sandbox \"571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 13 12:52:04.002585 containerd[1526]: time="2025-05-13T12:52:04.002548161Z" level=info msg="Container 9092e2b7a7c1d4f6cc4111a28f095130f3aa4bf7e4d3a44b2427b67170ac36bd: CDI devices from CRI Config.CDIDevices: []" May 13 12:52:04.010096 containerd[1526]: time="2025-05-13T12:52:04.010050170Z" level=info msg="CreateContainer within sandbox \"571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"9092e2b7a7c1d4f6cc4111a28f095130f3aa4bf7e4d3a44b2427b67170ac36bd\"" May 13 12:52:04.011258 containerd[1526]: time="2025-05-13T12:52:04.010623739Z" level=info msg="StartContainer for \"9092e2b7a7c1d4f6cc4111a28f095130f3aa4bf7e4d3a44b2427b67170ac36bd\"" May 13 12:52:04.013199 containerd[1526]: time="2025-05-13T12:52:04.013073791Z" level=info msg="connecting to shim 9092e2b7a7c1d4f6cc4111a28f095130f3aa4bf7e4d3a44b2427b67170ac36bd" address="unix:///run/containerd/s/78c0f2f248e9fa5af7dc6f3d29cd20442b2312f9c576253a375d5e230d583096" protocol=ttrpc version=3 May 13 12:52:04.034561 systemd[1]: Started cri-containerd-9092e2b7a7c1d4f6cc4111a28f095130f3aa4bf7e4d3a44b2427b67170ac36bd.scope - libcontainer container 9092e2b7a7c1d4f6cc4111a28f095130f3aa4bf7e4d3a44b2427b67170ac36bd. 
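The csi image pull above reports size "8844117" fetched in 1.124058527s, roughly 7.9 MB/s if the reported size is taken as the bytes transferred (the earlier "bytes read=7474935" line is a separate active-request counter). The duration string is Go's own formatting, so it parses directly with time.ParseDuration; a short stdlib sketch of the throughput arithmetic:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Figures reported for the ghcr.io/flatcar/calico/csi:v3.29.3 pull above.
	const sizeBytes = 8844117
	d, err := time.ParseDuration("1.124058527s")
	if err != nil {
		panic(err)
	}
	// Treating the reported size as bytes transferred gives an approximate rate.
	rate := float64(sizeBytes) / d.Seconds()
	fmt.Printf("~%.1f MB/s (%.1f MiB/s)\n", rate/1e6, rate/(1<<20))
}
```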
May 13 12:52:04.064905 containerd[1526]: time="2025-05-13T12:52:04.064873907Z" level=info msg="StartContainer for \"9092e2b7a7c1d4f6cc4111a28f095130f3aa4bf7e4d3a44b2427b67170ac36bd\" returns successfully" May 13 12:52:04.246055 containerd[1526]: time="2025-05-13T12:52:04.245945235Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:04.246589 containerd[1526]: time="2025-05-13T12:52:04.246554287Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 13 12:52:04.248859 containerd[1526]: time="2025-05-13T12:52:04.248830844Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"41616801\" in 254.62074ms" May 13 12:52:04.248859 containerd[1526]: time="2025-05-13T12:52:04.248861527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\"" May 13 12:52:04.249748 containerd[1526]: time="2025-05-13T12:52:04.249717881Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 13 12:52:04.251704 containerd[1526]: time="2025-05-13T12:52:04.251661369Z" level=info msg="CreateContainer within sandbox \"aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 12:52:04.258449 containerd[1526]: time="2025-05-13T12:52:04.257989156Z" level=info msg="Container c4fea6cc204fe09dcf9bc653425e3bb1dfc689becc8d2679af3f9bc5e3b5339b: CDI devices from CRI Config.CDIDevices: []" May 13 12:52:04.263858 containerd[1526]: time="2025-05-13T12:52:04.263817019Z" level=info msg="CreateContainer within sandbox \"aca21e7cf038262dd51379a01cdea2c0065cec8767f4ddb8234508a5ed40af1f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c4fea6cc204fe09dcf9bc653425e3bb1dfc689becc8d2679af3f9bc5e3b5339b\"" May 13 12:52:04.264448 containerd[1526]: time="2025-05-13T12:52:04.264398149Z" level=info msg="StartContainer for \"c4fea6cc204fe09dcf9bc653425e3bb1dfc689becc8d2679af3f9bc5e3b5339b\"" May 13 12:52:04.265792 containerd[1526]: time="2025-05-13T12:52:04.265525887Z" level=info msg="connecting to shim c4fea6cc204fe09dcf9bc653425e3bb1dfc689becc8d2679af3f9bc5e3b5339b" address="unix:///run/containerd/s/7c51eda4a04ee45ffdb1142dc51c43a5e0473beee8f76e0c651a3177ad044838" protocol=ttrpc version=3 May 13 12:52:04.285596 systemd[1]: Started cri-containerd-c4fea6cc204fe09dcf9bc653425e3bb1dfc689becc8d2679af3f9bc5e3b5339b.scope - libcontainer container c4fea6cc204fe09dcf9bc653425e3bb1dfc689becc8d2679af3f9bc5e3b5339b. 
May 13 12:52:04.319795 containerd[1526]: time="2025-05-13T12:52:04.319748293Z" level=info msg="StartContainer for \"c4fea6cc204fe09dcf9bc653425e3bb1dfc689becc8d2679af3f9bc5e3b5339b\" returns successfully" May 13 12:52:04.400903 kubelet[2631]: I0513 12:52:04.400875 2631 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 12:52:04.409531 kubelet[2631]: I0513 12:52:04.409290 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6f98d9df74-mm9jv" podStartSLOduration=24.756962514 podStartE2EDuration="27.409275309s" podCreationTimestamp="2025-05-13 12:51:37 +0000 UTC" firstStartedPulling="2025-05-13 12:52:01.597267314 +0000 UTC m=+36.473101972" lastFinishedPulling="2025-05-13 12:52:04.249580109 +0000 UTC m=+39.125414767" observedRunningTime="2025-05-13 12:52:04.408831511 +0000 UTC m=+39.284666209" watchObservedRunningTime="2025-05-13 12:52:04.409275309 +0000 UTC m=+39.285109927" May 13 12:52:04.410152 kubelet[2631]: I0513 12:52:04.410002 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6f98d9df74-4dtwn" podStartSLOduration=24.985558775 podStartE2EDuration="27.409989771s" podCreationTimestamp="2025-05-13 12:51:37 +0000 UTC" firstStartedPulling="2025-05-13 12:52:00.444187722 +0000 UTC m=+35.320022340" lastFinishedPulling="2025-05-13 12:52:02.868618718 +0000 UTC m=+37.744453336" observedRunningTime="2025-05-13 12:52:03.403784279 +0000 UTC m=+38.279618977" watchObservedRunningTime="2025-05-13 12:52:04.409989771 +0000 UTC m=+39.285824469" May 13 12:52:04.422748 kubelet[2631]: I0513 12:52:04.422682 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-fq9tz" podStartSLOduration=33.422147861 podStartE2EDuration="33.422147861s" podCreationTimestamp="2025-05-13 12:51:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:52:04.421986488 +0000 UTC m=+39.297821146" watchObservedRunningTime="2025-05-13 12:52:04.422147861 +0000 UTC m=+39.297982519" May 13 12:52:05.159693 systemd-networkd[1444]: calida97ecca980: Gained IPv6LL May 13 12:52:05.676541 containerd[1526]: time="2025-05-13T12:52:05.676487175Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:05.677374 containerd[1526]: time="2025-05-13T12:52:05.677332166Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13124299" May 13 12:52:05.678495 containerd[1526]: time="2025-05-13T12:52:05.678465901Z" level=info msg="ImageCreate event name:\"sha256:a91b1f00752edc175f270a01b33683fa80818734aa2274388785eaf3364315dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:05.680961 containerd[1526]: time="2025-05-13T12:52:05.680880664Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:52:05.681582 containerd[1526]: time="2025-05-13T12:52:05.681551560Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:a91b1f00752edc175f270a01b33683fa80818734aa2274388785eaf3364315dc\", repo tag 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"14493433\" in 1.431791155s" May 13 12:52:05.681788 containerd[1526]: time="2025-05-13T12:52:05.681682011Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:a91b1f00752edc175f270a01b33683fa80818734aa2274388785eaf3364315dc\"" May 13 12:52:05.683845 containerd[1526]: time="2025-05-13T12:52:05.683799469Z" level=info msg="CreateContainer within sandbox \"571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 13 12:52:05.702821 containerd[1526]: time="2025-05-13T12:52:05.701719651Z" level=info msg="Container 2f3353c10acc5d220831493dba76598db621d8c9dd0caf3055fd8ebb6b1e6935: CDI devices from CRI Config.CDIDevices: []" May 13 12:52:05.710183 containerd[1526]: time="2025-05-13T12:52:05.710113676Z" level=info msg="CreateContainer within sandbox \"571e1523d3f0cfe1cea9bbb6d508daabae7b379ac676a88b6825aa5fb39fbd0d\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"2f3353c10acc5d220831493dba76598db621d8c9dd0caf3055fd8ebb6b1e6935\"" May 13 12:52:05.711092 containerd[1526]: time="2025-05-13T12:52:05.711050394Z" level=info msg="StartContainer for \"2f3353c10acc5d220831493dba76598db621d8c9dd0caf3055fd8ebb6b1e6935\"" May 13 12:52:05.712882 containerd[1526]: time="2025-05-13T12:52:05.712853505Z" level=info msg="connecting to shim 2f3353c10acc5d220831493dba76598db621d8c9dd0caf3055fd8ebb6b1e6935" address="unix:///run/containerd/s/78c0f2f248e9fa5af7dc6f3d29cd20442b2312f9c576253a375d5e230d583096" protocol=ttrpc version=3 May 13 12:52:05.737581 systemd[1]: Started cri-containerd-2f3353c10acc5d220831493dba76598db621d8c9dd0caf3055fd8ebb6b1e6935.scope - libcontainer container 2f3353c10acc5d220831493dba76598db621d8c9dd0caf3055fd8ebb6b1e6935. 
May 13 12:52:05.798399 containerd[1526]: time="2025-05-13T12:52:05.797698181Z" level=info msg="StartContainer for \"2f3353c10acc5d220831493dba76598db621d8c9dd0caf3055fd8ebb6b1e6935\" returns successfully" May 13 12:52:06.186078 kubelet[2631]: I0513 12:52:06.186038 2631 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 12:52:06.296269 kubelet[2631]: I0513 12:52:06.296196 2631 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 13 12:52:06.298491 kubelet[2631]: I0513 12:52:06.298474 2631 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 13 12:52:06.380563 kubelet[2631]: I0513 12:52:06.380529 2631 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 12:52:06.418325 kubelet[2631]: I0513 12:52:06.418256 2631 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-grt52" podStartSLOduration=24.263020527 podStartE2EDuration="28.41824139s" podCreationTimestamp="2025-05-13 12:51:38 +0000 UTC" firstStartedPulling="2025-05-13 12:52:01.527247334 +0000 UTC m=+36.403081992" lastFinishedPulling="2025-05-13 12:52:05.682468157 +0000 UTC m=+40.558302855" observedRunningTime="2025-05-13 12:52:06.418204387 +0000 UTC m=+41.294039045" watchObservedRunningTime="2025-05-13 12:52:06.41824139 +0000 UTC m=+41.294076048" May 13 12:52:06.475691 containerd[1526]: time="2025-05-13T12:52:06.475339963Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0f2c7eabb891183ac25bb31bda3240ae16fd4435964e9ba475895a73d0a209ff\" id:\"4b7a6d479007a6a84dc926808e28e3a7993366b29f1260baf9a62aac8ddad844\" pid:4798 exited_at:{seconds:1747140726 nanos:474972253}" May 13 12:52:06.533767 containerd[1526]: time="2025-05-13T12:52:06.533563027Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0f2c7eabb891183ac25bb31bda3240ae16fd4435964e9ba475895a73d0a209ff\" id:\"3fd8d852076f2357fdeea5049a7720504caf06577f059ba182a5fe3907a91f8b\" pid:4822 exited_at:{seconds:1747140726 nanos:533279964}" May 13 12:52:06.645638 systemd[1]: Started sshd@9-10.0.0.111:22-10.0.0.1:43558.service - OpenSSH per-connection server daemon (10.0.0.1:43558). May 13 12:52:06.695118 sshd[4836]: Accepted publickey for core from 10.0.0.1 port 43558 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:52:06.696720 sshd-session[4836]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:52:06.700633 systemd-logind[1509]: New session 10 of user core. May 13 12:52:06.708572 systemd[1]: Started session-10.scope - Session 10 of User core. May 13 12:52:06.882012 sshd[4838]: Connection closed by 10.0.0.1 port 43558 May 13 12:52:06.882658 sshd-session[4836]: pam_unix(sshd:session): session closed for user core May 13 12:52:06.895384 systemd[1]: sshd@9-10.0.0.111:22-10.0.0.1:43558.service: Deactivated successfully. May 13 12:52:06.897556 systemd[1]: session-10.scope: Deactivated successfully. May 13 12:52:06.898277 systemd-logind[1509]: Session 10 logged out. Waiting for processes to exit. May 13 12:52:06.901182 systemd[1]: Started sshd@10-10.0.0.111:22-10.0.0.1:43566.service - OpenSSH per-connection server daemon (10.0.0.1:43566). May 13 12:52:06.901759 systemd-logind[1509]: Removed session 10. 
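The TaskExit events above record exited_at as raw epoch seconds plus nanoseconds (for example seconds:1747140726 nanos:474972253) rather than the formatted timestamps used elsewhere in the log. A stdlib sketch that turns such a pair back into the wall-clock time seen in the surrounding entries:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// exited_at pair from the TaskExit event above.
	const secs, nanos = 1747140726, 474972253
	t := time.Unix(secs, nanos).UTC()
	fmt.Println(t.Format("2006-01-02 15:04:05.000000000 MST"))
	// Prints 2025-05-13 12:52:06.474972253 UTC, matching the 12:52:06 entries around it.
}
```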
May 13 12:52:06.953560 sshd[4855]: Accepted publickey for core from 10.0.0.1 port 43566 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:52:06.954979 sshd-session[4855]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:52:06.962193 systemd-logind[1509]: New session 11 of user core. May 13 12:52:06.966584 systemd[1]: Started session-11.scope - Session 11 of User core. May 13 12:52:07.170832 sshd[4860]: Connection closed by 10.0.0.1 port 43566 May 13 12:52:07.177700 sshd-session[4855]: pam_unix(sshd:session): session closed for user core May 13 12:52:07.197140 systemd[1]: sshd@10-10.0.0.111:22-10.0.0.1:43566.service: Deactivated successfully. May 13 12:52:07.200043 systemd[1]: session-11.scope: Deactivated successfully. May 13 12:52:07.200992 systemd-logind[1509]: Session 11 logged out. Waiting for processes to exit. May 13 12:52:07.208869 systemd[1]: Started sshd@11-10.0.0.111:22-10.0.0.1:43568.service - OpenSSH per-connection server daemon (10.0.0.1:43568). May 13 12:52:07.209786 systemd-logind[1509]: Removed session 11. May 13 12:52:07.264212 sshd[4873]: Accepted publickey for core from 10.0.0.1 port 43568 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:52:07.265361 sshd-session[4873]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:52:07.269365 systemd-logind[1509]: New session 12 of user core. May 13 12:52:07.278560 systemd[1]: Started session-12.scope - Session 12 of User core. May 13 12:52:07.419475 sshd[4875]: Connection closed by 10.0.0.1 port 43568 May 13 12:52:07.419863 sshd-session[4873]: pam_unix(sshd:session): session closed for user core May 13 12:52:07.423471 systemd[1]: sshd@11-10.0.0.111:22-10.0.0.1:43568.service: Deactivated successfully. May 13 12:52:07.425224 systemd[1]: session-12.scope: Deactivated successfully. May 13 12:52:07.425981 systemd-logind[1509]: Session 12 logged out. Waiting for processes to exit. May 13 12:52:07.427426 systemd-logind[1509]: Removed session 12. May 13 12:52:12.446142 systemd[1]: Started sshd@12-10.0.0.111:22-10.0.0.1:43582.service - OpenSSH per-connection server daemon (10.0.0.1:43582). May 13 12:52:12.507770 sshd[4893]: Accepted publickey for core from 10.0.0.1 port 43582 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:52:12.509342 sshd-session[4893]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:52:12.513550 systemd-logind[1509]: New session 13 of user core. May 13 12:52:12.521732 systemd[1]: Started session-13.scope - Session 13 of User core. May 13 12:52:12.696764 sshd[4895]: Connection closed by 10.0.0.1 port 43582 May 13 12:52:12.697073 sshd-session[4893]: pam_unix(sshd:session): session closed for user core May 13 12:52:12.711617 systemd[1]: sshd@12-10.0.0.111:22-10.0.0.1:43582.service: Deactivated successfully. May 13 12:52:12.713499 systemd[1]: session-13.scope: Deactivated successfully. May 13 12:52:12.714235 systemd-logind[1509]: Session 13 logged out. Waiting for processes to exit. May 13 12:52:12.718178 systemd[1]: Started sshd@13-10.0.0.111:22-10.0.0.1:58978.service - OpenSSH per-connection server daemon (10.0.0.1:58978). May 13 12:52:12.719728 systemd-logind[1509]: Removed session 13. 
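Each accepted connection above logs the client key as "SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk", which is OpenSSH's fingerprint format: the unpadded base64 of the SHA-256 digest of the raw public-key blob. A minimal, illustrative stdlib sketch that reproduces that format from an authorized_keys-style line; the key string below is a placeholder, not the key behind the fingerprint in the log:

```go
package main

import (
	"crypto/sha256"
	"encoding/base64"
	"fmt"
	"strings"
)

// fingerprint returns the OpenSSH-style SHA256 fingerprint of the key blob in
// an "ssh-ed25519 AAAA... comment" line: base64(sha256(blob)) without padding.
func fingerprint(authorizedKey string) (string, error) {
	fields := strings.Fields(authorizedKey)
	if len(fields) < 2 {
		return "", fmt.Errorf("malformed key line")
	}
	blob, err := base64.StdEncoding.DecodeString(fields[1])
	if err != nil {
		return "", err
	}
	sum := sha256.Sum256(blob)
	return "SHA256:" + base64.RawStdEncoding.EncodeToString(sum[:]), nil
}

func main() {
	// Placeholder blob (all-zero key material); substitute a real authorized_keys entry.
	blob := "AAAAC3NzaC1lZDI1NTE5AAAAI" + strings.Repeat("A", 43)
	key := "ssh-ed25519 " + blob + " core@placeholder"
	fp, err := fingerprint(key)
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println(fp)
}
```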
May 13 12:52:12.772483 sshd[4909]: Accepted publickey for core from 10.0.0.1 port 58978 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:52:12.774128 sshd-session[4909]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:52:12.778789 systemd-logind[1509]: New session 14 of user core. May 13 12:52:12.788574 systemd[1]: Started session-14.scope - Session 14 of User core. May 13 12:52:13.068721 sshd[4911]: Connection closed by 10.0.0.1 port 58978 May 13 12:52:13.069277 sshd-session[4909]: pam_unix(sshd:session): session closed for user core May 13 12:52:13.078112 systemd[1]: sshd@13-10.0.0.111:22-10.0.0.1:58978.service: Deactivated successfully. May 13 12:52:13.081438 systemd[1]: session-14.scope: Deactivated successfully. May 13 12:52:13.082312 systemd-logind[1509]: Session 14 logged out. Waiting for processes to exit. May 13 12:52:13.085927 systemd[1]: Started sshd@14-10.0.0.111:22-10.0.0.1:58982.service - OpenSSH per-connection server daemon (10.0.0.1:58982). May 13 12:52:13.087021 systemd-logind[1509]: Removed session 14. May 13 12:52:13.144321 sshd[4922]: Accepted publickey for core from 10.0.0.1 port 58982 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:52:13.145727 sshd-session[4922]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:52:13.149985 systemd-logind[1509]: New session 15 of user core. May 13 12:52:13.163563 systemd[1]: Started session-15.scope - Session 15 of User core. May 13 12:52:13.186272 kubelet[2631]: I0513 12:52:13.186225 2631 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 12:52:13.225137 containerd[1526]: time="2025-05-13T12:52:13.225058468Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c354e26a6afe38e295ac26f7d04dd262191b4d774f3565c0fbe72424094a4be7\" id:\"52333330fd72c4991446f3a9ee52f075eb285e0a2f423f3b77609398a94d35b2\" pid:4937 exited_at:{seconds:1747140733 nanos:218835802}" May 13 12:52:13.265838 containerd[1526]: time="2025-05-13T12:52:13.265802218Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c354e26a6afe38e295ac26f7d04dd262191b4d774f3565c0fbe72424094a4be7\" id:\"df27d21446e5625fe5bce136833767f4a5a29d7dc85c3b43cf7f730d44726191\" pid:4967 exited_at:{seconds:1747140733 nanos:265260861}" May 13 12:52:14.729973 sshd[4924]: Connection closed by 10.0.0.1 port 58982 May 13 12:52:14.731107 sshd-session[4922]: pam_unix(sshd:session): session closed for user core May 13 12:52:14.741814 systemd[1]: sshd@14-10.0.0.111:22-10.0.0.1:58982.service: Deactivated successfully. May 13 12:52:14.743600 systemd[1]: session-15.scope: Deactivated successfully. May 13 12:52:14.743781 systemd[1]: session-15.scope: Consumed 508ms CPU time, 70.8M memory peak. May 13 12:52:14.744882 systemd-logind[1509]: Session 15 logged out. Waiting for processes to exit. May 13 12:52:14.748146 systemd[1]: Started sshd@15-10.0.0.111:22-10.0.0.1:58988.service - OpenSSH per-connection server daemon (10.0.0.1:58988). May 13 12:52:14.751946 systemd-logind[1509]: Removed session 15. May 13 12:52:14.813196 sshd[4999]: Accepted publickey for core from 10.0.0.1 port 58988 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:52:14.815342 sshd-session[4999]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:52:14.819548 systemd-logind[1509]: New session 16 of user core. May 13 12:52:14.830607 systemd[1]: Started session-16.scope - Session 16 of User core. 
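When a session scope is torn down, systemd reports its resource accounting, as in "session-15.scope: Consumed 508ms CPU time, 70.8M memory peak." above. A hedged stdlib sketch that pulls those figures out of such a line with a regular expression; the pattern is an assumption tailored to this exact wording, not a stable systemd interface:

```go
package main

import (
	"fmt"
	"regexp"
)

// accounting matches the "Consumed ... CPU time, ... memory peak" wording seen above.
var accounting = regexp.MustCompile(`(\S+\.scope): Consumed (\S+) CPU time, (\S+) memory peak`)

func main() {
	line := "systemd[1]: session-15.scope: Consumed 508ms CPU time, 70.8M memory peak."
	m := accounting.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("no accounting info found")
		return
	}
	fmt.Printf("unit=%s cpu=%s mem=%s\n", m[1], m[2], m[3])
}
```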
May 13 12:52:15.130160 sshd[5001]: Connection closed by 10.0.0.1 port 58988 May 13 12:52:15.131112 sshd-session[4999]: pam_unix(sshd:session): session closed for user core May 13 12:52:15.142133 systemd[1]: sshd@15-10.0.0.111:22-10.0.0.1:58988.service: Deactivated successfully. May 13 12:52:15.144966 systemd[1]: session-16.scope: Deactivated successfully. May 13 12:52:15.147376 systemd-logind[1509]: Session 16 logged out. Waiting for processes to exit. May 13 12:52:15.149271 systemd[1]: Started sshd@16-10.0.0.111:22-10.0.0.1:59000.service - OpenSSH per-connection server daemon (10.0.0.1:59000). May 13 12:52:15.150899 systemd-logind[1509]: Removed session 16. May 13 12:52:15.203962 sshd[5012]: Accepted publickey for core from 10.0.0.1 port 59000 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:52:15.205131 sshd-session[5012]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:52:15.208909 systemd-logind[1509]: New session 17 of user core. May 13 12:52:15.223591 systemd[1]: Started session-17.scope - Session 17 of User core. May 13 12:52:15.349027 sshd[5014]: Connection closed by 10.0.0.1 port 59000 May 13 12:52:15.349530 sshd-session[5012]: pam_unix(sshd:session): session closed for user core May 13 12:52:15.353544 systemd-logind[1509]: Session 17 logged out. Waiting for processes to exit. May 13 12:52:15.353618 systemd[1]: sshd@16-10.0.0.111:22-10.0.0.1:59000.service: Deactivated successfully. May 13 12:52:15.356202 systemd[1]: session-17.scope: Deactivated successfully. May 13 12:52:15.358433 systemd-logind[1509]: Removed session 17. May 13 12:52:20.363051 systemd[1]: Started sshd@17-10.0.0.111:22-10.0.0.1:59014.service - OpenSSH per-connection server daemon (10.0.0.1:59014). May 13 12:52:20.434604 sshd[5030]: Accepted publickey for core from 10.0.0.1 port 59014 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:52:20.435882 sshd-session[5030]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:52:20.440525 systemd-logind[1509]: New session 18 of user core. May 13 12:52:20.450589 systemd[1]: Started session-18.scope - Session 18 of User core. May 13 12:52:20.573797 sshd[5032]: Connection closed by 10.0.0.1 port 59014 May 13 12:52:20.574328 sshd-session[5030]: pam_unix(sshd:session): session closed for user core May 13 12:52:20.577702 systemd-logind[1509]: Session 18 logged out. Waiting for processes to exit. May 13 12:52:20.577844 systemd[1]: sshd@17-10.0.0.111:22-10.0.0.1:59014.service: Deactivated successfully. May 13 12:52:20.579654 systemd[1]: session-18.scope: Deactivated successfully. May 13 12:52:20.581340 systemd-logind[1509]: Removed session 18. May 13 12:52:25.586619 systemd[1]: Started sshd@18-10.0.0.111:22-10.0.0.1:43590.service - OpenSSH per-connection server daemon (10.0.0.1:43590). May 13 12:52:25.635494 sshd[5050]: Accepted publickey for core from 10.0.0.1 port 43590 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:52:25.636741 sshd-session[5050]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:52:25.641109 systemd-logind[1509]: New session 19 of user core. May 13 12:52:25.650620 systemd[1]: Started session-19.scope - Session 19 of User core. 
May 13 12:52:25.761801 sshd[5052]: Connection closed by 10.0.0.1 port 43590 May 13 12:52:25.762141 sshd-session[5050]: pam_unix(sshd:session): session closed for user core May 13 12:52:25.764978 systemd[1]: sshd@18-10.0.0.111:22-10.0.0.1:43590.service: Deactivated successfully. May 13 12:52:25.766789 systemd[1]: session-19.scope: Deactivated successfully. May 13 12:52:25.770072 systemd-logind[1509]: Session 19 logged out. Waiting for processes to exit. May 13 12:52:25.771015 systemd-logind[1509]: Removed session 19. May 13 12:52:30.782342 systemd[1]: Started sshd@19-10.0.0.111:22-10.0.0.1:43606.service - OpenSSH per-connection server daemon (10.0.0.1:43606). May 13 12:52:30.858486 sshd[5067]: Accepted publickey for core from 10.0.0.1 port 43606 ssh2: RSA SHA256:HV7SwMkgpUcGbG5PTBCNGAhaEvexdMAt2yN/TIbGAFk May 13 12:52:30.860028 sshd-session[5067]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:52:30.865639 systemd-logind[1509]: New session 20 of user core. May 13 12:52:30.874535 systemd[1]: Started session-20.scope - Session 20 of User core. May 13 12:52:31.016213 sshd[5069]: Connection closed by 10.0.0.1 port 43606 May 13 12:52:31.016540 sshd-session[5067]: pam_unix(sshd:session): session closed for user core May 13 12:52:31.020817 systemd[1]: sshd@19-10.0.0.111:22-10.0.0.1:43606.service: Deactivated successfully. May 13 12:52:31.023009 systemd[1]: session-20.scope: Deactivated successfully. May 13 12:52:31.024438 systemd-logind[1509]: Session 20 logged out. Waiting for processes to exit. May 13 12:52:31.026380 systemd-logind[1509]: Removed session 20.