May 13 12:57:34.833897 kernel: Linux version 6.12.28-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 13 11:28:50 -00 2025
May 13 12:57:34.833917 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=7099d7ee582d4f3e6d25a3763207cfa25fb4eb117c83034e2c517b959b8370a1
May 13 12:57:34.833928 kernel: BIOS-provided physical RAM map:
May 13 12:57:34.833935 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
May 13 12:57:34.833941 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
May 13 12:57:34.833948 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
May 13 12:57:34.833955 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
May 13 12:57:34.833962 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
May 13 12:57:34.833970 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
May 13 12:57:34.833977 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
May 13 12:57:34.833983 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009bd3efff] usable
May 13 12:57:34.833990 kernel: BIOS-e820: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
May 13 12:57:34.833996 kernel: BIOS-e820: [mem 0x000000009be00000-0x000000009c8ecfff] usable
May 13 12:57:34.834003 kernel: BIOS-e820: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
May 13 12:57:34.834013 kernel: BIOS-e820: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
May 13 12:57:34.834020 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
May 13 12:57:34.834027 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009ce90fff] usable
May 13 12:57:34.834034 kernel: BIOS-e820: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
May 13 12:57:34.834041 kernel: BIOS-e820: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
May 13 12:57:34.834048 kernel: BIOS-e820: [mem 0x000000009ce97000-0x000000009cedbfff] usable
May 13 12:57:34.834055 kernel: BIOS-e820: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
May 13 12:57:34.834061 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
May 13 12:57:34.834068 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
May 13 12:57:34.834075 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 13 12:57:34.834082 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
May 13 12:57:34.834091 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 13 12:57:34.834098 kernel: NX (Execute Disable) protection: active
May 13 12:57:34.834105 kernel: APIC: Static calls initialized
May 13 12:57:34.834112 kernel: e820: update [mem 0x9b320018-0x9b329c57] usable ==> usable
May 13 12:57:34.834119 kernel: e820: update [mem 0x9b2e3018-0x9b31fe57] usable ==> usable
May 13 12:57:34.834126 kernel: extended physical RAM map:
May 13 12:57:34.834133 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
May 13 12:57:34.834140 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
May 13 12:57:34.834147 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
May 13 12:57:34.834154 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
May 13 12:57:34.834161 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
May 13 12:57:34.834170 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
May 13 12:57:34.834177 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
May 13 12:57:34.834184 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b2e3017] usable
May 13 12:57:34.834191 kernel: reserve setup_data: [mem 0x000000009b2e3018-0x000000009b31fe57] usable
May 13 12:57:34.834201 kernel: reserve setup_data: [mem 0x000000009b31fe58-0x000000009b320017] usable
May 13 12:57:34.834208 kernel: reserve setup_data: [mem 0x000000009b320018-0x000000009b329c57] usable
May 13 12:57:34.834217 kernel: reserve setup_data: [mem 0x000000009b329c58-0x000000009bd3efff] usable
May 13 12:57:34.834224 kernel: reserve setup_data: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
May 13 12:57:34.834231 kernel: reserve setup_data: [mem 0x000000009be00000-0x000000009c8ecfff] usable
May 13 12:57:34.834238 kernel: reserve setup_data: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
May 13 12:57:34.834246 kernel: reserve setup_data: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
May 13 12:57:34.834253 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
May 13 12:57:34.834260 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009ce90fff] usable
May 13 12:57:34.834267 kernel: reserve setup_data: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
May 13 12:57:34.834274 kernel: reserve setup_data: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
May 13 12:57:34.834284 kernel: reserve setup_data: [mem 0x000000009ce97000-0x000000009cedbfff] usable
May 13 12:57:34.834291 kernel: reserve setup_data: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
May 13 12:57:34.834298 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
May 13 12:57:34.834305 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
May 13 12:57:34.834312 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 13 12:57:34.834319 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
May 13 12:57:34.834327 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 13 12:57:34.834334 kernel: efi: EFI v2.7 by EDK II
May 13 12:57:34.834341 kernel: efi: SMBIOS=0x9c988000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9b9e4198 RNG=0x9cb73018
May 13 12:57:34.834348 kernel: random: crng init done
May 13 12:57:34.834356 kernel: efi: Remove mem151: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
May 13 12:57:34.834363 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
May 13 12:57:34.834372 kernel: secureboot: Secure boot disabled
May 13 12:57:34.834379 kernel: SMBIOS 2.8 present.
May 13 12:57:34.834386 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
May 13 12:57:34.834394 kernel: DMI: Memory slots populated: 1/1
May 13 12:57:34.834401 kernel: Hypervisor detected: KVM
May 13 12:57:34.834408 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
May 13 12:57:34.834415 kernel: kvm-clock: using sched offset of 3609739839 cycles
May 13 12:57:34.834423 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
May 13 12:57:34.834430 kernel: tsc: Detected 2794.748 MHz processor
May 13 12:57:34.834438 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 13 12:57:34.834445 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 13 12:57:34.834455 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x400000000
May 13 12:57:34.834462 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
May 13 12:57:34.834470 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 13 12:57:34.834477 kernel: Using GB pages for direct mapping
May 13 12:57:34.834484 kernel: ACPI: Early table checksum verification disabled
May 13 12:57:34.834509 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS )
May 13 12:57:34.834517 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
May 13 12:57:34.834524 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 13 12:57:34.834532 kernel: ACPI: DSDT 0x000000009CB7A000 0021A8 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 12:57:34.834541 kernel: ACPI: FACS 0x000000009CBDD000 000040
May 13 12:57:34.834549 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 12:57:34.834562 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 12:57:34.834570 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 12:57:34.834577 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 12:57:34.834584 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013)
May 13 12:57:34.834592 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3]
May 13 12:57:34.834600 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cb7a000-0x9cb7c1a7]
May 13 12:57:34.834610 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f]
May 13 12:57:34.834618 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f]
May 13 12:57:34.834625 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037]
May 13 12:57:34.834632 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b]
May 13 12:57:34.834639 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027]
May 13 12:57:34.834647 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037]
May 13 12:57:34.834654 kernel: No NUMA configuration found
May 13 12:57:34.834661 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cedbfff]
May 13 12:57:34.834669 kernel: NODE_DATA(0) allocated [mem 0x9ce36dc0-0x9ce3dfff]
May 13 12:57:34.834676 kernel: Zone ranges:
May 13 12:57:34.834686 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 13 12:57:34.834693 kernel: DMA32 [mem 0x0000000001000000-0x000000009cedbfff]
May 13 12:57:34.834700 kernel: Normal empty
May 13 12:57:34.834707 kernel: Device empty
May 13 12:57:34.834715 kernel: Movable zone start for each node
May 13 12:57:34.834722 kernel: Early memory node ranges
May 13 12:57:34.834729 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
May 13 12:57:34.834736 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
May 13 12:57:34.834744 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
May 13 12:57:34.834753 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
May 13 12:57:34.834760 kernel: node 0: [mem 0x0000000000900000-0x000000009bd3efff]
May 13 12:57:34.834768 kernel: node 0: [mem 0x000000009be00000-0x000000009c8ecfff]
May 13 12:57:34.834775 kernel: node 0: [mem 0x000000009cbff000-0x000000009ce90fff]
May 13 12:57:34.834782 kernel: node 0: [mem 0x000000009ce97000-0x000000009cedbfff]
May 13 12:57:34.834790 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cedbfff]
May 13 12:57:34.834797 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 13 12:57:34.834804 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
May 13 12:57:34.834820 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
May 13 12:57:34.834828 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 13 12:57:34.834835 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
May 13 12:57:34.834843 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
May 13 12:57:34.834853 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
May 13 12:57:34.834860 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
May 13 12:57:34.834868 kernel: On node 0, zone DMA32: 12580 pages in unavailable ranges
May 13 12:57:34.834876 kernel: ACPI: PM-Timer IO Port: 0x608
May 13 12:57:34.834883 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
May 13 12:57:34.834893 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
May 13 12:57:34.834901 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
May 13 12:57:34.834908 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
May 13 12:57:34.834916 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
May 13 12:57:34.834924 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
May 13 12:57:34.834932 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
May 13 12:57:34.834939 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 13 12:57:34.834947 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
May 13 12:57:34.834954 kernel: TSC deadline timer available
May 13 12:57:34.834962 kernel: CPU topo: Max. logical packages: 1
May 13 12:57:34.834972 kernel: CPU topo: Max. logical dies: 1
May 13 12:57:34.834979 kernel: CPU topo: Max. dies per package: 1
May 13 12:57:34.834987 kernel: CPU topo: Max. threads per core: 1
May 13 12:57:34.834994 kernel: CPU topo: Num. cores per package: 4
May 13 12:57:34.835002 kernel: CPU topo: Num. threads per package: 4
May 13 12:57:34.835009 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
May 13 12:57:34.835017 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
May 13 12:57:34.835024 kernel: kvm-guest: KVM setup pv remote TLB flush
May 13 12:57:34.835032 kernel: kvm-guest: setup PV sched yield
May 13 12:57:34.835042 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices
May 13 12:57:34.835049 kernel: Booting paravirtualized kernel on KVM
May 13 12:57:34.835057 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 13 12:57:34.835065 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
May 13 12:57:34.835073 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288
May 13 12:57:34.835080 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152
May 13 12:57:34.835088 kernel: pcpu-alloc: [0] 0 1 2 3
May 13 12:57:34.835095 kernel: kvm-guest: PV spinlocks enabled
May 13 12:57:34.835103 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
May 13 12:57:34.835114 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=7099d7ee582d4f3e6d25a3763207cfa25fb4eb117c83034e2c517b959b8370a1
May 13 12:57:34.835122 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 13 12:57:34.835130 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 13 12:57:34.835138 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 13 12:57:34.835145 kernel: Fallback order for Node 0: 0
May 13 12:57:34.835153 kernel: Built 1 zonelists, mobility grouping on. Total pages: 641450
May 13 12:57:34.835160 kernel: Policy zone: DMA32
May 13 12:57:34.835168 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 13 12:57:34.835178 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
May 13 12:57:34.835186 kernel: ftrace: allocating 40071 entries in 157 pages
May 13 12:57:34.835193 kernel: ftrace: allocated 157 pages with 5 groups
May 13 12:57:34.835201 kernel: Dynamic Preempt: voluntary
May 13 12:57:34.835208 kernel: rcu: Preemptible hierarchical RCU implementation.
May 13 12:57:34.835217 kernel: rcu: RCU event tracing is enabled.
May 13 12:57:34.835225 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
May 13 12:57:34.835233 kernel: Trampoline variant of Tasks RCU enabled.
May 13 12:57:34.835250 kernel: Rude variant of Tasks RCU enabled.
May 13 12:57:34.835261 kernel: Tracing variant of Tasks RCU enabled.
May 13 12:57:34.835277 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 13 12:57:34.835285 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
May 13 12:57:34.835300 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 13 12:57:34.835316 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 13 12:57:34.835324 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 13 12:57:34.835332 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
May 13 12:57:34.835339 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 13 12:57:34.835347 kernel: Console: colour dummy device 80x25
May 13 12:57:34.835364 kernel: printk: legacy console [ttyS0] enabled
May 13 12:57:34.835373 kernel: ACPI: Core revision 20240827
May 13 12:57:34.835394 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
May 13 12:57:34.835410 kernel: APIC: Switch to symmetric I/O mode setup
May 13 12:57:34.835418 kernel: x2apic enabled
May 13 12:57:34.835426 kernel: APIC: Switched APIC routing to: physical x2apic
May 13 12:57:34.835433 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
May 13 12:57:34.835441 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
May 13 12:57:34.835449 kernel: kvm-guest: setup PV IPIs
May 13 12:57:34.835459 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
May 13 12:57:34.835467 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns
May 13 12:57:34.835475 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794748)
May 13 12:57:34.835483 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
May 13 12:57:34.835502 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
May 13 12:57:34.835510 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
May 13 12:57:34.835518 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 13 12:57:34.835526 kernel: Spectre V2 : Mitigation: Retpolines
May 13 12:57:34.835534 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
May 13 12:57:34.835544 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
May 13 12:57:34.835552 kernel: RETBleed: Mitigation: untrained return thunk
May 13 12:57:34.835565 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
May 13 12:57:34.835573 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
May 13 12:57:34.835581 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
May 13 12:57:34.835589 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
May 13 12:57:34.835597 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
May 13 12:57:34.835605 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
May 13 12:57:34.835615 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
May 13 12:57:34.835623 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
May 13 12:57:34.835630 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
May 13 12:57:34.835638 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
May 13 12:57:34.835646 kernel: Freeing SMP alternatives memory: 32K
May 13 12:57:34.835653 kernel: pid_max: default: 32768 minimum: 301
May 13 12:57:34.835661 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
May 13 12:57:34.835669 kernel: landlock: Up and running.
May 13 12:57:34.835676 kernel: SELinux: Initializing.
May 13 12:57:34.835686 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 13 12:57:34.835694 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 13 12:57:34.835702 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
May 13 12:57:34.835709 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
May 13 12:57:34.835717 kernel: ... version: 0
May 13 12:57:34.835725 kernel: ... bit width: 48
May 13 12:57:34.835732 kernel: ... generic registers: 6
May 13 12:57:34.835740 kernel: ... value mask: 0000ffffffffffff
May 13 12:57:34.835747 kernel: ... max period: 00007fffffffffff
May 13 12:57:34.835757 kernel: ... fixed-purpose events: 0
May 13 12:57:34.835764 kernel: ... event mask: 000000000000003f
May 13 12:57:34.835772 kernel: signal: max sigframe size: 1776
May 13 12:57:34.835779 kernel: rcu: Hierarchical SRCU implementation.
May 13 12:57:34.835787 kernel: rcu: Max phase no-delay instances is 400.
May 13 12:57:34.835795 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
May 13 12:57:34.835803 kernel: smp: Bringing up secondary CPUs ...
May 13 12:57:34.835810 kernel: smpboot: x86: Booting SMP configuration:
May 13 12:57:34.835818 kernel: .... node #0, CPUs: #1 #2 #3
May 13 12:57:34.835826 kernel: smp: Brought up 1 node, 4 CPUs
May 13 12:57:34.835835 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS)
May 13 12:57:34.835843 kernel: Memory: 2422664K/2565800K available (14336K kernel code, 2430K rwdata, 9948K rodata, 54420K init, 2548K bss, 137200K reserved, 0K cma-reserved)
May 13 12:57:34.835851 kernel: devtmpfs: initialized
May 13 12:57:34.835858 kernel: x86/mm: Memory block size: 128MB
May 13 12:57:34.835866 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
May 13 12:57:34.835874 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
May 13 12:57:34.835882 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
May 13 12:57:34.835890 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes)
May 13 12:57:34.835899 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9ce95000-0x9ce96fff] (8192 bytes)
May 13 12:57:34.835907 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes)
May 13 12:57:34.835915 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 13 12:57:34.835922 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
May 13 12:57:34.835930 kernel: pinctrl core: initialized pinctrl subsystem
May 13 12:57:34.835938 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 13 12:57:34.835945 kernel: audit: initializing netlink subsys (disabled)
May 13 12:57:34.835953 kernel: audit: type=2000 audit(1747141053.012:1): state=initialized audit_enabled=0 res=1
May 13 12:57:34.835960 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 13 12:57:34.835970 kernel: thermal_sys: Registered thermal governor 'user_space'
May 13 12:57:34.835977 kernel: cpuidle: using governor menu
May 13 12:57:34.835985 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 13 12:57:34.835993 kernel: dca service started, version 1.12.1
May 13 12:57:34.836000 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
May 13 12:57:34.836008 kernel: PCI: Using configuration type 1 for base access
May 13 12:57:34.836016 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 13 12:57:34.836023 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 13 12:57:34.836031 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
May 13 12:57:34.836040 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 13 12:57:34.836048 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 13 12:57:34.836056 kernel: ACPI: Added _OSI(Module Device)
May 13 12:57:34.836064 kernel: ACPI: Added _OSI(Processor Device)
May 13 12:57:34.836071 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 13 12:57:34.836079 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 13 12:57:34.836087 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 13 12:57:34.836094 kernel: ACPI: Interpreter enabled
May 13 12:57:34.836102 kernel: ACPI: PM: (supports S0 S3 S5)
May 13 12:57:34.836111 kernel: ACPI: Using IOAPIC for interrupt routing
May 13 12:57:34.836119 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 13 12:57:34.836127 kernel: PCI: Using E820 reservations for host bridge windows
May 13 12:57:34.836134 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
May 13 12:57:34.836142 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
May 13 12:57:34.836309 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 13 12:57:34.836440 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
May 13 12:57:34.836577 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
May 13 12:57:34.836591 kernel: PCI host bridge to bus 0000:00
May 13 12:57:34.836717 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
May 13 12:57:34.836829 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
May 13 12:57:34.836939 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
May 13 12:57:34.837043 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window]
May 13 12:57:34.837147 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
May 13 12:57:34.837251 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window]
May 13 12:57:34.837387 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 13 12:57:34.837598 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
May 13 12:57:34.837779 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
May 13 12:57:34.837900 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref]
May 13 12:57:34.838015 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff]
May 13 12:57:34.838130 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
May 13 12:57:34.838249 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
May 13 12:57:34.838374 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
May 13 12:57:34.838504 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f]
May 13 12:57:34.838633 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff]
May 13 12:57:34.838750 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref]
May 13 12:57:34.838898 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
May 13 12:57:34.839026 kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f]
May 13 12:57:34.839146 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff]
May 13 12:57:34.839260 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref]
May 13 12:57:34.839402 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
May 13 12:57:34.839538 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff]
May 13 12:57:34.839677 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff]
May 13 12:57:34.839809 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref]
May 13 12:57:34.839925 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]
May 13 12:57:34.840052 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
May 13 12:57:34.840180 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
May 13 12:57:34.840305 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
May 13 12:57:34.840426 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df]
May 13 12:57:34.840568 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff]
May 13 12:57:34.840740 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
May 13 12:57:34.840868 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf]
May 13 12:57:34.840879 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
May 13 12:57:34.840888 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
May 13 12:57:34.840896 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
May 13 12:57:34.840904 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
May 13 12:57:34.840911 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
May 13 12:57:34.840919 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
May 13 12:57:34.840927 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
May 13 12:57:34.840938 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
May 13 12:57:34.840946 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
May 13 12:57:34.840954 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
May 13 12:57:34.840962 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
May 13 12:57:34.840969 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
May 13 12:57:34.840977 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
May 13 12:57:34.840985 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
May 13 12:57:34.840993 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
May 13 12:57:34.841001 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
May 13 12:57:34.841010 kernel: iommu: Default domain type: Translated
May 13 12:57:34.841018 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 13 12:57:34.841026 kernel: efivars: Registered efivars operations
May 13 12:57:34.841034 kernel: PCI: Using ACPI for IRQ routing
May 13 12:57:34.841042 kernel: PCI: pci_cache_line_size set to 64 bytes
May 13 12:57:34.841049 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff]
May 13 12:57:34.841057 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff]
May 13 12:57:34.841065 kernel: e820: reserve RAM buffer [mem 0x9b2e3018-0x9bffffff]
May 13 12:57:34.841073 kernel: e820: reserve RAM buffer [mem 0x9b320018-0x9bffffff]
May 13 12:57:34.841080 kernel: e820: reserve RAM buffer [mem 0x9bd3f000-0x9bffffff]
May 13 12:57:34.841090 kernel: e820: reserve RAM buffer [mem 0x9c8ed000-0x9fffffff]
May 13 12:57:34.841098 kernel: e820: reserve RAM buffer [mem 0x9ce91000-0x9fffffff]
May 13 12:57:34.841105 kernel: e820: reserve RAM buffer [mem 0x9cedc000-0x9fffffff]
May 13 12:57:34.841220 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
May 13 12:57:34.841335 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
May 13 12:57:34.841460 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
May 13 12:57:34.841471 kernel: vgaarb: loaded
May 13 12:57:34.841479 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
May 13 12:57:34.841503 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
May 13 12:57:34.841511 kernel: clocksource: Switched to clocksource kvm-clock
May 13 12:57:34.841519 kernel: VFS: Disk quotas dquot_6.6.0
May 13 12:57:34.841527 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 13 12:57:34.841535 kernel: pnp: PnP ACPI init
May 13 12:57:34.841668 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved
May 13 12:57:34.841696 kernel: pnp: PnP ACPI: found 6 devices
May 13 12:57:34.841706 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 13 12:57:34.841716 kernel: NET: Registered PF_INET protocol family
May 13 12:57:34.841724 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
May 13 12:57:34.841732 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
May 13 12:57:34.841742 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 13 12:57:34.841753 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 13 12:57:34.841764 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
May 13 12:57:34.841775 kernel: TCP: Hash tables configured (established 32768 bind 32768)
May 13 12:57:34.841785 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 13 12:57:34.841799 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 13 12:57:34.841810 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 13 12:57:34.841820 kernel: NET: Registered PF_XDP protocol family
May 13 12:57:34.841967 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window
May 13 12:57:34.842088 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned
May 13 12:57:34.842195 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
May 13 12:57:34.842300 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
May 13 12:57:34.842404 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
May 13 12:57:34.842531 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window]
May 13 12:57:34.842644 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window]
May 13 12:57:34.842749 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window]
May 13 12:57:34.842759 kernel: PCI: CLS 0 bytes, default 64
May 13 12:57:34.842767 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns
May 13 12:57:34.842775 kernel: Initialise system trusted keyrings
May 13 12:57:34.842784 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
May 13 12:57:34.842792 kernel: Key type asymmetric registered
May 13 12:57:34.842803 kernel: Asymmetric key parser 'x509' registered
May 13 12:57:34.842811 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
May 13 12:57:34.842819 kernel: io scheduler mq-deadline registered
May 13 12:57:34.842827 kernel: io scheduler kyber registered
May 13 12:57:34.842835 kernel: io scheduler bfq registered
May 13 12:57:34.842843 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
May 13 12:57:34.842852 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
May 13 12:57:34.842862 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
May 13 12:57:34.842870 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
May 13 12:57:34.842878 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 13 12:57:34.842886 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
May 13 12:57:34.842895 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
May 13 12:57:34.842903 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
May 13 12:57:34.842911 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
May 13 12:57:34.843045 kernel: rtc_cmos 00:04: RTC can wake from S4
May 13 12:57:34.843062 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
May 13 12:57:34.843189 kernel: rtc_cmos 00:04: registered as rtc0
May 13 12:57:34.843299 kernel: rtc_cmos 00:04: setting system clock to 2025-05-13T12:57:34 UTC (1747141054)
May 13 12:57:34.843411 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
May 13 12:57:34.843421 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
May 13 12:57:34.843429 kernel: efifb: probing for efifb
May 13 12:57:34.843437 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k
May 13 12:57:34.843445 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1
May 13 12:57:34.843456 kernel: efifb: scrolling: redraw
May 13 12:57:34.843464 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
May 13 12:57:34.843472 kernel: Console: switching to colour frame buffer device 160x50
May 13 12:57:34.843480 kernel: fb0: EFI VGA frame buffer device
May 13 12:57:34.843488 kernel: pstore: Using crash dump compression: deflate
May 13 12:57:34.843516 kernel: pstore: Registered efi_pstore as persistent store backend
May 13 12:57:34.843524 kernel: NET: Registered PF_INET6 protocol family
May 13 12:57:34.843532 kernel: Segment Routing with IPv6
May 13 12:57:34.843540 kernel: In-situ OAM (IOAM) with IPv6
May 13 12:57:34.843548 kernel: NET: Registered PF_PACKET protocol family
May 13 12:57:34.843566 kernel: Key type dns_resolver registered
May 13 12:57:34.843574 kernel: IPI shorthand broadcast: enabled
May 13 12:57:34.843582 kernel: sched_clock: Marking stable (2776002116, 161881188)->(2957259083, -19375779)
May 13 12:57:34.843590 kernel: registered taskstats version 1
May 13 12:57:34.843598 kernel: Loading compiled-in X.509 certificates
May 13 12:57:34.843607 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.28-flatcar: d81efc2839896c91a2830d4cfad7b0572af8b26a'
May 13 12:57:34.843614 kernel: Demotion targets for Node 0: null
May 13 12:57:34.843622 kernel: Key
type .fscrypt registered May 13 12:57:34.843630 kernel: Key type fscrypt-provisioning registered May 13 12:57:34.843640 kernel: ima: No TPM chip found, activating TPM-bypass! May 13 12:57:34.843648 kernel: ima: Allocated hash algorithm: sha1 May 13 12:57:34.843656 kernel: ima: No architecture policies found May 13 12:57:34.843664 kernel: clk: Disabling unused clocks May 13 12:57:34.843672 kernel: Warning: unable to open an initial console. May 13 12:57:34.843680 kernel: Freeing unused kernel image (initmem) memory: 54420K May 13 12:57:34.843688 kernel: Write protecting the kernel read-only data: 24576k May 13 12:57:34.843696 kernel: Freeing unused kernel image (rodata/data gap) memory: 292K May 13 12:57:34.843706 kernel: Run /init as init process May 13 12:57:34.843714 kernel: with arguments: May 13 12:57:34.843722 kernel: /init May 13 12:57:34.843729 kernel: with environment: May 13 12:57:34.843737 kernel: HOME=/ May 13 12:57:34.843745 kernel: TERM=linux May 13 12:57:34.843753 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 13 12:57:34.843762 systemd[1]: Successfully made /usr/ read-only. May 13 12:57:34.843773 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 13 12:57:34.843784 systemd[1]: Detected virtualization kvm. May 13 12:57:34.843792 systemd[1]: Detected architecture x86-64. May 13 12:57:34.843801 systemd[1]: Running in initrd. May 13 12:57:34.843809 systemd[1]: No hostname configured, using default hostname. May 13 12:57:34.843818 systemd[1]: Hostname set to . May 13 12:57:34.843826 systemd[1]: Initializing machine ID from VM UUID. May 13 12:57:34.843835 systemd[1]: Queued start job for default target initrd.target. 
May 13 12:57:34.843845 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 12:57:34.843853 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 12:57:34.843863 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 13 12:57:34.843871 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 13 12:57:34.843880 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 13 12:57:34.843889 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 13 12:57:34.843899 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 13 12:57:34.843910 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 13 12:57:34.843918 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 12:57:34.843927 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 13 12:57:34.843935 systemd[1]: Reached target paths.target - Path Units.
May 13 12:57:34.843944 systemd[1]: Reached target slices.target - Slice Units.
May 13 12:57:34.843952 systemd[1]: Reached target swap.target - Swaps.
May 13 12:57:34.843960 systemd[1]: Reached target timers.target - Timer Units.
May 13 12:57:34.843969 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 13 12:57:34.843977 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 13 12:57:34.843988 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 13 12:57:34.843996 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 13 12:57:34.844005 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 13 12:57:34.844014 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 13 12:57:34.844022 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 12:57:34.844033 systemd[1]: Reached target sockets.target - Socket Units.
May 13 12:57:34.844041 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 13 12:57:34.844050 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 13 12:57:34.844060 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 13 12:57:34.844069 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
May 13 12:57:34.844078 systemd[1]: Starting systemd-fsck-usr.service...
May 13 12:57:34.844086 systemd[1]: Starting systemd-journald.service - Journal Service...
May 13 12:57:34.844095 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 13 12:57:34.844103 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 12:57:34.844111 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 13 12:57:34.844122 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 12:57:34.844150 systemd-journald[220]: Collecting audit messages is disabled.
May 13 12:57:34.844177 systemd[1]: Finished systemd-fsck-usr.service.
May 13 12:57:34.844186 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 13 12:57:34.844194 systemd-journald[220]: Journal started
May 13 12:57:34.844213 systemd-journald[220]: Runtime Journal (/run/log/journal/99d3586b71114028a116347ec40c9b1c) is 6M, max 48.5M, 42.4M free.
May 13 12:57:34.837734 systemd-modules-load[222]: Inserted module 'overlay'
May 13 12:57:34.850627 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 12:57:34.852520 systemd[1]: Started systemd-journald.service - Journal Service.
May 13 12:57:34.855620 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 12:57:34.860609 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 13 12:57:34.863928 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 13 12:57:34.869518 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 13 12:57:34.872002 systemd-modules-load[222]: Inserted module 'br_netfilter'
May 13 12:57:34.873032 kernel: Bridge firewalling registered
May 13 12:57:34.874225 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 13 12:57:34.877445 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 13 12:57:34.880794 systemd-tmpfiles[238]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
May 13 12:57:34.881653 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 13 12:57:34.886383 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 12:57:34.890688 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 12:57:34.895431 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 12:57:34.895732 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 13 12:57:34.899696 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
May 13 12:57:34.902342 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 13 12:57:34.929652 dracut-cmdline[259]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=7099d7ee582d4f3e6d25a3763207cfa25fb4eb117c83034e2c517b959b8370a1
May 13 12:57:34.949330 systemd-resolved[260]: Positive Trust Anchors:
May 13 12:57:34.949346 systemd-resolved[260]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 13 12:57:34.949377 systemd-resolved[260]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 13 12:57:34.951850 systemd-resolved[260]: Defaulting to hostname 'linux'.
May 13 12:57:34.952853 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 13 12:57:34.959058 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 13 12:57:35.042523 kernel: SCSI subsystem initialized
May 13 12:57:35.051529 kernel: Loading iSCSI transport class v2.0-870.
May 13 12:57:35.061523 kernel: iscsi: registered transport (tcp)
May 13 12:57:35.082729 kernel: iscsi: registered transport (qla4xxx)
May 13 12:57:35.082803 kernel: QLogic iSCSI HBA Driver
May 13 12:57:35.105359 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 13 12:57:35.128874 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 13 12:57:35.131482 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 13 12:57:35.184947 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
May 13 12:57:35.187144 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 13 12:57:35.242533 kernel: raid6: avx2x4 gen() 30043 MB/s
May 13 12:57:35.259525 kernel: raid6: avx2x2 gen() 29167 MB/s
May 13 12:57:35.276611 kernel: raid6: avx2x1 gen() 25839 MB/s
May 13 12:57:35.276633 kernel: raid6: using algorithm avx2x4 gen() 30043 MB/s
May 13 12:57:35.294613 kernel: raid6: .... xor() 7579 MB/s, rmw enabled
May 13 12:57:35.294632 kernel: raid6: using avx2x2 recovery algorithm
May 13 12:57:35.314523 kernel: xor: automatically using best checksumming function avx
May 13 12:57:35.478570 kernel: Btrfs loaded, zoned=no, fsverity=no
May 13 12:57:35.487658 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
May 13 12:57:35.490481 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 13 12:57:35.531326 systemd-udevd[470]: Using default interface naming scheme 'v255'.
May 13 12:57:35.537700 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 13 12:57:35.541190 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
May 13 12:57:35.578752 dracut-pre-trigger[474]: rd.md=0: removing MD RAID activation
May 13 12:57:35.608749 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
May 13 12:57:35.610170 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 13 12:57:35.681370 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 12:57:35.685623 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
May 13 12:57:35.714523 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
May 13 12:57:35.718025 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
May 13 12:57:35.725997 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
May 13 12:57:35.726020 kernel: GPT:9289727 != 19775487
May 13 12:57:35.726031 kernel: GPT:Alternate GPT header not at the end of the disk.
May 13 12:57:35.726043 kernel: GPT:9289727 != 19775487
May 13 12:57:35.726053 kernel: GPT: Use GNU Parted to correct GPT errors.
May 13 12:57:35.726065 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 13 12:57:35.728514 kernel: cryptd: max_cpu_qlen set to 1000
May 13 12:57:35.742541 kernel: AES CTR mode by8 optimization enabled
May 13 12:57:35.744518 kernel: libata version 3.00 loaded.
May 13 12:57:35.748539 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 12:57:35.757605 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2
May 13 12:57:35.748668 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 12:57:35.761720 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 13 12:57:35.766456 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 12:57:35.770727 kernel: ahci 0000:00:1f.2: version 3.0
May 13 12:57:35.772165 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
May 13 12:57:35.781014 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
May 13 12:57:35.781181 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
May 13 12:57:35.781318 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
May 13 12:57:35.785814 kernel: scsi host0: ahci
May 13 12:57:35.785987 kernel: scsi host1: ahci
May 13 12:57:35.789641 kernel: scsi host2: ahci
May 13 12:57:35.794515 kernel: scsi host3: ahci
May 13 12:57:35.794732 kernel: scsi host4: ahci
May 13 12:57:35.797149 kernel: scsi host5: ahci
May 13 12:57:35.797317 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 0
May 13 12:57:35.797329 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 0
May 13 12:57:35.798999 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 0
May 13 12:57:35.799017 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 0
May 13 12:57:35.801015 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 0
May 13 12:57:35.801073 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 0
May 13 12:57:35.802422 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
May 13 12:57:35.814435 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
May 13 12:57:35.831816 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
May 13 12:57:35.841843 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
May 13 12:57:35.845138 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
May 13 12:57:35.848439 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
May 13 12:57:35.850856 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 12:57:35.850949 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 12:57:35.854166 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 13 12:57:35.866006 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 12:57:35.867326 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 13 12:57:35.902716 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 12:57:35.922968 disk-uuid[631]: Primary Header is updated.
May 13 12:57:35.922968 disk-uuid[631]: Secondary Entries is updated.
May 13 12:57:35.922968 disk-uuid[631]: Secondary Header is updated.
May 13 12:57:35.926518 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 13 12:57:36.110435 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
May 13 12:57:36.110516 kernel: ata4: SATA link down (SStatus 0 SControl 300)
May 13 12:57:36.110535 kernel: ata1: SATA link down (SStatus 0 SControl 300)
May 13 12:57:36.110545 kernel: ata5: SATA link down (SStatus 0 SControl 300)
May 13 12:57:36.110555 kernel: ata6: SATA link down (SStatus 0 SControl 300)
May 13 12:57:36.111530 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
May 13 12:57:36.112531 kernel: ata3.00: applying bridge limits
May 13 12:57:36.112546 kernel: ata2: SATA link down (SStatus 0 SControl 300)
May 13 12:57:36.113534 kernel: ata3.00: configured for UDMA/100
May 13 12:57:36.114538 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
May 13 12:57:36.160138 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
May 13 12:57:36.160349 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
May 13 12:57:36.180537 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
May 13 12:57:36.594944 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
May 13 12:57:36.595843 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
May 13 12:57:36.598410 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 12:57:36.600727 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 13 12:57:36.603599 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
May 13 12:57:36.632678 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
May 13 12:57:36.934093 disk-uuid[638]: The operation has completed successfully.
May 13 12:57:36.935295 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 13 12:57:36.963533 systemd[1]: disk-uuid.service: Deactivated successfully.
May 13 12:57:36.963648 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
May 13 12:57:36.999123 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
May 13 12:57:37.023318 sh[667]: Success
May 13 12:57:37.041387 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
May 13 12:57:37.041431 kernel: device-mapper: uevent: version 1.0.3
May 13 12:57:37.041444 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
May 13 12:57:37.050536 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
May 13 12:57:37.078790 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
May 13 12:57:37.081957 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
May 13 12:57:37.101265 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
May 13 12:57:37.107254 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay'
May 13 12:57:37.107285 kernel: BTRFS: device fsid 3042589c-b63f-42f0-9a6f-a4369b1889f9 devid 1 transid 40 /dev/mapper/usr (253:0) scanned by mount (679)
May 13 12:57:37.110076 kernel: BTRFS info (device dm-0): first mount of filesystem 3042589c-b63f-42f0-9a6f-a4369b1889f9
May 13 12:57:37.110098 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
May 13 12:57:37.110109 kernel: BTRFS info (device dm-0): using free-space-tree
May 13 12:57:37.113942 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
May 13 12:57:37.114513 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
May 13 12:57:37.116932 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
May 13 12:57:37.118105 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
May 13 12:57:37.121670 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
May 13 12:57:37.152551 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (713)
May 13 12:57:37.154959 kernel: BTRFS info (device vda6): first mount of filesystem 00c8da9a-330c-44ff-bf12-f9831c2c14e1
May 13 12:57:37.154983 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 13 12:57:37.155000 kernel: BTRFS info (device vda6): using free-space-tree
May 13 12:57:37.161520 kernel: BTRFS info (device vda6): last unmount of filesystem 00c8da9a-330c-44ff-bf12-f9831c2c14e1
May 13 12:57:37.162168 systemd[1]: Finished ignition-setup.service - Ignition (setup).
May 13 12:57:37.163524 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
May 13 12:57:37.245065 ignition[755]: Ignition 2.21.0
May 13 12:57:37.245082 ignition[755]: Stage: fetch-offline
May 13 12:57:37.245126 ignition[755]: no configs at "/usr/lib/ignition/base.d"
May 13 12:57:37.245138 ignition[755]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 12:57:37.245247 ignition[755]: parsed url from cmdline: ""
May 13 12:57:37.245251 ignition[755]: no config URL provided
May 13 12:57:37.245256 ignition[755]: reading system config file "/usr/lib/ignition/user.ign"
May 13 12:57:37.245264 ignition[755]: no config at "/usr/lib/ignition/user.ign"
May 13 12:57:37.245287 ignition[755]: op(1): [started] loading QEMU firmware config module
May 13 12:57:37.245292 ignition[755]: op(1): executing: "modprobe" "qemu_fw_cfg"
May 13 12:57:37.253105 ignition[755]: op(1): [finished] loading QEMU firmware config module
May 13 12:57:37.271346 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 13 12:57:37.275211 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 13 12:57:37.297213 ignition[755]: parsing config with SHA512: e785cc011e4401582689418c098e07d86778dce3a0d607548450b96c434b66873e14defaec2cfb6724a7e1ff2398a489d1932ba3d84e6633eb1ce82db0fc998c
May 13 12:57:37.300864 unknown[755]: fetched base config from "system"
May 13 12:57:37.301009 unknown[755]: fetched user config from "qemu"
May 13 12:57:37.301321 ignition[755]: fetch-offline: fetch-offline passed
May 13 12:57:37.301374 ignition[755]: Ignition finished successfully
May 13 12:57:37.303800 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
May 13 12:57:37.319069 systemd-networkd[856]: lo: Link UP
May 13 12:57:37.319080 systemd-networkd[856]: lo: Gained carrier
May 13 12:57:37.320555 systemd-networkd[856]: Enumeration completed
May 13 12:57:37.320892 systemd-networkd[856]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 12:57:37.320896 systemd-networkd[856]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 13 12:57:37.321938 systemd-networkd[856]: eth0: Link UP
May 13 12:57:37.321941 systemd-networkd[856]: eth0: Gained carrier
May 13 12:57:37.321949 systemd-networkd[856]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 12:57:37.322260 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 13 12:57:37.323629 systemd[1]: Reached target network.target - Network.
May 13 12:57:37.327012 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
May 13 12:57:37.329089 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
May 13 12:57:37.363575 systemd-networkd[856]: eth0: DHCPv4 address 10.0.0.121/16, gateway 10.0.0.1 acquired from 10.0.0.1
May 13 12:57:37.372802 ignition[860]: Ignition 2.21.0
May 13 12:57:37.372814 ignition[860]: Stage: kargs
May 13 12:57:37.372951 ignition[860]: no configs at "/usr/lib/ignition/base.d"
May 13 12:57:37.372962 ignition[860]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 12:57:37.373905 ignition[860]: kargs: kargs passed
May 13 12:57:37.373943 ignition[860]: Ignition finished successfully
May 13 12:57:37.381679 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
May 13 12:57:37.384716 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
May 13 12:57:37.421589 ignition[869]: Ignition 2.21.0
May 13 12:57:37.421601 ignition[869]: Stage: disks
May 13 12:57:37.421719 ignition[869]: no configs at "/usr/lib/ignition/base.d"
May 13 12:57:37.421731 ignition[869]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 12:57:37.422801 ignition[869]: disks: disks passed
May 13 12:57:37.425648 systemd[1]: Finished ignition-disks.service - Ignition (disks).
May 13 12:57:37.422852 ignition[869]: Ignition finished successfully
May 13 12:57:37.427104 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
May 13 12:57:37.429043 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
May 13 12:57:37.429257 systemd[1]: Reached target local-fs.target - Local File Systems.
May 13 12:57:37.429776 systemd[1]: Reached target sysinit.target - System Initialization.
May 13 12:57:37.430100 systemd[1]: Reached target basic.target - Basic System.
May 13 12:57:37.431411 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
May 13 12:57:37.469825 systemd-fsck[880]: ROOT: clean, 15/553520 files, 52789/553472 blocks
May 13 12:57:37.477345 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
May 13 12:57:37.478357 systemd[1]: Mounting sysroot.mount - /sysroot...
May 13 12:57:37.584530 kernel: EXT4-fs (vda9): mounted filesystem ebf7ca75-051f-4154-b098-5ec24084105d r/w with ordered data mode. Quota mode: none.
May 13 12:57:37.584664 systemd[1]: Mounted sysroot.mount - /sysroot.
May 13 12:57:37.586306 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
May 13 12:57:37.588648 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 13 12:57:37.590519 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
May 13 12:57:37.591863 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
May 13 12:57:37.591902 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
May 13 12:57:37.591925 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
May 13 12:57:37.605637 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
May 13 12:57:37.607154 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
May 13 12:57:37.612722 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (888)
May 13 12:57:37.612743 kernel: BTRFS info (device vda6): first mount of filesystem 00c8da9a-330c-44ff-bf12-f9831c2c14e1
May 13 12:57:37.612760 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 13 12:57:37.612770 kernel: BTRFS info (device vda6): using free-space-tree
May 13 12:57:37.618055 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 13 12:57:37.642035 initrd-setup-root[912]: cut: /sysroot/etc/passwd: No such file or directory
May 13 12:57:37.646942 initrd-setup-root[919]: cut: /sysroot/etc/group: No such file or directory
May 13 12:57:37.651594 initrd-setup-root[926]: cut: /sysroot/etc/shadow: No such file or directory
May 13 12:57:37.656107 initrd-setup-root[933]: cut: /sysroot/etc/gshadow: No such file or directory
May 13 12:57:37.740414 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
May 13 12:57:37.743568 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
May 13 12:57:37.746135 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
May 13 12:57:37.766519 kernel: BTRFS info (device vda6): last unmount of filesystem 00c8da9a-330c-44ff-bf12-f9831c2c14e1
May 13 12:57:37.783688 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
May 13 12:57:37.798166 ignition[1002]: INFO : Ignition 2.21.0 May 13 12:57:37.798166 ignition[1002]: INFO : Stage: mount May 13 12:57:37.799898 ignition[1002]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 12:57:37.799898 ignition[1002]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 12:57:37.802225 ignition[1002]: INFO : mount: mount passed May 13 12:57:37.802225 ignition[1002]: INFO : Ignition finished successfully May 13 12:57:37.806437 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 13 12:57:37.808572 systemd[1]: Starting ignition-files.service - Ignition (files)... May 13 12:57:38.106433 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 13 12:57:38.108340 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 12:57:38.143983 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (1014) May 13 12:57:38.144013 kernel: BTRFS info (device vda6): first mount of filesystem 00c8da9a-330c-44ff-bf12-f9831c2c14e1 May 13 12:57:38.144025 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 13 12:57:38.144880 kernel: BTRFS info (device vda6): using free-space-tree May 13 12:57:38.148942 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 13 12:57:38.182277 ignition[1031]: INFO : Ignition 2.21.0 May 13 12:57:38.182277 ignition[1031]: INFO : Stage: files May 13 12:57:38.184216 ignition[1031]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 12:57:38.184216 ignition[1031]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 12:57:38.184216 ignition[1031]: DEBUG : files: compiled without relabeling support, skipping May 13 12:57:38.187751 ignition[1031]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 13 12:57:38.187751 ignition[1031]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 13 12:57:38.187751 ignition[1031]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 13 12:57:38.187751 ignition[1031]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 13 12:57:38.193348 ignition[1031]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 13 12:57:38.193348 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 13 12:57:38.193348 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 May 13 12:57:38.187794 unknown[1031]: wrote ssh authorized keys file for user: core May 13 12:57:38.236006 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 13 12:57:38.403987 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 13 12:57:38.403987 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 13 12:57:38.407940 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" 
May 13 12:57:38.407940 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 13 12:57:38.407940 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 13 12:57:38.407940 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 12:57:38.407940 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 12:57:38.407940 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 12:57:38.407940 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 12:57:38.420073 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 13 12:57:38.420073 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 13 12:57:38.420073 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 13 12:57:38.426649 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 13 12:57:38.426649 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 13 12:57:38.431673 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1 May 13 12:57:38.740692 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 13 12:57:39.075616 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 13 12:57:39.075616 ignition[1031]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 13 12:57:39.079122 ignition[1031]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 12:57:39.084796 ignition[1031]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 12:57:39.084796 ignition[1031]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 13 12:57:39.084796 ignition[1031]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" May 13 12:57:39.089587 ignition[1031]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" May 13 12:57:39.089587 ignition[1031]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" May 13 12:57:39.089587 ignition[1031]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" May 13 12:57:39.089587 ignition[1031]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" May 13 12:57:39.103564 ignition[1031]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" May 13 12:57:39.107245 ignition[1031]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" May 13 12:57:39.108979 ignition[1031]: INFO : files: op(f): 
[finished] setting preset to disabled for "coreos-metadata.service" May 13 12:57:39.108979 ignition[1031]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" May 13 12:57:39.111853 ignition[1031]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" May 13 12:57:39.111853 ignition[1031]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" May 13 12:57:39.111853 ignition[1031]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" May 13 12:57:39.111853 ignition[1031]: INFO : files: files passed May 13 12:57:39.111853 ignition[1031]: INFO : Ignition finished successfully May 13 12:57:39.114414 systemd[1]: Finished ignition-files.service - Ignition (files). May 13 12:57:39.116252 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 13 12:57:39.119819 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 13 12:57:39.129389 systemd[1]: ignition-quench.service: Deactivated successfully. May 13 12:57:39.129578 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 13 12:57:39.132718 initrd-setup-root-after-ignition[1060]: grep: /sysroot/oem/oem-release: No such file or directory May 13 12:57:39.137092 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 12:57:39.137092 initrd-setup-root-after-ignition[1062]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 13 12:57:39.140191 initrd-setup-root-after-ignition[1066]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 12:57:39.143387 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. 
May 13 12:57:39.143677 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 13 12:57:39.146820 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 13 12:57:39.188904 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 13 12:57:39.189987 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 13 12:57:39.191465 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 13 12:57:39.193540 systemd[1]: Reached target initrd.target - Initrd Default Target. May 13 12:57:39.195488 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 13 12:57:39.197932 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 13 12:57:39.224330 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 12:57:39.225730 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 13 12:57:39.250613 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 13 12:57:39.250764 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 12:57:39.254083 systemd[1]: Stopped target timers.target - Timer Units. May 13 12:57:39.255232 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 13 12:57:39.255338 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 12:57:39.260012 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 13 12:57:39.260147 systemd[1]: Stopped target basic.target - Basic System. May 13 12:57:39.262889 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 13 12:57:39.263791 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 13 12:57:39.264116 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. 
May 13 12:57:39.264456 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 13 12:57:39.264960 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 13 12:57:39.265289 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 13 12:57:39.265808 systemd[1]: Stopped target sysinit.target - System Initialization. May 13 12:57:39.266130 systemd[1]: Stopped target local-fs.target - Local File Systems. May 13 12:57:39.266462 systemd[1]: Stopped target swap.target - Swaps. May 13 12:57:39.266938 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 13 12:57:39.267046 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 13 12:57:39.282549 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 13 12:57:39.283658 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 12:57:39.283945 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 13 12:57:39.284106 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 12:57:39.287735 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 13 12:57:39.287846 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 13 12:57:39.291993 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 13 12:57:39.292103 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 13 12:57:39.293085 systemd[1]: Stopped target paths.target - Path Units. May 13 12:57:39.293333 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 13 12:57:39.298016 systemd-networkd[856]: eth0: Gained IPv6LL May 13 12:57:39.301554 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 12:57:39.304246 systemd[1]: Stopped target slices.target - Slice Units. 
May 13 12:57:39.304378 systemd[1]: Stopped target sockets.target - Socket Units. May 13 12:57:39.306075 systemd[1]: iscsid.socket: Deactivated successfully. May 13 12:57:39.306164 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 13 12:57:39.307820 systemd[1]: iscsiuio.socket: Deactivated successfully. May 13 12:57:39.307905 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 13 12:57:39.309539 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 13 12:57:39.309653 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 13 12:57:39.311327 systemd[1]: ignition-files.service: Deactivated successfully. May 13 12:57:39.311438 systemd[1]: Stopped ignition-files.service - Ignition (files). May 13 12:57:39.316173 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 13 12:57:39.319225 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 13 12:57:39.320067 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 13 12:57:39.320179 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 13 12:57:39.321863 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 13 12:57:39.321959 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 13 12:57:39.333003 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 13 12:57:39.333110 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
May 13 12:57:39.342201 ignition[1086]: INFO : Ignition 2.21.0 May 13 12:57:39.342201 ignition[1086]: INFO : Stage: umount May 13 12:57:39.343978 ignition[1086]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 12:57:39.343978 ignition[1086]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 12:57:39.343978 ignition[1086]: INFO : umount: umount passed May 13 12:57:39.343978 ignition[1086]: INFO : Ignition finished successfully May 13 12:57:39.347336 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 13 12:57:39.348022 systemd[1]: ignition-mount.service: Deactivated successfully. May 13 12:57:39.348160 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 13 12:57:39.349659 systemd[1]: Stopped target network.target - Network. May 13 12:57:39.351689 systemd[1]: ignition-disks.service: Deactivated successfully. May 13 12:57:39.351755 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 13 12:57:39.352035 systemd[1]: ignition-kargs.service: Deactivated successfully. May 13 12:57:39.352079 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 13 12:57:39.352356 systemd[1]: ignition-setup.service: Deactivated successfully. May 13 12:57:39.352399 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 13 12:57:39.353253 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 13 12:57:39.353294 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 13 12:57:39.353876 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 13 12:57:39.360698 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 13 12:57:39.368373 systemd[1]: systemd-resolved.service: Deactivated successfully. May 13 12:57:39.368549 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 13 12:57:39.372254 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. 
May 13 12:57:39.372588 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 13 12:57:39.372637 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 12:57:39.376484 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 13 12:57:39.380314 systemd[1]: systemd-networkd.service: Deactivated successfully. May 13 12:57:39.380447 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 13 12:57:39.384580 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 13 12:57:39.384734 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 13 12:57:39.387072 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 13 12:57:39.387110 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 13 12:57:39.390287 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 13 12:57:39.392290 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 13 12:57:39.392342 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 13 12:57:39.393768 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 13 12:57:39.393829 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 13 12:57:39.398441 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 13 12:57:39.398488 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 13 12:57:39.399478 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 12:57:39.400983 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 13 12:57:39.416228 systemd[1]: network-cleanup.service: Deactivated successfully. May 13 12:57:39.417737 systemd[1]: Stopped network-cleanup.service - Network Cleanup. 
May 13 12:57:39.421205 systemd[1]: systemd-udevd.service: Deactivated successfully. May 13 12:57:39.421385 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 12:57:39.422472 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 13 12:57:39.422528 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 13 12:57:39.425750 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 13 12:57:39.425783 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 13 12:57:39.426779 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 13 12:57:39.426830 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 13 12:57:39.429146 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 13 12:57:39.429193 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 13 12:57:39.433507 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 13 12:57:39.433558 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 12:57:39.438215 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 13 12:57:39.438569 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 13 12:57:39.438622 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. May 13 12:57:39.443052 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 13 12:57:39.443100 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 12:57:39.446775 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 13 12:57:39.446826 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 12:57:39.450405 systemd[1]: kmod-static-nodes.service: Deactivated successfully. 
May 13 12:57:39.450460 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 13 12:57:39.451714 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 12:57:39.451759 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 12:57:39.467215 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 13 12:57:39.467324 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 13 12:57:39.508211 systemd[1]: sysroot-boot.service: Deactivated successfully. May 13 12:57:39.508346 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 13 12:57:39.509527 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 13 12:57:39.511172 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 13 12:57:39.511230 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 13 12:57:39.516465 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 13 12:57:39.538941 systemd[1]: Switching root. May 13 12:57:39.577625 systemd-journald[220]: Journal stopped May 13 12:57:40.682070 systemd-journald[220]: Received SIGTERM from PID 1 (systemd). 
May 13 12:57:40.682143 kernel: SELinux: policy capability network_peer_controls=1 May 13 12:57:40.682157 kernel: SELinux: policy capability open_perms=1 May 13 12:57:40.682168 kernel: SELinux: policy capability extended_socket_class=1 May 13 12:57:40.682179 kernel: SELinux: policy capability always_check_network=0 May 13 12:57:40.682194 kernel: SELinux: policy capability cgroup_seclabel=1 May 13 12:57:40.682205 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 13 12:57:40.682219 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 13 12:57:40.682230 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 13 12:57:40.682240 kernel: SELinux: policy capability userspace_initial_context=0 May 13 12:57:40.682258 kernel: audit: type=1403 audit(1747141059.910:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 13 12:57:40.682274 systemd[1]: Successfully loaded SELinux policy in 47.465ms. May 13 12:57:40.682300 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 16.195ms. May 13 12:57:40.682313 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 13 12:57:40.682325 systemd[1]: Detected virtualization kvm. May 13 12:57:40.682339 systemd[1]: Detected architecture x86-64. May 13 12:57:40.682351 systemd[1]: Detected first boot. May 13 12:57:40.682363 systemd[1]: Initializing machine ID from VM UUID. May 13 12:57:40.682375 zram_generator::config[1135]: No configuration found. 
May 13 12:57:40.682396 kernel: Guest personality initialized and is inactive May 13 12:57:40.682407 kernel: VMCI host device registered (name=vmci, major=10, minor=125) May 13 12:57:40.682419 kernel: Initialized host personality May 13 12:57:40.682429 kernel: NET: Registered PF_VSOCK protocol family May 13 12:57:40.682440 systemd[1]: Populated /etc with preset unit settings. May 13 12:57:40.682455 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 13 12:57:40.682467 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 13 12:57:40.682478 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 13 12:57:40.682490 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 13 12:57:40.682515 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 13 12:57:40.682527 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 13 12:57:40.682539 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 13 12:57:40.682551 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 13 12:57:40.682562 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 13 12:57:40.682577 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 13 12:57:40.682594 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 13 12:57:40.682608 systemd[1]: Created slice user.slice - User and Session Slice. May 13 12:57:40.682620 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 12:57:40.682632 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 12:57:40.682644 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
May 13 12:57:40.682656 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 13 12:57:40.682667 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 13 12:57:40.682681 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 13 12:57:40.682693 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... May 13 12:57:40.682706 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 12:57:40.682718 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 13 12:57:40.682729 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 13 12:57:40.682741 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 13 12:57:40.682753 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 13 12:57:40.682764 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 13 12:57:40.682778 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 12:57:40.682789 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 13 12:57:40.682801 systemd[1]: Reached target slices.target - Slice Units. May 13 12:57:40.682813 systemd[1]: Reached target swap.target - Swaps. May 13 12:57:40.682824 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 13 12:57:40.682836 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 13 12:57:40.682848 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 13 12:57:40.682860 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 13 12:57:40.682872 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 13 12:57:40.682884 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
May 13 12:57:40.682897 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 13 12:57:40.682909 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 13 12:57:40.682921 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 13 12:57:40.682933 systemd[1]: Mounting media.mount - External Media Directory... May 13 12:57:40.682945 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 12:57:40.682957 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 13 12:57:40.682969 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 13 12:57:40.682981 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 13 12:57:40.682994 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 13 12:57:40.683006 systemd[1]: Reached target machines.target - Containers. May 13 12:57:40.683018 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 13 12:57:40.683029 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 12:57:40.683041 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 13 12:57:40.683053 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 13 12:57:40.683064 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 12:57:40.683076 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 13 12:57:40.683090 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 12:57:40.683101 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... 
May 13 12:57:40.683114 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 12:57:40.683126 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 13 12:57:40.683138 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 13 12:57:40.683150 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 13 12:57:40.683161 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 13 12:57:40.683173 systemd[1]: Stopped systemd-fsck-usr.service. May 13 12:57:40.683185 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 12:57:40.683199 systemd[1]: Starting systemd-journald.service - Journal Service... May 13 12:57:40.683211 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 13 12:57:40.683222 kernel: fuse: init (API version 7.41) May 13 12:57:40.683234 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 13 12:57:40.683245 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 13 12:57:40.683257 kernel: loop: module loaded May 13 12:57:40.683268 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 13 12:57:40.683280 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 13 12:57:40.683294 systemd[1]: verity-setup.service: Deactivated successfully. May 13 12:57:40.683305 systemd[1]: Stopped verity-setup.service. May 13 12:57:40.683318 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
May 13 12:57:40.683331 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 13 12:57:40.683343 kernel: ACPI: bus type drm_connector registered May 13 12:57:40.683354 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 13 12:57:40.683366 systemd[1]: Mounted media.mount - External Media Directory. May 13 12:57:40.683408 systemd-journald[1211]: Collecting audit messages is disabled. May 13 12:57:40.683433 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 13 12:57:40.683445 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 13 12:57:40.683459 systemd-journald[1211]: Journal started May 13 12:57:40.683481 systemd-journald[1211]: Runtime Journal (/run/log/journal/99d3586b71114028a116347ec40c9b1c) is 6M, max 48.5M, 42.4M free. May 13 12:57:40.422712 systemd[1]: Queued start job for default target multi-user.target. May 13 12:57:40.445303 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. May 13 12:57:40.445752 systemd[1]: systemd-journald.service: Deactivated successfully. May 13 12:57:40.685294 systemd[1]: Started systemd-journald.service - Journal Service. May 13 12:57:40.686124 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 13 12:57:40.687425 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 13 12:57:40.688898 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 13 12:57:40.690422 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 13 12:57:40.690659 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 13 12:57:40.692104 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 12:57:40.692310 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 12:57:40.693747 systemd[1]: modprobe@drm.service: Deactivated successfully. 
May 13 12:57:40.693952 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 12:57:40.695278 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 12:57:40.695518 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 12:57:40.697004 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 13 12:57:40.697210 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 13 12:57:40.698579 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 12:57:40.698787 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 12:57:40.700279 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 13 12:57:40.701766 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 13 12:57:40.703769 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 13 12:57:40.705379 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 13 12:57:40.721710 systemd[1]: Reached target network-pre.target - Preparation for Network. May 13 12:57:40.724466 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 13 12:57:40.726911 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 13 12:57:40.728131 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 13 12:57:40.728216 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 12:57:40.730396 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 13 12:57:40.738614 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... 
May 13 12:57:40.740868 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 12:57:40.743181 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 13 12:57:40.745671 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 13 12:57:40.746873 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 12:57:40.751984 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 13 12:57:40.753425 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 12:57:40.754621 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 13 12:57:40.760827 systemd-journald[1211]: Time spent on flushing to /var/log/journal/99d3586b71114028a116347ec40c9b1c is 17.358ms for 1062 entries. May 13 12:57:40.760827 systemd-journald[1211]: System Journal (/var/log/journal/99d3586b71114028a116347ec40c9b1c) is 8M, max 195.6M, 187.6M free. May 13 12:57:40.791151 systemd-journald[1211]: Received client request to flush runtime journal. May 13 12:57:40.791189 kernel: loop0: detected capacity change from 0 to 113872 May 13 12:57:40.761647 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 13 12:57:40.764664 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 13 12:57:40.767865 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 13 12:57:40.769635 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 13 12:57:40.771624 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. 
May 13 12:57:40.774921 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 13 12:57:40.781605 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 13 12:57:40.789288 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 13 12:57:40.790963 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 13 12:57:40.793172 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 13 12:57:40.804545 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 13 12:57:40.816069 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. May 13 12:57:40.816653 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. May 13 12:57:40.821523 kernel: loop1: detected capacity change from 0 to 205544 May 13 12:57:40.823956 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 12:57:40.827592 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 13 12:57:40.831845 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 13 12:57:40.847522 kernel: loop2: detected capacity change from 0 to 146240 May 13 12:57:40.866298 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 13 12:57:40.871254 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 13 12:57:40.879638 kernel: loop3: detected capacity change from 0 to 113872 May 13 12:57:40.894532 kernel: loop4: detected capacity change from 0 to 205544 May 13 12:57:40.901985 systemd-tmpfiles[1275]: ACLs are not supported, ignoring. May 13 12:57:40.902008 systemd-tmpfiles[1275]: ACLs are not supported, ignoring. May 13 12:57:40.903518 kernel: loop5: detected capacity change from 0 to 146240 May 13 12:57:40.907674 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
May 13 12:57:40.918038 (sd-merge)[1276]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. May 13 12:57:40.918735 (sd-merge)[1276]: Merged extensions into '/usr'. May 13 12:57:40.923817 systemd[1]: Reload requested from client PID 1252 ('systemd-sysext') (unit systemd-sysext.service)... May 13 12:57:40.923836 systemd[1]: Reloading... May 13 12:57:40.992540 zram_generator::config[1304]: No configuration found. May 13 12:57:41.083256 ldconfig[1247]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 13 12:57:41.092136 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 12:57:41.172330 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 13 12:57:41.172626 systemd[1]: Reloading finished in 248 ms. May 13 12:57:41.195935 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 13 12:57:41.197602 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 13 12:57:41.214887 systemd[1]: Starting ensure-sysext.service... May 13 12:57:41.216767 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 13 12:57:41.236655 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 13 12:57:41.236696 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 13 12:57:41.236987 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 13 12:57:41.237239 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. 
May 13 12:57:41.238197 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 13 12:57:41.238472 systemd-tmpfiles[1342]: ACLs are not supported, ignoring. May 13 12:57:41.238565 systemd-tmpfiles[1342]: ACLs are not supported, ignoring. May 13 12:57:41.257079 systemd[1]: Reload requested from client PID 1341 ('systemctl') (unit ensure-sysext.service)... May 13 12:57:41.257097 systemd[1]: Reloading... May 13 12:57:41.307757 systemd-tmpfiles[1342]: Detected autofs mount point /boot during canonicalization of boot. May 13 12:57:41.307773 systemd-tmpfiles[1342]: Skipping /boot May 13 12:57:41.310790 zram_generator::config[1372]: No configuration found. May 13 12:57:41.321601 systemd-tmpfiles[1342]: Detected autofs mount point /boot during canonicalization of boot. May 13 12:57:41.321615 systemd-tmpfiles[1342]: Skipping /boot May 13 12:57:41.401922 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 12:57:41.480841 systemd[1]: Reloading finished in 223 ms. May 13 12:57:41.501292 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 13 12:57:41.522279 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 12:57:41.529171 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 12:57:41.531680 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 13 12:57:41.551828 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 13 12:57:41.556125 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 13 12:57:41.560337 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
May 13 12:57:41.565651 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 13 12:57:41.575996 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 13 12:57:41.583121 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 12:57:41.583333 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 12:57:41.588909 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 12:57:41.592865 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 12:57:41.596176 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 12:57:41.598003 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 12:57:41.598172 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 12:57:41.598352 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 12:57:41.602061 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 13 12:57:41.603993 systemd-udevd[1413]: Using default interface naming scheme 'v255'. May 13 12:57:41.605644 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 12:57:41.606141 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 12:57:41.608520 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 12:57:41.609649 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
May 13 12:57:41.612095 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 12:57:41.612325 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 12:57:41.615723 augenrules[1437]: No rules May 13 12:57:41.617263 systemd[1]: audit-rules.service: Deactivated successfully. May 13 12:57:41.617707 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 12:57:41.619615 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 13 12:57:41.627369 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 12:57:41.627664 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 12:57:41.629059 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 12:57:41.633162 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 12:57:41.639893 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 12:57:41.641449 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 12:57:41.641590 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 12:57:41.643802 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 13 12:57:41.644889 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 12:57:41.645952 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 12:57:41.649256 systemd[1]: Started systemd-userdbd.service - User Database Manager. 
May 13 12:57:41.652618 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 13 12:57:41.654953 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 12:57:41.655174 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 12:57:41.656885 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 12:57:41.657097 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 12:57:41.659063 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 12:57:41.659293 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 12:57:41.668524 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 13 12:57:41.678748 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 12:57:41.680674 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 12:57:41.681810 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 12:57:41.684688 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 12:57:41.686653 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 13 12:57:41.688699 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 12:57:41.696730 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 12:57:41.697892 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 12:57:41.697939 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
May 13 12:57:41.702759 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 13 12:57:41.703839 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 13 12:57:41.703872 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 12:57:41.706655 systemd[1]: Finished ensure-sysext.service. May 13 12:57:41.707823 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 12:57:41.708888 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 12:57:41.710453 systemd[1]: modprobe@drm.service: Deactivated successfully. May 13 12:57:41.710691 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 12:57:41.712115 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 12:57:41.712320 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 12:57:41.714010 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 12:57:41.714214 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 12:57:41.723474 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 12:57:41.723669 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 12:57:41.732670 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 13 12:57:41.738001 augenrules[1485]: /sbin/augenrules: No change May 13 12:57:41.749770 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. 
May 13 12:57:41.757228 augenrules[1520]: No rules May 13 12:57:41.758422 systemd[1]: audit-rules.service: Deactivated successfully. May 13 12:57:41.760103 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 12:57:41.793886 systemd-resolved[1411]: Positive Trust Anchors: May 13 12:57:41.794172 systemd-resolved[1411]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 13 12:57:41.794262 systemd-resolved[1411]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 13 12:57:41.798381 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 13 12:57:41.798802 systemd-resolved[1411]: Defaulting to hostname 'linux'. May 13 12:57:41.802206 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 13 12:57:41.803427 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 13 12:57:41.805817 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 13 12:57:41.813513 kernel: mousedev: PS/2 mouse device common for all mice May 13 12:57:41.819524 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 May 13 12:57:41.824592 kernel: ACPI: button: Power Button [PWRF] May 13 12:57:41.831612 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. 
May 13 12:57:41.850906 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device May 13 12:57:41.852300 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt May 13 12:57:41.852476 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD May 13 12:57:41.909338 systemd-networkd[1494]: lo: Link UP May 13 12:57:41.909347 systemd-networkd[1494]: lo: Gained carrier May 13 12:57:41.912305 systemd-networkd[1494]: Enumeration completed May 13 12:57:41.912472 systemd[1]: Started systemd-networkd.service - Network Configuration. May 13 12:57:41.913052 systemd-networkd[1494]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 12:57:41.913144 systemd-networkd[1494]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 12:57:41.913770 systemd[1]: Reached target network.target - Network. May 13 12:57:41.914146 systemd-networkd[1494]: eth0: Link UP May 13 12:57:41.914490 systemd-networkd[1494]: eth0: Gained carrier May 13 12:57:41.914580 systemd-networkd[1494]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 12:57:41.916778 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 13 12:57:41.919738 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 13 12:57:41.943561 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 13 12:57:41.945546 systemd[1]: Reached target time-set.target - System Time Set. May 13 12:57:41.949828 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 12:57:41.951557 systemd-networkd[1494]: eth0: DHCPv4 address 10.0.0.121/16, gateway 10.0.0.1 acquired from 10.0.0.1 May 13 12:57:41.952211 systemd-timesyncd[1509]: Network configuration changed, trying to establish connection. 
May 13 12:57:43.286342 systemd-resolved[1411]: Clock change detected. Flushing caches. May 13 12:57:43.286461 systemd-timesyncd[1509]: Contacted time server 10.0.0.1:123 (10.0.0.1). May 13 12:57:43.286554 systemd-timesyncd[1509]: Initial clock synchronization to Tue 2025-05-13 12:57:43.286311 UTC. May 13 12:57:43.309220 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 13 12:57:43.316275 kernel: kvm_amd: TSC scaling supported May 13 12:57:43.316316 kernel: kvm_amd: Nested Virtualization enabled May 13 12:57:43.316329 kernel: kvm_amd: Nested Paging enabled May 13 12:57:43.316340 kernel: kvm_amd: LBR virtualization supported May 13 12:57:43.316355 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported May 13 12:57:43.316932 kernel: kvm_amd: Virtual GIF supported May 13 12:57:43.376914 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 12:57:43.378687 systemd[1]: Reached target sysinit.target - System Initialization. May 13 12:57:43.379868 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 13 12:57:43.381183 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 13 12:57:43.382459 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. May 13 12:57:43.384206 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 13 12:57:43.385442 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 13 12:57:43.386732 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 13 12:57:43.388011 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 13 12:57:43.388046 systemd[1]: Reached target paths.target - Path Units. 
May 13 12:57:43.388975 systemd[1]: Reached target timers.target - Timer Units. May 13 12:57:43.390864 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 13 12:57:43.393403 systemd[1]: Starting docker.socket - Docker Socket for the API... May 13 12:57:43.396877 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 13 12:57:43.398332 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 13 12:57:43.399679 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 13 12:57:43.403332 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 13 12:57:43.404880 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 13 12:57:43.406613 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 13 12:57:43.408482 systemd[1]: Reached target sockets.target - Socket Units. May 13 12:57:43.409583 systemd[1]: Reached target basic.target - Basic System. May 13 12:57:43.410569 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 13 12:57:43.410597 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 13 12:57:43.411511 systemd[1]: Starting containerd.service - containerd container runtime... May 13 12:57:43.414285 kernel: EDAC MC: Ver: 3.0.0 May 13 12:57:43.414362 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 13 12:57:43.423596 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 13 12:57:43.425796 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 13 12:57:43.428429 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... 
May 13 12:57:43.429478 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 13 12:57:43.431407 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... May 13 12:57:43.433902 jq[1570]: false May 13 12:57:43.433982 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 13 12:57:43.436244 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 13 12:57:43.440370 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 13 12:57:43.442084 google_oslogin_nss_cache[1572]: oslogin_cache_refresh[1572]: Refreshing passwd entry cache May 13 12:57:43.442333 oslogin_cache_refresh[1572]: Refreshing passwd entry cache May 13 12:57:43.442569 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 13 12:57:43.446744 systemd[1]: Starting systemd-logind.service - User Login Management... May 13 12:57:43.448683 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 13 12:57:43.449123 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 13 12:57:43.449741 systemd[1]: Starting update-engine.service - Update Engine... May 13 12:57:43.451682 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 13 12:57:43.455771 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 13 12:57:43.457397 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. 
May 13 12:57:43.462319 extend-filesystems[1571]: Found loop3 May 13 12:57:43.462319 extend-filesystems[1571]: Found loop4 May 13 12:57:43.462319 extend-filesystems[1571]: Found loop5 May 13 12:57:43.462319 extend-filesystems[1571]: Found sr0 May 13 12:57:43.462319 extend-filesystems[1571]: Found vda May 13 12:57:43.462319 extend-filesystems[1571]: Found vda1 May 13 12:57:43.462319 extend-filesystems[1571]: Found vda2 May 13 12:57:43.462319 extend-filesystems[1571]: Found vda3 May 13 12:57:43.462319 extend-filesystems[1571]: Found usr May 13 12:57:43.462319 extend-filesystems[1571]: Found vda4 May 13 12:57:43.462319 extend-filesystems[1571]: Found vda6 May 13 12:57:43.462319 extend-filesystems[1571]: Found vda7 May 13 12:57:43.462319 extend-filesystems[1571]: Found vda9 May 13 12:57:43.462319 extend-filesystems[1571]: Checking size of /dev/vda9 May 13 12:57:43.484425 google_oslogin_nss_cache[1572]: oslogin_cache_refresh[1572]: Failure getting users, quitting May 13 12:57:43.484425 google_oslogin_nss_cache[1572]: oslogin_cache_refresh[1572]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 13 12:57:43.484425 google_oslogin_nss_cache[1572]: oslogin_cache_refresh[1572]: Refreshing group entry cache May 13 12:57:43.459583 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 13 12:57:43.459150 oslogin_cache_refresh[1572]: Failure getting users, quitting May 13 12:57:43.484595 jq[1582]: true May 13 12:57:43.485774 extend-filesystems[1571]: Resized partition /dev/vda9 May 13 12:57:43.461638 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 13 12:57:43.459167 oslogin_cache_refresh[1572]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 13 12:57:43.461874 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
May 13 12:57:43.459215 oslogin_cache_refresh[1572]: Refreshing group entry cache May 13 12:57:43.487072 update_engine[1580]: I20250513 12:57:43.482234 1580 main.cc:92] Flatcar Update Engine starting May 13 12:57:43.465811 systemd[1]: motdgen.service: Deactivated successfully. May 13 12:57:43.466080 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 13 12:57:43.488391 tar[1589]: linux-amd64/helm May 13 12:57:43.488626 jq[1593]: true May 13 12:57:43.488452 oslogin_cache_refresh[1572]: Failure getting groups, quitting May 13 12:57:43.488792 google_oslogin_nss_cache[1572]: oslogin_cache_refresh[1572]: Failure getting groups, quitting May 13 12:57:43.488792 google_oslogin_nss_cache[1572]: oslogin_cache_refresh[1572]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 13 12:57:43.488465 oslogin_cache_refresh[1572]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 13 12:57:43.490289 systemd[1]: google-oslogin-cache.service: Deactivated successfully. May 13 12:57:43.491318 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. May 13 12:57:43.498315 extend-filesystems[1605]: resize2fs 1.47.2 (1-Jan-2025) May 13 12:57:43.509366 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks May 13 12:57:43.493792 (ntainerd)[1594]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 13 12:57:43.523969 dbus-daemon[1568]: [system] SELinux support is enabled May 13 12:57:43.524115 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 13 12:57:43.527546 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 13 12:57:43.527567 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
May 13 12:57:43.529330 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 13 12:57:43.529346 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 13 12:57:43.535559 kernel: EXT4-fs (vda9): resized filesystem to 1864699 May 13 12:57:43.561135 update_engine[1580]: I20250513 12:57:43.538162 1580 update_check_scheduler.cc:74] Next update check in 6m9s May 13 12:57:43.539945 systemd[1]: Started update-engine.service - Update Engine. May 13 12:57:43.544357 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 13 12:57:43.563272 extend-filesystems[1605]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required May 13 12:57:43.563272 extend-filesystems[1605]: old_desc_blocks = 1, new_desc_blocks = 1 May 13 12:57:43.563272 extend-filesystems[1605]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. May 13 12:57:43.573143 extend-filesystems[1571]: Resized filesystem in /dev/vda9 May 13 12:57:43.566239 systemd[1]: extend-filesystems.service: Deactivated successfully. May 13 12:57:43.568355 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 13 12:57:43.578746 bash[1626]: Updated "/home/core/.ssh/authorized_keys" May 13 12:57:43.578636 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 13 12:57:43.581134 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. May 13 12:57:43.590190 systemd-logind[1578]: Watching system buttons on /dev/input/event2 (Power Button) May 13 12:57:43.590489 systemd-logind[1578]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 13 12:57:43.590821 systemd-logind[1578]: New seat seat0. May 13 12:57:43.593548 systemd[1]: Started systemd-logind.service - User Login Management. 
May 13 12:57:43.620237 locksmithd[1619]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 13 12:57:43.692366 sshd_keygen[1591]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 13 12:57:43.694918 containerd[1594]: time="2025-05-13T12:57:43Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 13 12:57:43.697279 containerd[1594]: time="2025-05-13T12:57:43.696980137Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 13 12:57:43.706607 containerd[1594]: time="2025-05-13T12:57:43.706566147Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.592µs" May 13 12:57:43.708278 containerd[1594]: time="2025-05-13T12:57:43.706884805Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 13 12:57:43.708278 containerd[1594]: time="2025-05-13T12:57:43.706912076Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 13 12:57:43.708278 containerd[1594]: time="2025-05-13T12:57:43.707074921Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 13 12:57:43.708278 containerd[1594]: time="2025-05-13T12:57:43.707089559Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 13 12:57:43.708278 containerd[1594]: time="2025-05-13T12:57:43.707126398Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 12:57:43.708278 containerd[1594]: time="2025-05-13T12:57:43.707206378Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile 
type=io.containerd.snapshotter.v1 May 13 12:57:43.708278 containerd[1594]: time="2025-05-13T12:57:43.707217559Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 12:57:43.708278 containerd[1594]: time="2025-05-13T12:57:43.707512041Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 12:57:43.708278 containerd[1594]: time="2025-05-13T12:57:43.707526258Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 12:57:43.708278 containerd[1594]: time="2025-05-13T12:57:43.707536798Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 12:57:43.708278 containerd[1594]: time="2025-05-13T12:57:43.707546266Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 13 12:57:43.708278 containerd[1594]: time="2025-05-13T12:57:43.707631766Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 13 12:57:43.708511 containerd[1594]: time="2025-05-13T12:57:43.707835769Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 12:57:43.708511 containerd[1594]: time="2025-05-13T12:57:43.707864733Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 12:57:43.708511 containerd[1594]: time="2025-05-13T12:57:43.707874401Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange 
type=io.containerd.event.v1 May 13 12:57:43.708511 containerd[1594]: time="2025-05-13T12:57:43.707912783Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 13 12:57:43.708511 containerd[1594]: time="2025-05-13T12:57:43.708162031Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 13 12:57:43.708511 containerd[1594]: time="2025-05-13T12:57:43.708221963Z" level=info msg="metadata content store policy set" policy=shared May 13 12:57:43.714092 containerd[1594]: time="2025-05-13T12:57:43.714073889Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 13 12:57:43.714176 containerd[1594]: time="2025-05-13T12:57:43.714163718Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 13 12:57:43.714226 containerd[1594]: time="2025-05-13T12:57:43.714215144Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 13 12:57:43.714288 containerd[1594]: time="2025-05-13T12:57:43.714275217Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 13 12:57:43.714335 containerd[1594]: time="2025-05-13T12:57:43.714324930Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 13 12:57:43.714391 containerd[1594]: time="2025-05-13T12:57:43.714379693Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 13 12:57:43.714446 containerd[1594]: time="2025-05-13T12:57:43.714434336Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 13 12:57:43.714497 containerd[1594]: time="2025-05-13T12:57:43.714486273Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 13 
12:57:43.714550 containerd[1594]: time="2025-05-13T12:57:43.714538100Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 13 12:57:43.714600 containerd[1594]: time="2025-05-13T12:57:43.714589968Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 13 12:57:43.714643 containerd[1594]: time="2025-05-13T12:57:43.714632648Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 13 12:57:43.714688 containerd[1594]: time="2025-05-13T12:57:43.714678564Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 13 12:57:43.714830 containerd[1594]: time="2025-05-13T12:57:43.714814960Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 13 12:57:43.714896 containerd[1594]: time="2025-05-13T12:57:43.714883719Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 13 12:57:43.714950 containerd[1594]: time="2025-05-13T12:57:43.714939553Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 13 12:57:43.714995 containerd[1594]: time="2025-05-13T12:57:43.714984267Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 13 12:57:43.715043 containerd[1594]: time="2025-05-13T12:57:43.715032357Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 13 12:57:43.715090 containerd[1594]: time="2025-05-13T12:57:43.715080307Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 13 12:57:43.715156 containerd[1594]: time="2025-05-13T12:57:43.715143746Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 13 12:57:43.715208 containerd[1594]: 
time="2025-05-13T12:57:43.715196405Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 13 12:57:43.715268 containerd[1594]: time="2025-05-13T12:57:43.715243904Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 13 12:57:43.715325 containerd[1594]: time="2025-05-13T12:57:43.715313064Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 13 12:57:43.715370 containerd[1594]: time="2025-05-13T12:57:43.715360002Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 13 12:57:43.715486 containerd[1594]: time="2025-05-13T12:57:43.715472623Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 13 12:57:43.715534 containerd[1594]: time="2025-05-13T12:57:43.715524681Z" level=info msg="Start snapshots syncer" May 13 12:57:43.715612 containerd[1594]: time="2025-05-13T12:57:43.715599411Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 13 12:57:43.716802 containerd[1594]: time="2025-05-13T12:57:43.716715534Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 13 12:57:43.717201 containerd[1594]: time="2025-05-13T12:57:43.717164617Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 13 12:57:43.718128 containerd[1594]: time="2025-05-13T12:57:43.718097356Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 13 12:57:43.718305 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 13 12:57:43.718823 containerd[1594]: time="2025-05-13T12:57:43.718772162Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 13 12:57:43.718823 containerd[1594]: time="2025-05-13T12:57:43.718822687Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 13 12:57:43.718891 containerd[1594]: time="2025-05-13T12:57:43.718836603Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 13 12:57:43.718891 containerd[1594]: time="2025-05-13T12:57:43.718848696Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 13 12:57:43.718891 containerd[1594]: time="2025-05-13T12:57:43.718861259Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 13 12:57:43.718891 containerd[1594]: time="2025-05-13T12:57:43.718872240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 13 12:57:43.718891 containerd[1594]: time="2025-05-13T12:57:43.718883581Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 13 12:57:43.718978 containerd[1594]: time="2025-05-13T12:57:43.718937743Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 13 12:57:43.718978 containerd[1594]: time="2025-05-13T12:57:43.718948954Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 13 12:57:43.718978 containerd[1594]: time="2025-05-13T12:57:43.718959504Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 13 12:57:43.719671 
containerd[1594]: time="2025-05-13T12:57:43.719650229Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 12:57:43.719786 containerd[1594]: time="2025-05-13T12:57:43.719771607Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 12:57:43.719975 containerd[1594]: time="2025-05-13T12:57:43.719825488Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 12:57:43.719975 containerd[1594]: time="2025-05-13T12:57:43.719840947Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 12:57:43.719975 containerd[1594]: time="2025-05-13T12:57:43.719849343Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 13 12:57:43.719975 containerd[1594]: time="2025-05-13T12:57:43.719858780Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 13 12:57:43.719975 containerd[1594]: time="2025-05-13T12:57:43.719869971Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 13 12:57:43.719975 containerd[1594]: time="2025-05-13T12:57:43.719883286Z" level=info msg="runtime interface created" May 13 12:57:43.719975 containerd[1594]: time="2025-05-13T12:57:43.719895209Z" level=info msg="created NRI interface" May 13 12:57:43.719975 containerd[1594]: time="2025-05-13T12:57:43.719906179Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 13 12:57:43.719975 containerd[1594]: time="2025-05-13T12:57:43.719918332Z" level=info msg="Connect containerd service" May 13 12:57:43.719975 containerd[1594]: time="2025-05-13T12:57:43.719943118Z" level=info msg="using experimental 
NRI integration - disable nri plugin to prevent this" May 13 12:57:43.721194 containerd[1594]: time="2025-05-13T12:57:43.721172294Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 12:57:43.722973 systemd[1]: Starting issuegen.service - Generate /run/issue... May 13 12:57:43.737598 systemd[1]: issuegen.service: Deactivated successfully. May 13 12:57:43.737929 systemd[1]: Finished issuegen.service - Generate /run/issue. May 13 12:57:43.742646 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 13 12:57:43.759156 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 13 12:57:43.763232 systemd[1]: Started getty@tty1.service - Getty on tty1. May 13 12:57:43.768095 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 13 12:57:43.769806 systemd[1]: Reached target getty.target - Login Prompts. May 13 12:57:43.815664 containerd[1594]: time="2025-05-13T12:57:43.815605824Z" level=info msg="Start subscribing containerd event" May 13 12:57:43.815855 containerd[1594]: time="2025-05-13T12:57:43.815814235Z" level=info msg="Start recovering state" May 13 12:57:43.816061 containerd[1594]: time="2025-05-13T12:57:43.815936505Z" level=info msg=serving... 
address=/run/containerd/containerd.sock.ttrpc May 13 12:57:43.816061 containerd[1594]: time="2025-05-13T12:57:43.816001547Z" level=info msg="Start event monitor" May 13 12:57:43.816061 containerd[1594]: time="2025-05-13T12:57:43.816015443Z" level=info msg="Start cni network conf syncer for default" May 13 12:57:43.816061 containerd[1594]: time="2025-05-13T12:57:43.816059105Z" level=info msg="Start streaming server" May 13 12:57:43.816061 containerd[1594]: time="2025-05-13T12:57:43.816073051Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 13 12:57:43.816061 containerd[1594]: time="2025-05-13T12:57:43.816080745Z" level=info msg="runtime interface starting up..." May 13 12:57:43.816313 containerd[1594]: time="2025-05-13T12:57:43.816086576Z" level=info msg="starting plugins..." May 13 12:57:43.816313 containerd[1594]: time="2025-05-13T12:57:43.816111613Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 13 12:57:43.816313 containerd[1594]: time="2025-05-13T12:57:43.816016885Z" level=info msg=serving... address=/run/containerd/containerd.sock May 13 12:57:43.816502 containerd[1594]: time="2025-05-13T12:57:43.816396487Z" level=info msg="containerd successfully booted in 0.122073s" May 13 12:57:43.816525 systemd[1]: Started containerd.service - containerd container runtime. May 13 12:57:43.931179 tar[1589]: linux-amd64/LICENSE May 13 12:57:43.931291 tar[1589]: linux-amd64/README.md May 13 12:57:43.954315 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 13 12:57:44.594450 systemd-networkd[1494]: eth0: Gained IPv6LL May 13 12:57:44.597982 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 13 12:57:44.599926 systemd[1]: Reached target network-online.target - Network is Online. May 13 12:57:44.602853 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... 
May 13 12:57:44.605416 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:57:44.607759 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 13 12:57:44.640441 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 13 12:57:44.642185 systemd[1]: coreos-metadata.service: Deactivated successfully. May 13 12:57:44.642538 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. May 13 12:57:44.645197 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 13 12:57:45.236817 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:57:45.238443 systemd[1]: Reached target multi-user.target - Multi-User System. May 13 12:57:45.239844 systemd[1]: Startup finished in 2.852s (kernel) + 5.268s (initrd) + 4.042s (userspace) = 12.163s. May 13 12:57:45.241011 (kubelet)[1699]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 12:57:45.633434 kubelet[1699]: E0513 12:57:45.633316 1699 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 12:57:45.637389 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 12:57:45.637577 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 12:57:45.637932 systemd[1]: kubelet.service: Consumed 880ms CPU time, 235.5M memory peak. May 13 12:57:49.523637 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 13 12:57:49.525282 systemd[1]: Started sshd@0-10.0.0.121:22-10.0.0.1:57588.service - OpenSSH per-connection server daemon (10.0.0.1:57588). 
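The "Startup finished" entry above breaks the 12.163s boot into kernel, initrd, and userspace phases. systemd rounds each phase to milliseconds independently, so the printed parts can differ from the printed total by a millisecond; a small check of the reported figures:

```python
# Phase timings from the "Startup finished" entry above.
# systemd rounds each component separately, so allow ~1-2 ms of slack.
kernel, initrd, userspace = 2.852, 5.268, 4.042  # seconds, from the log
reported_total = 12.163

computed = kernel + initrd + userspace
assert abs(computed - reported_total) <= 0.002, computed
print(f"phases sum to {computed:.3f}s vs reported {reported_total}s")
```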
May 13 12:57:49.586452 sshd[1712]: Accepted publickey for core from 10.0.0.1 port 57588 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:57:49.588333 sshd-session[1712]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:57:49.594488 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 13 12:57:49.595588 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 13 12:57:49.601820 systemd-logind[1578]: New session 1 of user core. May 13 12:57:49.633110 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 13 12:57:49.636043 systemd[1]: Starting user@500.service - User Manager for UID 500... May 13 12:57:49.664398 (systemd)[1716]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 13 12:57:49.666965 systemd-logind[1578]: New session c1 of user core. May 13 12:57:49.814194 systemd[1716]: Queued start job for default target default.target. May 13 12:57:49.836573 systemd[1716]: Created slice app.slice - User Application Slice. May 13 12:57:49.836600 systemd[1716]: Reached target paths.target - Paths. May 13 12:57:49.836643 systemd[1716]: Reached target timers.target - Timers. May 13 12:57:49.838183 systemd[1716]: Starting dbus.socket - D-Bus User Message Bus Socket... May 13 12:57:49.848473 systemd[1716]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 13 12:57:49.848595 systemd[1716]: Reached target sockets.target - Sockets. May 13 12:57:49.848640 systemd[1716]: Reached target basic.target - Basic System. May 13 12:57:49.848677 systemd[1716]: Reached target default.target - Main User Target. May 13 12:57:49.848707 systemd[1716]: Startup finished in 174ms. May 13 12:57:49.849164 systemd[1]: Started user@500.service - User Manager for UID 500. May 13 12:57:49.851086 systemd[1]: Started session-1.scope - Session 1 of User core. 
May 13 12:57:49.920141 systemd[1]: Started sshd@1-10.0.0.121:22-10.0.0.1:57598.service - OpenSSH per-connection server daemon (10.0.0.1:57598). May 13 12:57:49.960876 sshd[1727]: Accepted publickey for core from 10.0.0.1 port 57598 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:57:49.962350 sshd-session[1727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:57:49.966462 systemd-logind[1578]: New session 2 of user core. May 13 12:57:49.982422 systemd[1]: Started session-2.scope - Session 2 of User core. May 13 12:57:50.035213 sshd[1729]: Connection closed by 10.0.0.1 port 57598 May 13 12:57:50.035542 sshd-session[1727]: pam_unix(sshd:session): session closed for user core May 13 12:57:50.047894 systemd[1]: sshd@1-10.0.0.121:22-10.0.0.1:57598.service: Deactivated successfully. May 13 12:57:50.049790 systemd[1]: session-2.scope: Deactivated successfully. May 13 12:57:50.050509 systemd-logind[1578]: Session 2 logged out. Waiting for processes to exit. May 13 12:57:50.053471 systemd[1]: Started sshd@2-10.0.0.121:22-10.0.0.1:57610.service - OpenSSH per-connection server daemon (10.0.0.1:57610). May 13 12:57:50.054041 systemd-logind[1578]: Removed session 2. May 13 12:57:50.112174 sshd[1735]: Accepted publickey for core from 10.0.0.1 port 57610 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:57:50.113505 sshd-session[1735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:57:50.118417 systemd-logind[1578]: New session 3 of user core. May 13 12:57:50.136476 systemd[1]: Started session-3.scope - Session 3 of User core. May 13 12:57:50.185018 sshd[1737]: Connection closed by 10.0.0.1 port 57610 May 13 12:57:50.185412 sshd-session[1735]: pam_unix(sshd:session): session closed for user core May 13 12:57:50.198038 systemd[1]: sshd@2-10.0.0.121:22-10.0.0.1:57610.service: Deactivated successfully. 
May 13 12:57:50.200161 systemd[1]: session-3.scope: Deactivated successfully. May 13 12:57:50.201009 systemd-logind[1578]: Session 3 logged out. Waiting for processes to exit. May 13 12:57:50.204766 systemd[1]: Started sshd@3-10.0.0.121:22-10.0.0.1:57620.service - OpenSSH per-connection server daemon (10.0.0.1:57620). May 13 12:57:50.205637 systemd-logind[1578]: Removed session 3. May 13 12:57:50.256671 sshd[1743]: Accepted publickey for core from 10.0.0.1 port 57620 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:57:50.258191 sshd-session[1743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:57:50.262512 systemd-logind[1578]: New session 4 of user core. May 13 12:57:50.271374 systemd[1]: Started session-4.scope - Session 4 of User core. May 13 12:57:50.323992 sshd[1745]: Connection closed by 10.0.0.1 port 57620 May 13 12:57:50.324330 sshd-session[1743]: pam_unix(sshd:session): session closed for user core May 13 12:57:50.335646 systemd[1]: sshd@3-10.0.0.121:22-10.0.0.1:57620.service: Deactivated successfully. May 13 12:57:50.337097 systemd[1]: session-4.scope: Deactivated successfully. May 13 12:57:50.337831 systemd-logind[1578]: Session 4 logged out. Waiting for processes to exit. May 13 12:57:50.340399 systemd[1]: Started sshd@4-10.0.0.121:22-10.0.0.1:57634.service - OpenSSH per-connection server daemon (10.0.0.1:57634). May 13 12:57:50.340927 systemd-logind[1578]: Removed session 4. May 13 12:57:50.393699 sshd[1751]: Accepted publickey for core from 10.0.0.1 port 57634 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:57:50.394974 sshd-session[1751]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:57:50.398967 systemd-logind[1578]: New session 5 of user core. May 13 12:57:50.409379 systemd[1]: Started session-5.scope - Session 5 of User core. 
May 13 12:57:50.465788 sudo[1755]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 13 12:57:50.466101 sudo[1755]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 12:57:50.486423 sudo[1755]: pam_unix(sudo:session): session closed for user root May 13 12:57:50.487999 sshd[1754]: Connection closed by 10.0.0.1 port 57634 May 13 12:57:50.488410 sshd-session[1751]: pam_unix(sshd:session): session closed for user core May 13 12:57:50.508939 systemd[1]: sshd@4-10.0.0.121:22-10.0.0.1:57634.service: Deactivated successfully. May 13 12:57:50.510733 systemd[1]: session-5.scope: Deactivated successfully. May 13 12:57:50.511429 systemd-logind[1578]: Session 5 logged out. Waiting for processes to exit. May 13 12:57:50.514463 systemd[1]: Started sshd@5-10.0.0.121:22-10.0.0.1:57646.service - OpenSSH per-connection server daemon (10.0.0.1:57646). May 13 12:57:50.515005 systemd-logind[1578]: Removed session 5. May 13 12:57:50.574928 sshd[1761]: Accepted publickey for core from 10.0.0.1 port 57646 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:57:50.576238 sshd-session[1761]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:57:50.580436 systemd-logind[1578]: New session 6 of user core. May 13 12:57:50.590408 systemd[1]: Started session-6.scope - Session 6 of User core. 
May 13 12:57:50.644150 sudo[1765]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 13 12:57:50.644469 sudo[1765]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 12:57:50.984451 sudo[1765]: pam_unix(sudo:session): session closed for user root May 13 12:57:50.990704 sudo[1764]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 13 12:57:50.991012 sudo[1764]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 12:57:51.000779 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 12:57:51.046920 augenrules[1787]: No rules May 13 12:57:51.048790 systemd[1]: audit-rules.service: Deactivated successfully. May 13 12:57:51.049079 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 12:57:51.050235 sudo[1764]: pam_unix(sudo:session): session closed for user root May 13 12:57:51.051747 sshd[1763]: Connection closed by 10.0.0.1 port 57646 May 13 12:57:51.052013 sshd-session[1761]: pam_unix(sshd:session): session closed for user core May 13 12:57:51.059886 systemd[1]: sshd@5-10.0.0.121:22-10.0.0.1:57646.service: Deactivated successfully. May 13 12:57:51.061708 systemd[1]: session-6.scope: Deactivated successfully. May 13 12:57:51.062421 systemd-logind[1578]: Session 6 logged out. Waiting for processes to exit. May 13 12:57:51.065180 systemd[1]: Started sshd@6-10.0.0.121:22-10.0.0.1:57648.service - OpenSSH per-connection server daemon (10.0.0.1:57648). May 13 12:57:51.065754 systemd-logind[1578]: Removed session 6. May 13 12:57:51.111356 sshd[1796]: Accepted publickey for core from 10.0.0.1 port 57648 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:57:51.112955 sshd-session[1796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:57:51.117703 systemd-logind[1578]: New session 7 of user core. 
May 13 12:57:51.127440 systemd[1]: Started session-7.scope - Session 7 of User core. May 13 12:57:51.180119 sudo[1799]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 13 12:57:51.180499 sudo[1799]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 12:57:51.667089 systemd[1]: Starting docker.service - Docker Application Container Engine... May 13 12:57:51.688587 (dockerd)[1820]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 13 12:57:52.126980 dockerd[1820]: time="2025-05-13T12:57:52.126847358Z" level=info msg="Starting up" May 13 12:57:52.128855 dockerd[1820]: time="2025-05-13T12:57:52.128828153Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 13 12:57:52.242799 dockerd[1820]: time="2025-05-13T12:57:52.242745009Z" level=info msg="Loading containers: start." May 13 12:57:52.252284 kernel: Initializing XFRM netlink socket May 13 12:57:52.478609 systemd-networkd[1494]: docker0: Link UP May 13 12:57:52.484242 dockerd[1820]: time="2025-05-13T12:57:52.484207884Z" level=info msg="Loading containers: done." May 13 12:57:52.498487 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3906639699-merged.mount: Deactivated successfully. 
May 13 12:57:52.499970 dockerd[1820]: time="2025-05-13T12:57:52.499925686Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 13 12:57:52.500070 dockerd[1820]: time="2025-05-13T12:57:52.500039820Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 13 12:57:52.500186 dockerd[1820]: time="2025-05-13T12:57:52.500161919Z" level=info msg="Initializing buildkit" May 13 12:57:52.529898 dockerd[1820]: time="2025-05-13T12:57:52.529837673Z" level=info msg="Completed buildkit initialization" May 13 12:57:52.535467 dockerd[1820]: time="2025-05-13T12:57:52.535434150Z" level=info msg="Daemon has completed initialization" May 13 12:57:52.535548 dockerd[1820]: time="2025-05-13T12:57:52.535486017Z" level=info msg="API listen on /run/docker.sock" May 13 12:57:52.535655 systemd[1]: Started docker.service - Docker Application Container Engine. May 13 12:57:53.547302 containerd[1594]: time="2025-05-13T12:57:53.547243215Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\"" May 13 12:57:54.145558 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1945101266.mount: Deactivated successfully. 
May 13 12:57:55.001649 containerd[1594]: time="2025-05-13T12:57:55.001594372Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:57:55.002826 containerd[1594]: time="2025-05-13T12:57:55.002772051Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.8: active requests=0, bytes read=27960987" May 13 12:57:55.003860 containerd[1594]: time="2025-05-13T12:57:55.003828081Z" level=info msg="ImageCreate event name:\"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:57:55.006099 containerd[1594]: time="2025-05-13T12:57:55.006074074Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:57:55.007100 containerd[1594]: time="2025-05-13T12:57:55.007074401Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.8\" with image id \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\", size \"27957787\" in 1.459773478s" May 13 12:57:55.007153 containerd[1594]: time="2025-05-13T12:57:55.007103074Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\" returns image reference \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\"" May 13 12:57:55.008495 containerd[1594]: time="2025-05-13T12:57:55.008464819Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\"" May 13 12:57:55.887959 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
May 13 12:57:55.889968 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:57:56.062976 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:57:56.066534 (kubelet)[2094]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 12:57:56.101644 kubelet[2094]: E0513 12:57:56.101600 2094 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 12:57:56.108179 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 12:57:56.108549 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 12:57:56.109135 systemd[1]: kubelet.service: Consumed 185ms CPU time, 94M memory peak. 
May 13 12:57:56.419025 containerd[1594]: time="2025-05-13T12:57:56.418967669Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:57:56.419902 containerd[1594]: time="2025-05-13T12:57:56.419839083Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.8: active requests=0, bytes read=24713776" May 13 12:57:56.421069 containerd[1594]: time="2025-05-13T12:57:56.421019277Z" level=info msg="ImageCreate event name:\"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:57:56.423461 containerd[1594]: time="2025-05-13T12:57:56.423432103Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:57:56.424185 containerd[1594]: time="2025-05-13T12:57:56.424134600Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.8\" with image id \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\", size \"26202149\" in 1.415634185s" May 13 12:57:56.424185 containerd[1594]: time="2025-05-13T12:57:56.424175738Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\" returns image reference \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\"" May 13 12:57:56.424768 containerd[1594]: time="2025-05-13T12:57:56.424714498Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\"" May 13 12:57:58.037911 containerd[1594]: time="2025-05-13T12:57:58.037845150Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-scheduler:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:57:58.038849 containerd[1594]: time="2025-05-13T12:57:58.038788079Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.8: active requests=0, bytes read=18780386" May 13 12:57:58.040079 containerd[1594]: time="2025-05-13T12:57:58.040032202Z" level=info msg="ImageCreate event name:\"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:57:58.042502 containerd[1594]: time="2025-05-13T12:57:58.042456430Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:57:58.043236 containerd[1594]: time="2025-05-13T12:57:58.043202038Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.8\" with image id \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\", size \"20268777\" in 1.618447285s" May 13 12:57:58.043236 containerd[1594]: time="2025-05-13T12:57:58.043233217Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\" returns image reference \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\"" May 13 12:57:58.043937 containerd[1594]: time="2025-05-13T12:57:58.043765365Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\"" May 13 12:57:58.971478 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount507633994.mount: Deactivated successfully. 
May 13 12:57:59.228878 containerd[1594]: time="2025-05-13T12:57:59.228757331Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:57:59.229579 containerd[1594]: time="2025-05-13T12:57:59.229538356Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.8: active requests=0, bytes read=30354625" May 13 12:57:59.230766 containerd[1594]: time="2025-05-13T12:57:59.230714472Z" level=info msg="ImageCreate event name:\"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:57:59.232465 containerd[1594]: time="2025-05-13T12:57:59.232426263Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:57:59.232956 containerd[1594]: time="2025-05-13T12:57:59.232916813Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.8\" with image id \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\", repo tag \"registry.k8s.io/kube-proxy:v1.31.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\", size \"30353644\" in 1.18912042s" May 13 12:57:59.232956 containerd[1594]: time="2025-05-13T12:57:59.232951418Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\" returns image reference \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\"" May 13 12:57:59.233384 containerd[1594]: time="2025-05-13T12:57:59.233358912Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" May 13 12:57:59.766777 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1326153539.mount: Deactivated successfully. 
May 13 12:58:01.135173 containerd[1594]: time="2025-05-13T12:58:01.135117678Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:01.136005 containerd[1594]: time="2025-05-13T12:58:01.135985766Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761" May 13 12:58:01.137239 containerd[1594]: time="2025-05-13T12:58:01.137210353Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:01.139553 containerd[1594]: time="2025-05-13T12:58:01.139533170Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:01.140444 containerd[1594]: time="2025-05-13T12:58:01.140412349Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.907028651s" May 13 12:58:01.140444 containerd[1594]: time="2025-05-13T12:58:01.140440592Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" May 13 12:58:01.141096 containerd[1594]: time="2025-05-13T12:58:01.141010461Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 13 12:58:01.560901 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3324121445.mount: Deactivated successfully. 
May 13 12:58:01.567014 containerd[1594]: time="2025-05-13T12:58:01.566971763Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 12:58:01.567759 containerd[1594]: time="2025-05-13T12:58:01.567720437Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" May 13 12:58:01.568976 containerd[1594]: time="2025-05-13T12:58:01.568934084Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 12:58:01.570809 containerd[1594]: time="2025-05-13T12:58:01.570777652Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 12:58:01.571380 containerd[1594]: time="2025-05-13T12:58:01.571349925Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 430.315488ms" May 13 12:58:01.571380 containerd[1594]: time="2025-05-13T12:58:01.571377196Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 13 12:58:01.571848 containerd[1594]: time="2025-05-13T12:58:01.571825367Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" May 13 12:58:02.067103 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1502189308.mount: 
Deactivated successfully. May 13 12:58:03.853529 containerd[1594]: time="2025-05-13T12:58:03.853456615Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:03.854332 containerd[1594]: time="2025-05-13T12:58:03.854283726Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780013" May 13 12:58:03.855572 containerd[1594]: time="2025-05-13T12:58:03.855517030Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:03.858093 containerd[1594]: time="2025-05-13T12:58:03.858055451Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:03.859072 containerd[1594]: time="2025-05-13T12:58:03.859038836Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.287188261s" May 13 12:58:03.859138 containerd[1594]: time="2025-05-13T12:58:03.859073030Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" May 13 12:58:06.359031 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 13 12:58:06.360981 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:58:06.689895 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 13 12:58:06.689990 systemd[1]: kubelet.service: Failed with result 'signal'. 
May 13 12:58:06.690326 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:58:06.690517 systemd[1]: kubelet.service: Consumed 116ms CPU time, 81.9M memory peak. May 13 12:58:06.693768 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:58:06.721466 systemd[1]: Reload requested from client PID 2248 ('systemctl') (unit session-7.scope)... May 13 12:58:06.721480 systemd[1]: Reloading... May 13 12:58:06.805281 zram_generator::config[2291]: No configuration found. May 13 12:58:07.114373 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 12:58:07.229069 systemd[1]: Reloading finished in 507 ms. May 13 12:58:07.300935 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 13 12:58:07.301055 systemd[1]: kubelet.service: Failed with result 'signal'. May 13 12:58:07.301467 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:58:07.301520 systemd[1]: kubelet.service: Consumed 129ms CPU time, 83.5M memory peak. May 13 12:58:07.303350 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:58:07.455809 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:58:07.471598 (kubelet)[2339]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 13 12:58:07.506742 kubelet[2339]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 12:58:07.506742 kubelet[2339]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. 
Image garbage collector will get sandbox image information from CRI. May 13 12:58:07.506742 kubelet[2339]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 12:58:07.507768 kubelet[2339]: I0513 12:58:07.507716 2339 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 12:58:07.876089 kubelet[2339]: I0513 12:58:07.875975 2339 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 13 12:58:07.876089 kubelet[2339]: I0513 12:58:07.876019 2339 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 12:58:07.876348 kubelet[2339]: I0513 12:58:07.876274 2339 server.go:929] "Client rotation is on, will bootstrap in background" May 13 12:58:07.907281 kubelet[2339]: E0513 12:58:07.907189 2339 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.121:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.121:6443: connect: connection refused" logger="UnhandledError" May 13 12:58:07.907424 kubelet[2339]: I0513 12:58:07.907396 2339 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 12:58:07.915011 kubelet[2339]: I0513 12:58:07.914984 2339 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 13 12:58:07.921117 kubelet[2339]: I0513 12:58:07.921067 2339 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 13 12:58:07.922177 kubelet[2339]: I0513 12:58:07.922138 2339 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 13 12:58:07.922371 kubelet[2339]: I0513 12:58:07.922325 2339 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 12:58:07.922560 kubelet[2339]: I0513 12:58:07.922359 2339 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyO
ptions":null,"CgroupVersion":2} May 13 12:58:07.922560 kubelet[2339]: I0513 12:58:07.922558 2339 topology_manager.go:138] "Creating topology manager with none policy" May 13 12:58:07.922673 kubelet[2339]: I0513 12:58:07.922571 2339 container_manager_linux.go:300] "Creating device plugin manager" May 13 12:58:07.922712 kubelet[2339]: I0513 12:58:07.922699 2339 state_mem.go:36] "Initialized new in-memory state store" May 13 12:58:07.924044 kubelet[2339]: I0513 12:58:07.924013 2339 kubelet.go:408] "Attempting to sync node with API server" May 13 12:58:07.924044 kubelet[2339]: I0513 12:58:07.924035 2339 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 12:58:07.924134 kubelet[2339]: I0513 12:58:07.924074 2339 kubelet.go:314] "Adding apiserver pod source" May 13 12:58:07.924134 kubelet[2339]: I0513 12:58:07.924094 2339 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 12:58:07.927446 kubelet[2339]: W0513 12:58:07.927233 2339 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.121:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.121:6443: connect: connection refused May 13 12:58:07.927446 kubelet[2339]: E0513 12:58:07.927330 2339 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.121:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.121:6443: connect: connection refused" logger="UnhandledError" May 13 12:58:07.928239 kubelet[2339]: I0513 12:58:07.928221 2339 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 13 12:58:07.929364 kubelet[2339]: W0513 12:58:07.929337 2339 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://10.0.0.121:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.121:6443: connect: connection refused May 13 12:58:07.929479 kubelet[2339]: E0513 12:58:07.929457 2339 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.121:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.121:6443: connect: connection refused" logger="UnhandledError" May 13 12:58:07.929879 kubelet[2339]: I0513 12:58:07.929857 2339 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 12:58:07.930371 kubelet[2339]: W0513 12:58:07.930355 2339 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 13 12:58:07.932155 kubelet[2339]: I0513 12:58:07.931955 2339 server.go:1269] "Started kubelet" May 13 12:58:07.933007 kubelet[2339]: I0513 12:58:07.932930 2339 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 13 12:58:07.935473 kubelet[2339]: I0513 12:58:07.935449 2339 server.go:460] "Adding debug handlers to kubelet server" May 13 12:58:07.937671 kubelet[2339]: I0513 12:58:07.937612 2339 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 12:58:07.937937 kubelet[2339]: I0513 12:58:07.937880 2339 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 12:58:07.937971 kubelet[2339]: I0513 12:58:07.937947 2339 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 12:58:07.938634 kubelet[2339]: I0513 12:58:07.938464 2339 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 13 12:58:07.939029 kubelet[2339]: 
I0513 12:58:07.938826 2339 volume_manager.go:289] "Starting Kubelet Volume Manager" May 13 12:58:07.939029 kubelet[2339]: I0513 12:58:07.938991 2339 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 13 12:58:07.939094 kubelet[2339]: I0513 12:58:07.939063 2339 reconciler.go:26] "Reconciler: start to sync state" May 13 12:58:07.939598 kubelet[2339]: W0513 12:58:07.939557 2339 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.121:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.121:6443: connect: connection refused May 13 12:58:07.939645 kubelet[2339]: E0513 12:58:07.939608 2339 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.121:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.121:6443: connect: connection refused" logger="UnhandledError" May 13 12:58:07.940031 kubelet[2339]: E0513 12:58:07.937778 2339 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.121:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.121:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.183f17900eac19bb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-13 12:58:07.931931067 +0000 UTC m=+0.456801530,LastTimestamp:2025-05-13 12:58:07.931931067 +0000 UTC m=+0.456801530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" May 13 12:58:07.940537 kubelet[2339]: E0513 12:58:07.940469 2339 
kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:07.940624 kubelet[2339]: E0513 12:58:07.940594 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.121:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.121:6443: connect: connection refused" interval="200ms" May 13 12:58:07.941526 kubelet[2339]: E0513 12:58:07.941499 2339 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 12:58:07.941915 kubelet[2339]: I0513 12:58:07.941881 2339 factory.go:221] Registration of the containerd container factory successfully May 13 12:58:07.941915 kubelet[2339]: I0513 12:58:07.941902 2339 factory.go:221] Registration of the systemd container factory successfully May 13 12:58:07.942012 kubelet[2339]: I0513 12:58:07.941993 2339 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 12:58:07.954107 kubelet[2339]: I0513 12:58:07.954047 2339 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 12:58:07.954436 kubelet[2339]: I0513 12:58:07.954415 2339 cpu_manager.go:214] "Starting CPU manager" policy="none" May 13 12:58:07.954436 kubelet[2339]: I0513 12:58:07.954432 2339 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 13 12:58:07.954495 kubelet[2339]: I0513 12:58:07.954445 2339 state_mem.go:36] "Initialized new in-memory state store" May 13 12:58:07.955959 kubelet[2339]: I0513 12:58:07.955940 2339 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 13 12:58:07.956022 kubelet[2339]: I0513 12:58:07.955970 2339 status_manager.go:217] "Starting to sync pod status with apiserver" May 13 12:58:07.956022 kubelet[2339]: I0513 12:58:07.955991 2339 kubelet.go:2321] "Starting kubelet main sync loop" May 13 12:58:07.956060 kubelet[2339]: E0513 12:58:07.956037 2339 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 12:58:08.041418 kubelet[2339]: E0513 12:58:08.041364 2339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:08.056565 kubelet[2339]: E0513 12:58:08.056527 2339 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 13 12:58:08.141184 kubelet[2339]: E0513 12:58:08.141092 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.121:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.121:6443: connect: connection refused" interval="400ms" May 13 12:58:08.142170 kubelet[2339]: E0513 12:58:08.142133 2339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:08.242710 kubelet[2339]: E0513 12:58:08.242655 2339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:08.256834 kubelet[2339]: E0513 12:58:08.256812 2339 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 13 12:58:08.343344 kubelet[2339]: E0513 12:58:08.343292 2339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:08.351484 kubelet[2339]: W0513 12:58:08.351407 2339 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.RuntimeClass: Get "https://10.0.0.121:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.121:6443: connect: connection refused May 13 12:58:08.351556 kubelet[2339]: E0513 12:58:08.351488 2339 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.121:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.121:6443: connect: connection refused" logger="UnhandledError" May 13 12:58:08.351954 kubelet[2339]: I0513 12:58:08.351926 2339 policy_none.go:49] "None policy: Start" May 13 12:58:08.352708 kubelet[2339]: I0513 12:58:08.352664 2339 memory_manager.go:170] "Starting memorymanager" policy="None" May 13 12:58:08.352708 kubelet[2339]: I0513 12:58:08.352691 2339 state_mem.go:35] "Initializing new in-memory state store" May 13 12:58:08.363660 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 13 12:58:08.380429 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 13 12:58:08.383309 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
May 13 12:58:08.391065 kubelet[2339]: I0513 12:58:08.391030 2339 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 12:58:08.391318 kubelet[2339]: I0513 12:58:08.391225 2339 eviction_manager.go:189] "Eviction manager: starting control loop" May 13 12:58:08.391318 kubelet[2339]: I0513 12:58:08.391238 2339 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 12:58:08.391491 kubelet[2339]: I0513 12:58:08.391474 2339 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 12:58:08.392381 kubelet[2339]: E0513 12:58:08.392364 2339 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" May 13 12:58:08.492462 kubelet[2339]: I0513 12:58:08.492406 2339 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 12:58:08.492825 kubelet[2339]: E0513 12:58:08.492788 2339 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.121:6443/api/v1/nodes\": dial tcp 10.0.0.121:6443: connect: connection refused" node="localhost" May 13 12:58:08.542449 kubelet[2339]: E0513 12:58:08.542397 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.121:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.121:6443: connect: connection refused" interval="800ms" May 13 12:58:08.664128 systemd[1]: Created slice kubepods-burstable-podd7772bd17f2f480dc1e281037e106512.slice - libcontainer container kubepods-burstable-podd7772bd17f2f480dc1e281037e106512.slice. May 13 12:58:08.689331 systemd[1]: Created slice kubepods-burstable-podd4a6b755cb4739fbca401212ebb82b6d.slice - libcontainer container kubepods-burstable-podd4a6b755cb4739fbca401212ebb82b6d.slice. 
May 13 12:58:08.693967 kubelet[2339]: I0513 12:58:08.693948 2339 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 12:58:08.694265 kubelet[2339]: E0513 12:58:08.694235 2339 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.121:6443/api/v1/nodes\": dial tcp 10.0.0.121:6443: connect: connection refused" node="localhost" May 13 12:58:08.716153 systemd[1]: Created slice kubepods-burstable-pod0613557c150e4f35d1f3f822b5f32ff1.slice - libcontainer container kubepods-burstable-pod0613557c150e4f35d1f3f822b5f32ff1.slice. May 13 12:58:08.743303 kubelet[2339]: I0513 12:58:08.743248 2339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:58:08.743303 kubelet[2339]: I0513 12:58:08.743297 2339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:58:08.743433 kubelet[2339]: I0513 12:58:08.743313 2339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:58:08.743433 kubelet[2339]: I0513 12:58:08.743327 2339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" 
(UniqueName: \"kubernetes.io/host-path/0613557c150e4f35d1f3f822b5f32ff1-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"0613557c150e4f35d1f3f822b5f32ff1\") " pod="kube-system/kube-scheduler-localhost" May 13 12:58:08.743433 kubelet[2339]: I0513 12:58:08.743345 2339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d7772bd17f2f480dc1e281037e106512-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"d7772bd17f2f480dc1e281037e106512\") " pod="kube-system/kube-apiserver-localhost" May 13 12:58:08.743433 kubelet[2339]: I0513 12:58:08.743362 2339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d7772bd17f2f480dc1e281037e106512-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"d7772bd17f2f480dc1e281037e106512\") " pod="kube-system/kube-apiserver-localhost" May 13 12:58:08.743433 kubelet[2339]: I0513 12:58:08.743389 2339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:58:08.743573 kubelet[2339]: I0513 12:58:08.743409 2339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:58:08.743573 kubelet[2339]: I0513 12:58:08.743427 2339 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/d7772bd17f2f480dc1e281037e106512-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"d7772bd17f2f480dc1e281037e106512\") " pod="kube-system/kube-apiserver-localhost" May 13 12:58:08.835819 kubelet[2339]: W0513 12:58:08.835763 2339 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.121:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.121:6443: connect: connection refused May 13 12:58:08.835819 kubelet[2339]: E0513 12:58:08.835817 2339 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.121:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.121:6443: connect: connection refused" logger="UnhandledError" May 13 12:58:08.988539 containerd[1594]: time="2025-05-13T12:58:08.988433634Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:d7772bd17f2f480dc1e281037e106512,Namespace:kube-system,Attempt:0,}" May 13 12:58:09.015235 containerd[1594]: time="2025-05-13T12:58:09.015184762Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:d4a6b755cb4739fbca401212ebb82b6d,Namespace:kube-system,Attempt:0,}" May 13 12:58:09.018663 containerd[1594]: time="2025-05-13T12:58:09.018627700Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:0613557c150e4f35d1f3f822b5f32ff1,Namespace:kube-system,Attempt:0,}" May 13 12:58:09.096268 kubelet[2339]: I0513 12:58:09.096221 2339 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 12:58:09.096614 kubelet[2339]: E0513 12:58:09.096583 2339 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.121:6443/api/v1/nodes\": dial tcp 10.0.0.121:6443: connect: 
connection refused" node="localhost" May 13 12:58:09.254763 kubelet[2339]: E0513 12:58:09.254623 2339 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.121:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.121:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.183f17900eac19bb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-13 12:58:07.931931067 +0000 UTC m=+0.456801530,LastTimestamp:2025-05-13 12:58:07.931931067 +0000 UTC m=+0.456801530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" May 13 12:58:09.343177 kubelet[2339]: E0513 12:58:09.343134 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.121:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.121:6443: connect: connection refused" interval="1.6s" May 13 12:58:09.398849 kubelet[2339]: W0513 12:58:09.398791 2339 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.121:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.121:6443: connect: connection refused May 13 12:58:09.398906 kubelet[2339]: E0513 12:58:09.398850 2339 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.121:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.121:6443: connect: connection refused" logger="UnhandledError" May 13 12:58:09.493556 kubelet[2339]: W0513 
12:58:09.493504 2339 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.121:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.121:6443: connect: connection refused May 13 12:58:09.493556 kubelet[2339]: E0513 12:58:09.493540 2339 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.121:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.121:6443: connect: connection refused" logger="UnhandledError" May 13 12:58:09.544140 containerd[1594]: time="2025-05-13T12:58:09.542943216Z" level=info msg="connecting to shim 42726452510a4a7ea9817d363d472ff5b64b46e7aa6ec308df32196bf17cdf90" address="unix:///run/containerd/s/b5e3dffc66dbfa958300c1ca276b5b178f3fd21d2b8441bd437b2e8529027945" namespace=k8s.io protocol=ttrpc version=3 May 13 12:58:09.545307 containerd[1594]: time="2025-05-13T12:58:09.545277304Z" level=info msg="connecting to shim 874f0137899fe5e77ed930e1bb00e70777b35c428dbea9611f93fcae63d7460e" address="unix:///run/containerd/s/3d8d5ce3d3c6ecc5a917000477085fb1c35120dcc060acb391f814ce0104ad07" namespace=k8s.io protocol=ttrpc version=3 May 13 12:58:09.551595 containerd[1594]: time="2025-05-13T12:58:09.551549989Z" level=info msg="connecting to shim bd2c507c72c68ffcde3b26d7d704cc485031807ba9b1c383941bcfc1237bfc5c" address="unix:///run/containerd/s/43b295862b25c231700972e75a40c24acad6834a20cfa93477c9556d7e65e86f" namespace=k8s.io protocol=ttrpc version=3 May 13 12:58:09.567651 systemd[1]: Started cri-containerd-874f0137899fe5e77ed930e1bb00e70777b35c428dbea9611f93fcae63d7460e.scope - libcontainer container 874f0137899fe5e77ed930e1bb00e70777b35c428dbea9611f93fcae63d7460e. 
May 13 12:58:09.571765 systemd[1]: Started cri-containerd-42726452510a4a7ea9817d363d472ff5b64b46e7aa6ec308df32196bf17cdf90.scope - libcontainer container 42726452510a4a7ea9817d363d472ff5b64b46e7aa6ec308df32196bf17cdf90. May 13 12:58:09.575294 systemd[1]: Started cri-containerd-bd2c507c72c68ffcde3b26d7d704cc485031807ba9b1c383941bcfc1237bfc5c.scope - libcontainer container bd2c507c72c68ffcde3b26d7d704cc485031807ba9b1c383941bcfc1237bfc5c. May 13 12:58:09.615941 containerd[1594]: time="2025-05-13T12:58:09.615809873Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:d4a6b755cb4739fbca401212ebb82b6d,Namespace:kube-system,Attempt:0,} returns sandbox id \"42726452510a4a7ea9817d363d472ff5b64b46e7aa6ec308df32196bf17cdf90\"" May 13 12:58:09.619240 containerd[1594]: time="2025-05-13T12:58:09.619206113Z" level=info msg="CreateContainer within sandbox \"42726452510a4a7ea9817d363d472ff5b64b46e7aa6ec308df32196bf17cdf90\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 13 12:58:09.619797 containerd[1594]: time="2025-05-13T12:58:09.619773197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:d7772bd17f2f480dc1e281037e106512,Namespace:kube-system,Attempt:0,} returns sandbox id \"874f0137899fe5e77ed930e1bb00e70777b35c428dbea9611f93fcae63d7460e\"" May 13 12:58:09.622042 containerd[1594]: time="2025-05-13T12:58:09.622015883Z" level=info msg="CreateContainer within sandbox \"874f0137899fe5e77ed930e1bb00e70777b35c428dbea9611f93fcae63d7460e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 13 12:58:09.631504 containerd[1594]: time="2025-05-13T12:58:09.631426775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:0613557c150e4f35d1f3f822b5f32ff1,Namespace:kube-system,Attempt:0,} returns sandbox id \"bd2c507c72c68ffcde3b26d7d704cc485031807ba9b1c383941bcfc1237bfc5c\"" May 13 12:58:09.633372 containerd[1594]: 
time="2025-05-13T12:58:09.633337830Z" level=info msg="CreateContainer within sandbox \"bd2c507c72c68ffcde3b26d7d704cc485031807ba9b1c383941bcfc1237bfc5c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 13 12:58:09.635142 containerd[1594]: time="2025-05-13T12:58:09.635113681Z" level=info msg="Container b9dea1e6e1fe929bab3f87c5a53a8ba6de4c28202e8fbc3a7027c618f817527b: CDI devices from CRI Config.CDIDevices: []" May 13 12:58:09.636893 containerd[1594]: time="2025-05-13T12:58:09.636858965Z" level=info msg="Container ecba91b8b01fb80da065f9601ed49b76803bd4ad1972bf8f340ca0d086b2cb68: CDI devices from CRI Config.CDIDevices: []" May 13 12:58:09.643420 containerd[1594]: time="2025-05-13T12:58:09.643385988Z" level=info msg="CreateContainer within sandbox \"42726452510a4a7ea9817d363d472ff5b64b46e7aa6ec308df32196bf17cdf90\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b9dea1e6e1fe929bab3f87c5a53a8ba6de4c28202e8fbc3a7027c618f817527b\"" May 13 12:58:09.643888 containerd[1594]: time="2025-05-13T12:58:09.643860458Z" level=info msg="StartContainer for \"b9dea1e6e1fe929bab3f87c5a53a8ba6de4c28202e8fbc3a7027c618f817527b\"" May 13 12:58:09.644796 containerd[1594]: time="2025-05-13T12:58:09.644777468Z" level=info msg="Container 34c7c602cb5183efe3588a3712ae09a9b9d67bdd6370dfe89c8a123ab856a6f0: CDI devices from CRI Config.CDIDevices: []" May 13 12:58:09.645090 containerd[1594]: time="2025-05-13T12:58:09.645058966Z" level=info msg="connecting to shim b9dea1e6e1fe929bab3f87c5a53a8ba6de4c28202e8fbc3a7027c618f817527b" address="unix:///run/containerd/s/b5e3dffc66dbfa958300c1ca276b5b178f3fd21d2b8441bd437b2e8529027945" protocol=ttrpc version=3 May 13 12:58:09.651914 containerd[1594]: time="2025-05-13T12:58:09.651878257Z" level=info msg="CreateContainer within sandbox \"874f0137899fe5e77ed930e1bb00e70777b35c428dbea9611f93fcae63d7460e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id 
\"ecba91b8b01fb80da065f9601ed49b76803bd4ad1972bf8f340ca0d086b2cb68\"" May 13 12:58:09.652505 containerd[1594]: time="2025-05-13T12:58:09.652477901Z" level=info msg="StartContainer for \"ecba91b8b01fb80da065f9601ed49b76803bd4ad1972bf8f340ca0d086b2cb68\"" May 13 12:58:09.653502 containerd[1594]: time="2025-05-13T12:58:09.653463821Z" level=info msg="connecting to shim ecba91b8b01fb80da065f9601ed49b76803bd4ad1972bf8f340ca0d086b2cb68" address="unix:///run/containerd/s/3d8d5ce3d3c6ecc5a917000477085fb1c35120dcc060acb391f814ce0104ad07" protocol=ttrpc version=3 May 13 12:58:09.655009 containerd[1594]: time="2025-05-13T12:58:09.654984954Z" level=info msg="CreateContainer within sandbox \"bd2c507c72c68ffcde3b26d7d704cc485031807ba9b1c383941bcfc1237bfc5c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"34c7c602cb5183efe3588a3712ae09a9b9d67bdd6370dfe89c8a123ab856a6f0\"" May 13 12:58:09.655326 containerd[1594]: time="2025-05-13T12:58:09.655298993Z" level=info msg="StartContainer for \"34c7c602cb5183efe3588a3712ae09a9b9d67bdd6370dfe89c8a123ab856a6f0\"" May 13 12:58:09.656292 containerd[1594]: time="2025-05-13T12:58:09.656052316Z" level=info msg="connecting to shim 34c7c602cb5183efe3588a3712ae09a9b9d67bdd6370dfe89c8a123ab856a6f0" address="unix:///run/containerd/s/43b295862b25c231700972e75a40c24acad6834a20cfa93477c9556d7e65e86f" protocol=ttrpc version=3 May 13 12:58:09.661625 kubelet[2339]: W0513 12:58:09.661476 2339 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.121:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.121:6443: connect: connection refused May 13 12:58:09.662013 kubelet[2339]: E0513 12:58:09.661980 2339 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://10.0.0.121:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.121:6443: connect: connection refused" logger="UnhandledError" May 13 12:58:09.667395 systemd[1]: Started cri-containerd-b9dea1e6e1fe929bab3f87c5a53a8ba6de4c28202e8fbc3a7027c618f817527b.scope - libcontainer container b9dea1e6e1fe929bab3f87c5a53a8ba6de4c28202e8fbc3a7027c618f817527b. May 13 12:58:09.672447 systemd[1]: Started cri-containerd-ecba91b8b01fb80da065f9601ed49b76803bd4ad1972bf8f340ca0d086b2cb68.scope - libcontainer container ecba91b8b01fb80da065f9601ed49b76803bd4ad1972bf8f340ca0d086b2cb68. May 13 12:58:09.676560 systemd[1]: Started cri-containerd-34c7c602cb5183efe3588a3712ae09a9b9d67bdd6370dfe89c8a123ab856a6f0.scope - libcontainer container 34c7c602cb5183efe3588a3712ae09a9b9d67bdd6370dfe89c8a123ab856a6f0. May 13 12:58:09.716851 containerd[1594]: time="2025-05-13T12:58:09.716491052Z" level=info msg="StartContainer for \"b9dea1e6e1fe929bab3f87c5a53a8ba6de4c28202e8fbc3a7027c618f817527b\" returns successfully" May 13 12:58:09.722471 containerd[1594]: time="2025-05-13T12:58:09.722414482Z" level=info msg="StartContainer for \"ecba91b8b01fb80da065f9601ed49b76803bd4ad1972bf8f340ca0d086b2cb68\" returns successfully" May 13 12:58:09.730120 containerd[1594]: time="2025-05-13T12:58:09.730080101Z" level=info msg="StartContainer for \"34c7c602cb5183efe3588a3712ae09a9b9d67bdd6370dfe89c8a123ab856a6f0\" returns successfully" May 13 12:58:09.898131 kubelet[2339]: I0513 12:58:09.897957 2339 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 12:58:10.946393 kubelet[2339]: E0513 12:58:10.946345 2339 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" May 13 12:58:11.103287 kubelet[2339]: I0513 12:58:11.101849 2339 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 13 12:58:11.103287 kubelet[2339]: E0513 12:58:11.101887 2339 
kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" May 13 12:58:11.115287 kubelet[2339]: E0513 12:58:11.114287 2339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:11.214832 kubelet[2339]: E0513 12:58:11.214686 2339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:11.315686 kubelet[2339]: E0513 12:58:11.315642 2339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:11.416792 kubelet[2339]: E0513 12:58:11.416731 2339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:11.517822 kubelet[2339]: E0513 12:58:11.517690 2339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:11.618361 kubelet[2339]: E0513 12:58:11.618317 2339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:11.718953 kubelet[2339]: E0513 12:58:11.718910 2339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:11.819525 kubelet[2339]: E0513 12:58:11.819422 2339 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:11.928936 kubelet[2339]: I0513 12:58:11.928885 2339 apiserver.go:52] "Watching apiserver" May 13 12:58:11.939805 kubelet[2339]: I0513 12:58:11.939767 2339 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 13 12:58:12.897090 systemd[1]: Reload requested from client PID 2616 ('systemctl') (unit session-7.scope)... May 13 12:58:12.897107 systemd[1]: Reloading... 
May 13 12:58:12.984315 zram_generator::config[2662]: No configuration found. May 13 12:58:13.078014 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 12:58:13.211912 systemd[1]: Reloading finished in 314 ms. May 13 12:58:13.246615 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:58:13.273599 systemd[1]: kubelet.service: Deactivated successfully. May 13 12:58:13.273942 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:58:13.273996 systemd[1]: kubelet.service: Consumed 852ms CPU time, 118.1M memory peak. May 13 12:58:13.275870 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 12:58:13.460515 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 12:58:13.465489 (kubelet)[2704]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 13 12:58:13.509484 kubelet[2704]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 12:58:13.509484 kubelet[2704]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 13 12:58:13.509484 kubelet[2704]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 13 12:58:13.509841 kubelet[2704]: I0513 12:58:13.509515 2704 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 12:58:13.515982 kubelet[2704]: I0513 12:58:13.515942 2704 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 13 12:58:13.515982 kubelet[2704]: I0513 12:58:13.515972 2704 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 12:58:13.516276 kubelet[2704]: I0513 12:58:13.516225 2704 server.go:929] "Client rotation is on, will bootstrap in background" May 13 12:58:13.517591 kubelet[2704]: I0513 12:58:13.517561 2704 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 13 12:58:13.519393 kubelet[2704]: I0513 12:58:13.519347 2704 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 12:58:13.523067 kubelet[2704]: I0513 12:58:13.523036 2704 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 13 12:58:13.528181 kubelet[2704]: I0513 12:58:13.528148 2704 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 13 12:58:13.528347 kubelet[2704]: I0513 12:58:13.528322 2704 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 13 12:58:13.528702 kubelet[2704]: I0513 12:58:13.528672 2704 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 12:58:13.528878 kubelet[2704]: I0513 12:58:13.528698 2704 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyO
ptions":null,"CgroupVersion":2} May 13 12:58:13.528878 kubelet[2704]: I0513 12:58:13.528877 2704 topology_manager.go:138] "Creating topology manager with none policy" May 13 12:58:13.528982 kubelet[2704]: I0513 12:58:13.528888 2704 container_manager_linux.go:300] "Creating device plugin manager" May 13 12:58:13.528982 kubelet[2704]: I0513 12:58:13.528918 2704 state_mem.go:36] "Initialized new in-memory state store" May 13 12:58:13.529030 kubelet[2704]: I0513 12:58:13.529025 2704 kubelet.go:408] "Attempting to sync node with API server" May 13 12:58:13.529054 kubelet[2704]: I0513 12:58:13.529037 2704 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 12:58:13.529079 kubelet[2704]: I0513 12:58:13.529068 2704 kubelet.go:314] "Adding apiserver pod source" May 13 12:58:13.529103 kubelet[2704]: I0513 12:58:13.529083 2704 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 12:58:13.530053 kubelet[2704]: I0513 12:58:13.529641 2704 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 13 12:58:13.530053 kubelet[2704]: I0513 12:58:13.529992 2704 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 12:58:13.530615 kubelet[2704]: I0513 12:58:13.530512 2704 server.go:1269] "Started kubelet" May 13 12:58:13.531929 kubelet[2704]: I0513 12:58:13.531874 2704 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 12:58:13.532737 kubelet[2704]: I0513 12:58:13.532150 2704 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 12:58:13.532737 kubelet[2704]: I0513 12:58:13.532202 2704 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 13 12:58:13.532737 kubelet[2704]: I0513 12:58:13.532301 2704 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 
12:58:13.533089 kubelet[2704]: I0513 12:58:13.533065 2704 server.go:460] "Adding debug handlers to kubelet server" May 13 12:58:13.536777 kubelet[2704]: I0513 12:58:13.536232 2704 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 13 12:58:13.538290 kubelet[2704]: E0513 12:58:13.537400 2704 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 13 12:58:13.538290 kubelet[2704]: I0513 12:58:13.537446 2704 volume_manager.go:289] "Starting Kubelet Volume Manager" May 13 12:58:13.538290 kubelet[2704]: I0513 12:58:13.537571 2704 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 13 12:58:13.538290 kubelet[2704]: I0513 12:58:13.537677 2704 reconciler.go:26] "Reconciler: start to sync state" May 13 12:58:13.541288 kubelet[2704]: E0513 12:58:13.540429 2704 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 12:58:13.541537 kubelet[2704]: I0513 12:58:13.541493 2704 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 12:58:13.544501 kubelet[2704]: I0513 12:58:13.543591 2704 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 13 12:58:13.544501 kubelet[2704]: I0513 12:58:13.543623 2704 status_manager.go:217] "Starting to sync pod status with apiserver" May 13 12:58:13.544501 kubelet[2704]: I0513 12:58:13.543821 2704 kubelet.go:2321] "Starting kubelet main sync loop" May 13 12:58:13.544501 kubelet[2704]: E0513 12:58:13.543857 2704 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 12:58:13.548831 kubelet[2704]: I0513 12:58:13.545099 2704 factory.go:221] Registration of the containerd container factory successfully May 13 12:58:13.548831 kubelet[2704]: I0513 12:58:13.545119 2704 factory.go:221] Registration of the systemd container factory successfully May 13 12:58:13.548831 kubelet[2704]: I0513 12:58:13.545198 2704 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 12:58:13.600064 kubelet[2704]: I0513 12:58:13.600026 2704 cpu_manager.go:214] "Starting CPU manager" policy="none" May 13 12:58:13.600064 kubelet[2704]: I0513 12:58:13.600048 2704 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 13 12:58:13.600064 kubelet[2704]: I0513 12:58:13.600065 2704 state_mem.go:36] "Initialized new in-memory state store" May 13 12:58:13.600226 kubelet[2704]: I0513 12:58:13.600206 2704 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 13 12:58:13.600272 kubelet[2704]: I0513 12:58:13.600216 2704 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 13 12:58:13.600272 kubelet[2704]: I0513 12:58:13.600235 2704 policy_none.go:49] "None policy: Start" May 13 12:58:13.600896 kubelet[2704]: I0513 12:58:13.600873 2704 memory_manager.go:170] "Starting memorymanager" policy="None" May 13 12:58:13.600946 kubelet[2704]: I0513 12:58:13.600900 2704 state_mem.go:35] "Initializing new in-memory state store" May 
13 12:58:13.601065 kubelet[2704]: I0513 12:58:13.601048 2704 state_mem.go:75] "Updated machine memory state" May 13 12:58:13.605569 kubelet[2704]: I0513 12:58:13.605447 2704 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 12:58:13.605663 kubelet[2704]: I0513 12:58:13.605637 2704 eviction_manager.go:189] "Eviction manager: starting control loop" May 13 12:58:13.605690 kubelet[2704]: I0513 12:58:13.605652 2704 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 12:58:13.606037 kubelet[2704]: I0513 12:58:13.605823 2704 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 12:58:13.649580 kubelet[2704]: E0513 12:58:13.649518 2704 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" May 13 12:58:13.708030 kubelet[2704]: I0513 12:58:13.707987 2704 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 13 12:58:13.715067 kubelet[2704]: I0513 12:58:13.715035 2704 kubelet_node_status.go:111] "Node was previously registered" node="localhost" May 13 12:58:13.715215 kubelet[2704]: I0513 12:58:13.715115 2704 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 13 12:58:13.838518 kubelet[2704]: I0513 12:58:13.838473 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0613557c150e4f35d1f3f822b5f32ff1-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"0613557c150e4f35d1f3f822b5f32ff1\") " pod="kube-system/kube-scheduler-localhost" May 13 12:58:13.838518 kubelet[2704]: I0513 12:58:13.838506 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d7772bd17f2f480dc1e281037e106512-ca-certs\") 
pod \"kube-apiserver-localhost\" (UID: \"d7772bd17f2f480dc1e281037e106512\") " pod="kube-system/kube-apiserver-localhost" May 13 12:58:13.838518 kubelet[2704]: I0513 12:58:13.838528 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d7772bd17f2f480dc1e281037e106512-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"d7772bd17f2f480dc1e281037e106512\") " pod="kube-system/kube-apiserver-localhost" May 13 12:58:13.838518 kubelet[2704]: I0513 12:58:13.838544 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:58:13.838811 kubelet[2704]: I0513 12:58:13.838558 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:58:13.838811 kubelet[2704]: I0513 12:58:13.838573 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:58:13.838811 kubelet[2704]: I0513 12:58:13.838589 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:58:13.838811 kubelet[2704]: I0513 12:58:13.838602 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d7772bd17f2f480dc1e281037e106512-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"d7772bd17f2f480dc1e281037e106512\") " pod="kube-system/kube-apiserver-localhost" May 13 12:58:13.838811 kubelet[2704]: I0513 12:58:13.838639 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 13 12:58:14.529867 kubelet[2704]: I0513 12:58:14.529830 2704 apiserver.go:52] "Watching apiserver" May 13 12:58:14.538224 kubelet[2704]: I0513 12:58:14.538185 2704 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 13 12:58:14.588046 kubelet[2704]: E0513 12:58:14.588003 2704 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" May 13 12:58:14.588541 kubelet[2704]: E0513 12:58:14.588292 2704 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" May 13 12:58:14.588541 kubelet[2704]: E0513 12:58:14.588504 2704 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" May 13 12:58:14.596394 kubelet[2704]: I0513 12:58:14.596335 2704 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.596323873 podStartE2EDuration="1.596323873s" podCreationTimestamp="2025-05-13 12:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:58:14.596202496 +0000 UTC m=+1.126830493" watchObservedRunningTime="2025-05-13 12:58:14.596323873 +0000 UTC m=+1.126951870" May 13 12:58:14.611189 kubelet[2704]: I0513 12:58:14.611103 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=2.611086895 podStartE2EDuration="2.611086895s" podCreationTimestamp="2025-05-13 12:58:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:58:14.603778526 +0000 UTC m=+1.134406523" watchObservedRunningTime="2025-05-13 12:58:14.611086895 +0000 UTC m=+1.141714892" May 13 12:58:14.618402 kubelet[2704]: I0513 12:58:14.618328 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.618305825 podStartE2EDuration="1.618305825s" podCreationTimestamp="2025-05-13 12:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:58:14.611430719 +0000 UTC m=+1.142058716" watchObservedRunningTime="2025-05-13 12:58:14.618305825 +0000 UTC m=+1.148933822" May 13 12:58:18.326330 sudo[1799]: pam_unix(sudo:session): session closed for user root May 13 12:58:18.540423 sshd[1798]: Connection closed by 10.0.0.1 port 57648 May 13 12:58:18.581823 sshd-session[1796]: pam_unix(sshd:session): session closed for user core May 13 12:58:18.586180 systemd[1]: sshd@6-10.0.0.121:22-10.0.0.1:57648.service: Deactivated successfully. 
May 13 12:58:18.588086 systemd[1]: session-7.scope: Deactivated successfully. May 13 12:58:18.588305 systemd[1]: session-7.scope: Consumed 4.674s CPU time, 222.5M memory peak. May 13 12:58:18.589536 systemd-logind[1578]: Session 7 logged out. Waiting for processes to exit. May 13 12:58:18.590937 systemd-logind[1578]: Removed session 7. May 13 12:58:18.845225 kubelet[2704]: I0513 12:58:18.845112 2704 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 13 12:58:18.845621 containerd[1594]: time="2025-05-13T12:58:18.845428674Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 13 12:58:18.846044 kubelet[2704]: I0513 12:58:18.845995 2704 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 13 12:58:19.997511 systemd[1]: Created slice kubepods-besteffort-podad4347b4_974e_4b11_a2ed_169f4e65f2cf.slice - libcontainer container kubepods-besteffort-podad4347b4_974e_4b11_a2ed_169f4e65f2cf.slice. 
May 13 12:58:20.083401 kubelet[2704]: I0513 12:58:20.083369 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ad4347b4-974e-4b11-a2ed-169f4e65f2cf-kube-proxy\") pod \"kube-proxy-bgml8\" (UID: \"ad4347b4-974e-4b11-a2ed-169f4e65f2cf\") " pod="kube-system/kube-proxy-bgml8" May 13 12:58:20.083401 kubelet[2704]: I0513 12:58:20.083401 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ad4347b4-974e-4b11-a2ed-169f4e65f2cf-xtables-lock\") pod \"kube-proxy-bgml8\" (UID: \"ad4347b4-974e-4b11-a2ed-169f4e65f2cf\") " pod="kube-system/kube-proxy-bgml8" May 13 12:58:20.083785 kubelet[2704]: I0513 12:58:20.083418 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ad4347b4-974e-4b11-a2ed-169f4e65f2cf-lib-modules\") pod \"kube-proxy-bgml8\" (UID: \"ad4347b4-974e-4b11-a2ed-169f4e65f2cf\") " pod="kube-system/kube-proxy-bgml8" May 13 12:58:20.083785 kubelet[2704]: I0513 12:58:20.083433 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmnfb\" (UniqueName: \"kubernetes.io/projected/ad4347b4-974e-4b11-a2ed-169f4e65f2cf-kube-api-access-dmnfb\") pod \"kube-proxy-bgml8\" (UID: \"ad4347b4-974e-4b11-a2ed-169f4e65f2cf\") " pod="kube-system/kube-proxy-bgml8" May 13 12:58:20.423534 systemd[1]: Created slice kubepods-besteffort-podf23d9f84_aad4_4bfc_92e0_e61ab4d4d47c.slice - libcontainer container kubepods-besteffort-podf23d9f84_aad4_4bfc_92e0_e61ab4d4d47c.slice. 
May 13 12:58:20.486151 kubelet[2704]: I0513 12:58:20.486120 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f23d9f84-aad4-4bfc-92e0-e61ab4d4d47c-var-lib-calico\") pod \"tigera-operator-6f6897fdc5-b7m56\" (UID: \"f23d9f84-aad4-4bfc-92e0-e61ab4d4d47c\") " pod="tigera-operator/tigera-operator-6f6897fdc5-b7m56" May 13 12:58:20.486269 kubelet[2704]: I0513 12:58:20.486154 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfqx4\" (UniqueName: \"kubernetes.io/projected/f23d9f84-aad4-4bfc-92e0-e61ab4d4d47c-kube-api-access-rfqx4\") pod \"tigera-operator-6f6897fdc5-b7m56\" (UID: \"f23d9f84-aad4-4bfc-92e0-e61ab4d4d47c\") " pod="tigera-operator/tigera-operator-6f6897fdc5-b7m56" May 13 12:58:20.663086 containerd[1594]: time="2025-05-13T12:58:20.663041851Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bgml8,Uid:ad4347b4-974e-4b11-a2ed-169f4e65f2cf,Namespace:kube-system,Attempt:0,}" May 13 12:58:20.726430 containerd[1594]: time="2025-05-13T12:58:20.726307132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-b7m56,Uid:f23d9f84-aad4-4bfc-92e0-e61ab4d4d47c,Namespace:tigera-operator,Attempt:0,}" May 13 12:58:21.709382 containerd[1594]: time="2025-05-13T12:58:21.709333325Z" level=info msg="connecting to shim 445e34892fc6363dae92baac07a97f618f9c4bf73da53e7c417e207e7393b226" address="unix:///run/containerd/s/a2c6b8c643a27ea91b56f32f11bfe997b7d1ab718fa4fe25eac76390abaf9663" namespace=k8s.io protocol=ttrpc version=3 May 13 12:58:21.730405 systemd[1]: Started cri-containerd-445e34892fc6363dae92baac07a97f618f9c4bf73da53e7c417e207e7393b226.scope - libcontainer container 445e34892fc6363dae92baac07a97f618f9c4bf73da53e7c417e207e7393b226. 
May 13 12:58:21.744158 containerd[1594]: time="2025-05-13T12:58:21.744118302Z" level=info msg="connecting to shim f8e193c528c7f08ea840dd68dea0ad61595b7f812ce5a251c134afa1fc8417db" address="unix:///run/containerd/s/d72718642fbe5f0908a0bdbaaf8f3f7f58ada9111f3c49707615c292f233460f" namespace=k8s.io protocol=ttrpc version=3 May 13 12:58:21.763145 containerd[1594]: time="2025-05-13T12:58:21.763083616Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bgml8,Uid:ad4347b4-974e-4b11-a2ed-169f4e65f2cf,Namespace:kube-system,Attempt:0,} returns sandbox id \"445e34892fc6363dae92baac07a97f618f9c4bf73da53e7c417e207e7393b226\"" May 13 12:58:21.770028 containerd[1594]: time="2025-05-13T12:58:21.769978526Z" level=info msg="CreateContainer within sandbox \"445e34892fc6363dae92baac07a97f618f9c4bf73da53e7c417e207e7393b226\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 13 12:58:21.775439 systemd[1]: Started cri-containerd-f8e193c528c7f08ea840dd68dea0ad61595b7f812ce5a251c134afa1fc8417db.scope - libcontainer container f8e193c528c7f08ea840dd68dea0ad61595b7f812ce5a251c134afa1fc8417db. 
May 13 12:58:21.781989 containerd[1594]: time="2025-05-13T12:58:21.781946695Z" level=info msg="Container 7b3feccda5de6a46a9e3fe090eed5e7378ad0cbf2f06d6392cd563ba7fb2bcdf: CDI devices from CRI Config.CDIDevices: []" May 13 12:58:21.791656 containerd[1594]: time="2025-05-13T12:58:21.791626385Z" level=info msg="CreateContainer within sandbox \"445e34892fc6363dae92baac07a97f618f9c4bf73da53e7c417e207e7393b226\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"7b3feccda5de6a46a9e3fe090eed5e7378ad0cbf2f06d6392cd563ba7fb2bcdf\"" May 13 12:58:21.792480 containerd[1594]: time="2025-05-13T12:58:21.792417992Z" level=info msg="StartContainer for \"7b3feccda5de6a46a9e3fe090eed5e7378ad0cbf2f06d6392cd563ba7fb2bcdf\"" May 13 12:58:21.794307 containerd[1594]: time="2025-05-13T12:58:21.794243863Z" level=info msg="connecting to shim 7b3feccda5de6a46a9e3fe090eed5e7378ad0cbf2f06d6392cd563ba7fb2bcdf" address="unix:///run/containerd/s/a2c6b8c643a27ea91b56f32f11bfe997b7d1ab718fa4fe25eac76390abaf9663" protocol=ttrpc version=3 May 13 12:58:21.819531 systemd[1]: Started cri-containerd-7b3feccda5de6a46a9e3fe090eed5e7378ad0cbf2f06d6392cd563ba7fb2bcdf.scope - libcontainer container 7b3feccda5de6a46a9e3fe090eed5e7378ad0cbf2f06d6392cd563ba7fb2bcdf. 
May 13 12:58:21.824412 containerd[1594]: time="2025-05-13T12:58:21.824363996Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-b7m56,Uid:f23d9f84-aad4-4bfc-92e0-e61ab4d4d47c,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"f8e193c528c7f08ea840dd68dea0ad61595b7f812ce5a251c134afa1fc8417db\"" May 13 12:58:21.828118 containerd[1594]: time="2025-05-13T12:58:21.828068680Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 13 12:58:21.863389 containerd[1594]: time="2025-05-13T12:58:21.863351802Z" level=info msg="StartContainer for \"7b3feccda5de6a46a9e3fe090eed5e7378ad0cbf2f06d6392cd563ba7fb2bcdf\" returns successfully" May 13 12:58:22.641605 kubelet[2704]: I0513 12:58:22.641553 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-bgml8" podStartSLOduration=3.6415374209999998 podStartE2EDuration="3.641537421s" podCreationTimestamp="2025-05-13 12:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:58:22.607671905 +0000 UTC m=+9.138299902" watchObservedRunningTime="2025-05-13 12:58:22.641537421 +0000 UTC m=+9.172165418" May 13 12:58:22.705927 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1021367022.mount: Deactivated successfully. May 13 12:58:24.225211 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1320672875.mount: Deactivated successfully. 
May 13 12:58:24.921497 containerd[1594]: time="2025-05-13T12:58:24.921397960Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:25.012980 containerd[1594]: time="2025-05-13T12:58:25.012896994Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 13 12:58:25.078892 containerd[1594]: time="2025-05-13T12:58:25.078817386Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:25.186048 containerd[1594]: time="2025-05-13T12:58:25.185915732Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:25.186622 containerd[1594]: time="2025-05-13T12:58:25.186586301Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 3.358378496s" May 13 12:58:25.186622 containerd[1594]: time="2025-05-13T12:58:25.186613283Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 13 12:58:25.188436 containerd[1594]: time="2025-05-13T12:58:25.188383941Z" level=info msg="CreateContainer within sandbox \"f8e193c528c7f08ea840dd68dea0ad61595b7f812ce5a251c134afa1fc8417db\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 13 12:58:25.525015 containerd[1594]: time="2025-05-13T12:58:25.524833182Z" level=info msg="Container 
a7547d5f77fad38f4585b165a015232bdaa48b9f7467ecfc4af7612f99dfc749: CDI devices from CRI Config.CDIDevices: []" May 13 12:58:25.592947 containerd[1594]: time="2025-05-13T12:58:25.592892534Z" level=info msg="CreateContainer within sandbox \"f8e193c528c7f08ea840dd68dea0ad61595b7f812ce5a251c134afa1fc8417db\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a7547d5f77fad38f4585b165a015232bdaa48b9f7467ecfc4af7612f99dfc749\"" May 13 12:58:25.593400 containerd[1594]: time="2025-05-13T12:58:25.593365246Z" level=info msg="StartContainer for \"a7547d5f77fad38f4585b165a015232bdaa48b9f7467ecfc4af7612f99dfc749\"" May 13 12:58:25.594362 containerd[1594]: time="2025-05-13T12:58:25.594302164Z" level=info msg="connecting to shim a7547d5f77fad38f4585b165a015232bdaa48b9f7467ecfc4af7612f99dfc749" address="unix:///run/containerd/s/d72718642fbe5f0908a0bdbaaf8f3f7f58ada9111f3c49707615c292f233460f" protocol=ttrpc version=3 May 13 12:58:25.644408 systemd[1]: Started cri-containerd-a7547d5f77fad38f4585b165a015232bdaa48b9f7467ecfc4af7612f99dfc749.scope - libcontainer container a7547d5f77fad38f4585b165a015232bdaa48b9f7467ecfc4af7612f99dfc749. 
May 13 12:58:25.730185 containerd[1594]: time="2025-05-13T12:58:25.730132769Z" level=info msg="StartContainer for \"a7547d5f77fad38f4585b165a015232bdaa48b9f7467ecfc4af7612f99dfc749\" returns successfully" May 13 12:58:26.614918 kubelet[2704]: I0513 12:58:26.614828 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6f6897fdc5-b7m56" podStartSLOduration=4.254365005 podStartE2EDuration="7.614771843s" podCreationTimestamp="2025-05-13 12:58:19 +0000 UTC" firstStartedPulling="2025-05-13 12:58:21.826830636 +0000 UTC m=+8.357458633" lastFinishedPulling="2025-05-13 12:58:25.187237474 +0000 UTC m=+11.717865471" observedRunningTime="2025-05-13 12:58:26.614652575 +0000 UTC m=+13.145280573" watchObservedRunningTime="2025-05-13 12:58:26.614771843 +0000 UTC m=+13.145399840" May 13 12:58:29.079783 update_engine[1580]: I20250513 12:58:29.079712 1580 update_attempter.cc:509] Updating boot flags... May 13 12:58:29.518877 systemd[1]: Created slice kubepods-besteffort-pod4ea0ff14_d689_405f_b2d4_0c4987ad0e9a.slice - libcontainer container kubepods-besteffort-pod4ea0ff14_d689_405f_b2d4_0c4987ad0e9a.slice. May 13 12:58:29.562066 systemd[1]: Created slice kubepods-besteffort-pod6f5a9517_847c_4101_8b62_b95d8d1852c2.slice - libcontainer container kubepods-besteffort-pod6f5a9517_847c_4101_8b62_b95d8d1852c2.slice. 
May 13 12:58:29.644148 kubelet[2704]: I0513 12:58:29.644112 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4ea0ff14-d689-405f-b2d4-0c4987ad0e9a-typha-certs\") pod \"calico-typha-6cc66f7cbd-4plvb\" (UID: \"4ea0ff14-d689-405f-b2d4-0c4987ad0e9a\") " pod="calico-system/calico-typha-6cc66f7cbd-4plvb" May 13 12:58:29.644148 kubelet[2704]: I0513 12:58:29.644149 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ea0ff14-d689-405f-b2d4-0c4987ad0e9a-tigera-ca-bundle\") pod \"calico-typha-6cc66f7cbd-4plvb\" (UID: \"4ea0ff14-d689-405f-b2d4-0c4987ad0e9a\") " pod="calico-system/calico-typha-6cc66f7cbd-4plvb" May 13 12:58:29.644673 kubelet[2704]: I0513 12:58:29.644165 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x2nc\" (UniqueName: \"kubernetes.io/projected/4ea0ff14-d689-405f-b2d4-0c4987ad0e9a-kube-api-access-2x2nc\") pod \"calico-typha-6cc66f7cbd-4plvb\" (UID: \"4ea0ff14-d689-405f-b2d4-0c4987ad0e9a\") " pod="calico-system/calico-typha-6cc66f7cbd-4plvb" May 13 12:58:29.745230 kubelet[2704]: I0513 12:58:29.745177 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-policysync\") pod \"calico-node-krrmx\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " pod="calico-system/calico-node-krrmx" May 13 12:58:29.745230 kubelet[2704]: I0513 12:58:29.745220 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-cni-bin-dir\") pod \"calico-node-krrmx\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " 
pod="calico-system/calico-node-krrmx" May 13 12:58:29.745419 kubelet[2704]: I0513 12:58:29.745243 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsjq8\" (UniqueName: \"kubernetes.io/projected/6f5a9517-847c-4101-8b62-b95d8d1852c2-kube-api-access-dsjq8\") pod \"calico-node-krrmx\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " pod="calico-system/calico-node-krrmx" May 13 12:58:29.745419 kubelet[2704]: I0513 12:58:29.745285 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-lib-modules\") pod \"calico-node-krrmx\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " pod="calico-system/calico-node-krrmx" May 13 12:58:29.745419 kubelet[2704]: I0513 12:58:29.745304 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-cni-net-dir\") pod \"calico-node-krrmx\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " pod="calico-system/calico-node-krrmx" May 13 12:58:29.745419 kubelet[2704]: I0513 12:58:29.745363 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-flexvol-driver-host\") pod \"calico-node-krrmx\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " pod="calico-system/calico-node-krrmx" May 13 12:58:29.745419 kubelet[2704]: I0513 12:58:29.745401 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-cni-log-dir\") pod \"calico-node-krrmx\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " pod="calico-system/calico-node-krrmx" May 13 
12:58:29.745527 kubelet[2704]: I0513 12:58:29.745424 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-xtables-lock\") pod \"calico-node-krrmx\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " pod="calico-system/calico-node-krrmx" May 13 12:58:29.745527 kubelet[2704]: I0513 12:58:29.745443 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f5a9517-847c-4101-8b62-b95d8d1852c2-tigera-ca-bundle\") pod \"calico-node-krrmx\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " pod="calico-system/calico-node-krrmx" May 13 12:58:29.745527 kubelet[2704]: I0513 12:58:29.745474 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6f5a9517-847c-4101-8b62-b95d8d1852c2-node-certs\") pod \"calico-node-krrmx\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " pod="calico-system/calico-node-krrmx" May 13 12:58:29.745527 kubelet[2704]: I0513 12:58:29.745494 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-var-run-calico\") pod \"calico-node-krrmx\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " pod="calico-system/calico-node-krrmx" May 13 12:58:29.745527 kubelet[2704]: I0513 12:58:29.745515 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-var-lib-calico\") pod \"calico-node-krrmx\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " pod="calico-system/calico-node-krrmx" May 13 12:58:29.847359 kubelet[2704]: E0513 12:58:29.846869 2704 
driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.847359 kubelet[2704]: W0513 12:58:29.846894 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.847359 kubelet[2704]: E0513 12:58:29.846919 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:29.847359 kubelet[2704]: E0513 12:58:29.847081 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.847359 kubelet[2704]: W0513 12:58:29.847090 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.847359 kubelet[2704]: E0513 12:58:29.847100 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:29.847359 kubelet[2704]: E0513 12:58:29.847247 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.847359 kubelet[2704]: W0513 12:58:29.847274 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.847359 kubelet[2704]: E0513 12:58:29.847285 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:29.848595 kubelet[2704]: E0513 12:58:29.848476 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.848595 kubelet[2704]: W0513 12:58:29.848490 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.848595 kubelet[2704]: E0513 12:58:29.848535 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:29.848835 kubelet[2704]: E0513 12:58:29.848814 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.848835 kubelet[2704]: W0513 12:58:29.848829 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.848967 kubelet[2704]: E0513 12:58:29.848881 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:29.849119 kubelet[2704]: E0513 12:58:29.849103 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.849119 kubelet[2704]: W0513 12:58:29.849114 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.849353 kubelet[2704]: E0513 12:58:29.849202 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:29.849353 kubelet[2704]: E0513 12:58:29.849339 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.849353 kubelet[2704]: W0513 12:58:29.849346 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.849460 kubelet[2704]: E0513 12:58:29.849425 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:29.849585 kubelet[2704]: E0513 12:58:29.849572 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.849585 kubelet[2704]: W0513 12:58:29.849583 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.849657 kubelet[2704]: E0513 12:58:29.849646 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:29.850365 kubelet[2704]: E0513 12:58:29.850353 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.850365 kubelet[2704]: W0513 12:58:29.850362 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.850469 kubelet[2704]: E0513 12:58:29.850442 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:29.850523 kubelet[2704]: E0513 12:58:29.850505 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.850523 kubelet[2704]: W0513 12:58:29.850517 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.850591 kubelet[2704]: E0513 12:58:29.850561 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:29.850646 kubelet[2704]: E0513 12:58:29.850632 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.850646 kubelet[2704]: W0513 12:58:29.850640 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.850730 kubelet[2704]: E0513 12:58:29.850712 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:29.850767 kubelet[2704]: E0513 12:58:29.850752 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.850767 kubelet[2704]: W0513 12:58:29.850757 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.850829 kubelet[2704]: E0513 12:58:29.850804 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:29.850896 kubelet[2704]: E0513 12:58:29.850882 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.850896 kubelet[2704]: W0513 12:58:29.850890 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.850964 kubelet[2704]: E0513 12:58:29.850901 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:29.851088 kubelet[2704]: E0513 12:58:29.851074 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.851088 kubelet[2704]: W0513 12:58:29.851086 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.851153 kubelet[2704]: E0513 12:58:29.851102 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:29.851394 kubelet[2704]: E0513 12:58:29.851285 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.851394 kubelet[2704]: W0513 12:58:29.851296 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.851394 kubelet[2704]: E0513 12:58:29.851309 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:29.851545 kubelet[2704]: E0513 12:58:29.851530 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.851545 kubelet[2704]: W0513 12:58:29.851540 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.851624 kubelet[2704]: E0513 12:58:29.851552 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:29.851906 kubelet[2704]: E0513 12:58:29.851869 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.851906 kubelet[2704]: W0513 12:58:29.851886 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.851906 kubelet[2704]: E0513 12:58:29.851897 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:29.852139 kubelet[2704]: E0513 12:58:29.852117 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.852139 kubelet[2704]: W0513 12:58:29.852126 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.852139 kubelet[2704]: E0513 12:58:29.852137 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:29.854622 kubelet[2704]: E0513 12:58:29.854603 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.854622 kubelet[2704]: W0513 12:58:29.854614 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.854622 kubelet[2704]: E0513 12:58:29.854623 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:29.947338 kubelet[2704]: E0513 12:58:29.947281 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.947338 kubelet[2704]: W0513 12:58:29.947303 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.947338 kubelet[2704]: E0513 12:58:29.947332 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:29.972600 kubelet[2704]: E0513 12:58:29.972566 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:29.972600 kubelet[2704]: W0513 12:58:29.972590 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:29.972748 kubelet[2704]: E0513 12:58:29.972616 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:29.995032 kubelet[2704]: E0513 12:58:29.994937 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:58:30.051565 kubelet[2704]: E0513 12:58:30.051530 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.051565 kubelet[2704]: W0513 12:58:30.051554 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.051565 kubelet[2704]: E0513 12:58:30.051576 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.051784 kubelet[2704]: E0513 12:58:30.051768 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.051784 kubelet[2704]: W0513 12:58:30.051782 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.051837 kubelet[2704]: E0513 12:58:30.051792 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.051993 kubelet[2704]: E0513 12:58:30.051969 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.051993 kubelet[2704]: W0513 12:58:30.051983 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.051993 kubelet[2704]: E0513 12:58:30.051993 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.052158 kubelet[2704]: E0513 12:58:30.052143 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.052158 kubelet[2704]: W0513 12:58:30.052155 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.052209 kubelet[2704]: E0513 12:58:30.052164 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.052385 kubelet[2704]: E0513 12:58:30.052370 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.052385 kubelet[2704]: W0513 12:58:30.052382 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.052435 kubelet[2704]: E0513 12:58:30.052392 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.052558 kubelet[2704]: E0513 12:58:30.052542 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.052558 kubelet[2704]: W0513 12:58:30.052554 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.052603 kubelet[2704]: E0513 12:58:30.052565 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.052733 kubelet[2704]: E0513 12:58:30.052718 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.052733 kubelet[2704]: W0513 12:58:30.052729 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.052781 kubelet[2704]: E0513 12:58:30.052739 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.052899 kubelet[2704]: E0513 12:58:30.052884 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.052899 kubelet[2704]: W0513 12:58:30.052896 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.052951 kubelet[2704]: E0513 12:58:30.052906 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.053076 kubelet[2704]: E0513 12:58:30.053060 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.053076 kubelet[2704]: W0513 12:58:30.053072 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.053122 kubelet[2704]: E0513 12:58:30.053082 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.053243 kubelet[2704]: E0513 12:58:30.053228 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.053243 kubelet[2704]: W0513 12:58:30.053240 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.053311 kubelet[2704]: E0513 12:58:30.053270 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.053447 kubelet[2704]: E0513 12:58:30.053430 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.053447 kubelet[2704]: W0513 12:58:30.053443 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.053493 kubelet[2704]: E0513 12:58:30.053453 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.053624 kubelet[2704]: E0513 12:58:30.053608 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.053624 kubelet[2704]: W0513 12:58:30.053620 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.053673 kubelet[2704]: E0513 12:58:30.053629 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.053798 kubelet[2704]: E0513 12:58:30.053783 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.053798 kubelet[2704]: W0513 12:58:30.053795 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.053845 kubelet[2704]: E0513 12:58:30.053806 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.053968 kubelet[2704]: E0513 12:58:30.053952 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.053968 kubelet[2704]: W0513 12:58:30.053964 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.054018 kubelet[2704]: E0513 12:58:30.053973 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.054134 kubelet[2704]: E0513 12:58:30.054118 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.054134 kubelet[2704]: W0513 12:58:30.054130 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.054178 kubelet[2704]: E0513 12:58:30.054139 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.054315 kubelet[2704]: E0513 12:58:30.054298 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.054315 kubelet[2704]: W0513 12:58:30.054311 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.054380 kubelet[2704]: E0513 12:58:30.054329 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.054503 kubelet[2704]: E0513 12:58:30.054488 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.054503 kubelet[2704]: W0513 12:58:30.054500 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.054545 kubelet[2704]: E0513 12:58:30.054509 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.054667 kubelet[2704]: E0513 12:58:30.054651 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.054667 kubelet[2704]: W0513 12:58:30.054664 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.054718 kubelet[2704]: E0513 12:58:30.054673 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.054832 kubelet[2704]: E0513 12:58:30.054816 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.054832 kubelet[2704]: W0513 12:58:30.054829 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.054875 kubelet[2704]: E0513 12:58:30.054838 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.055001 kubelet[2704]: E0513 12:58:30.054985 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.055001 kubelet[2704]: W0513 12:58:30.054997 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.055048 kubelet[2704]: E0513 12:58:30.055006 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.124065 containerd[1594]: time="2025-05-13T12:58:30.123936008Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6cc66f7cbd-4plvb,Uid:4ea0ff14-d689-405f-b2d4-0c4987ad0e9a,Namespace:calico-system,Attempt:0,}" May 13 12:58:30.149458 kubelet[2704]: E0513 12:58:30.149423 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.149458 kubelet[2704]: W0513 12:58:30.149441 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.149458 kubelet[2704]: E0513 12:58:30.149457 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.149605 kubelet[2704]: I0513 12:58:30.149483 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7dd68750-995a-4166-a528-aef7a2785014-socket-dir\") pod \"csi-node-driver-q7cjm\" (UID: \"7dd68750-995a-4166-a528-aef7a2785014\") " pod="calico-system/csi-node-driver-q7cjm" May 13 12:58:30.149689 kubelet[2704]: E0513 12:58:30.149660 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.149689 kubelet[2704]: W0513 12:58:30.149676 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.149738 kubelet[2704]: E0513 12:58:30.149689 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.149738 kubelet[2704]: I0513 12:58:30.149706 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/7dd68750-995a-4166-a528-aef7a2785014-varrun\") pod \"csi-node-driver-q7cjm\" (UID: \"7dd68750-995a-4166-a528-aef7a2785014\") " pod="calico-system/csi-node-driver-q7cjm" May 13 12:58:30.149905 kubelet[2704]: E0513 12:58:30.149878 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.149905 kubelet[2704]: W0513 12:58:30.149892 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.149905 kubelet[2704]: E0513 12:58:30.149906 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.150070 kubelet[2704]: E0513 12:58:30.150053 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.150070 kubelet[2704]: W0513 12:58:30.150062 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.150122 kubelet[2704]: E0513 12:58:30.150074 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.150244 kubelet[2704]: E0513 12:58:30.150219 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.150244 kubelet[2704]: W0513 12:58:30.150230 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.150244 kubelet[2704]: E0513 12:58:30.150241 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.150353 kubelet[2704]: I0513 12:58:30.150278 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7dd68750-995a-4166-a528-aef7a2785014-kubelet-dir\") pod \"csi-node-driver-q7cjm\" (UID: \"7dd68750-995a-4166-a528-aef7a2785014\") " pod="calico-system/csi-node-driver-q7cjm" May 13 12:58:30.150473 kubelet[2704]: E0513 12:58:30.150452 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.150473 kubelet[2704]: W0513 12:58:30.150467 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.150521 kubelet[2704]: E0513 12:58:30.150482 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.150693 kubelet[2704]: E0513 12:58:30.150676 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.150693 kubelet[2704]: W0513 12:58:30.150688 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.150744 kubelet[2704]: E0513 12:58:30.150702 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.150886 kubelet[2704]: E0513 12:58:30.150861 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.150886 kubelet[2704]: W0513 12:58:30.150877 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.150933 kubelet[2704]: E0513 12:58:30.150890 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.150933 kubelet[2704]: I0513 12:58:30.150909 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7dd68750-995a-4166-a528-aef7a2785014-registration-dir\") pod \"csi-node-driver-q7cjm\" (UID: \"7dd68750-995a-4166-a528-aef7a2785014\") " pod="calico-system/csi-node-driver-q7cjm" May 13 12:58:30.151057 kubelet[2704]: E0513 12:58:30.151042 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.151057 kubelet[2704]: W0513 12:58:30.151053 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.151105 kubelet[2704]: E0513 12:58:30.151064 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.151231 kubelet[2704]: E0513 12:58:30.151216 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.151231 kubelet[2704]: W0513 12:58:30.151227 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.151299 kubelet[2704]: E0513 12:58:30.151237 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.151415 kubelet[2704]: E0513 12:58:30.151401 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.151415 kubelet[2704]: W0513 12:58:30.151410 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.151464 kubelet[2704]: E0513 12:58:30.151421 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.151464 kubelet[2704]: I0513 12:58:30.151434 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ksgg\" (UniqueName: \"kubernetes.io/projected/7dd68750-995a-4166-a528-aef7a2785014-kube-api-access-8ksgg\") pod \"csi-node-driver-q7cjm\" (UID: \"7dd68750-995a-4166-a528-aef7a2785014\") " pod="calico-system/csi-node-driver-q7cjm" May 13 12:58:30.151615 kubelet[2704]: E0513 12:58:30.151600 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.151615 kubelet[2704]: W0513 12:58:30.151610 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.151667 kubelet[2704]: E0513 12:58:30.151621 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.151760 kubelet[2704]: E0513 12:58:30.151747 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.151760 kubelet[2704]: W0513 12:58:30.151756 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.151811 kubelet[2704]: E0513 12:58:30.151766 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.151944 kubelet[2704]: E0513 12:58:30.151928 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.151944 kubelet[2704]: W0513 12:58:30.151941 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.151992 kubelet[2704]: E0513 12:58:30.151952 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.152107 kubelet[2704]: E0513 12:58:30.152092 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.152107 kubelet[2704]: W0513 12:58:30.152105 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.152153 kubelet[2704]: E0513 12:58:30.152114 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.165800 containerd[1594]: time="2025-05-13T12:58:30.165757608Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-krrmx,Uid:6f5a9517-847c-4101-8b62-b95d8d1852c2,Namespace:calico-system,Attempt:0,}" May 13 12:58:30.252718 kubelet[2704]: E0513 12:58:30.252241 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.252718 kubelet[2704]: W0513 12:58:30.252290 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.252718 kubelet[2704]: E0513 12:58:30.252313 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.252718 kubelet[2704]: E0513 12:58:30.252625 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.252718 kubelet[2704]: W0513 12:58:30.252633 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.252718 kubelet[2704]: E0513 12:58:30.252654 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.252957 kubelet[2704]: E0513 12:58:30.252910 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.252957 kubelet[2704]: W0513 12:58:30.252920 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.252957 kubelet[2704]: E0513 12:58:30.252935 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.253136 kubelet[2704]: E0513 12:58:30.253118 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.253136 kubelet[2704]: W0513 12:58:30.253132 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.253203 kubelet[2704]: E0513 12:58:30.253143 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.253432 kubelet[2704]: E0513 12:58:30.253354 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.253432 kubelet[2704]: W0513 12:58:30.253367 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.253432 kubelet[2704]: E0513 12:58:30.253381 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.253709 kubelet[2704]: E0513 12:58:30.253621 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.253709 kubelet[2704]: W0513 12:58:30.253630 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.253709 kubelet[2704]: E0513 12:58:30.253654 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.254340 kubelet[2704]: E0513 12:58:30.254214 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.254340 kubelet[2704]: W0513 12:58:30.254228 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.254340 kubelet[2704]: E0513 12:58:30.254275 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.254915 kubelet[2704]: E0513 12:58:30.254556 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.254915 kubelet[2704]: W0513 12:58:30.254565 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.254915 kubelet[2704]: E0513 12:58:30.254611 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.254915 kubelet[2704]: E0513 12:58:30.254770 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.254915 kubelet[2704]: W0513 12:58:30.254777 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.254915 kubelet[2704]: E0513 12:58:30.254855 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.255093 kubelet[2704]: E0513 12:58:30.254991 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.255093 kubelet[2704]: W0513 12:58:30.255000 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.255093 kubelet[2704]: E0513 12:58:30.255034 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.256533 kubelet[2704]: E0513 12:58:30.255287 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.256533 kubelet[2704]: W0513 12:58:30.255300 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.256533 kubelet[2704]: E0513 12:58:30.255310 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.256533 kubelet[2704]: E0513 12:58:30.256106 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.256533 kubelet[2704]: W0513 12:58:30.256114 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.256533 kubelet[2704]: E0513 12:58:30.256172 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.256533 kubelet[2704]: E0513 12:58:30.256359 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.256533 kubelet[2704]: W0513 12:58:30.256365 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.256533 kubelet[2704]: E0513 12:58:30.256404 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.256741 kubelet[2704]: E0513 12:58:30.256638 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.256741 kubelet[2704]: W0513 12:58:30.256645 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.256741 kubelet[2704]: E0513 12:58:30.256698 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.256860 kubelet[2704]: E0513 12:58:30.256822 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.256860 kubelet[2704]: W0513 12:58:30.256857 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.256934 kubelet[2704]: E0513 12:58:30.256919 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.257432 kubelet[2704]: E0513 12:58:30.257409 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.257432 kubelet[2704]: W0513 12:58:30.257421 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.257513 kubelet[2704]: E0513 12:58:30.257436 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.257726 kubelet[2704]: E0513 12:58:30.257628 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.257726 kubelet[2704]: W0513 12:58:30.257639 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.258307 kubelet[2704]: E0513 12:58:30.258279 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.258489 kubelet[2704]: E0513 12:58:30.258478 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.258489 kubelet[2704]: W0513 12:58:30.258488 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.258573 kubelet[2704]: E0513 12:58:30.258532 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.258661 kubelet[2704]: E0513 12:58:30.258651 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.258661 kubelet[2704]: W0513 12:58:30.258659 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.258876 kubelet[2704]: E0513 12:58:30.258816 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.258915 kubelet[2704]: E0513 12:58:30.258894 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.258915 kubelet[2704]: W0513 12:58:30.258900 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.258990 kubelet[2704]: E0513 12:58:30.258984 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.259174 kubelet[2704]: E0513 12:58:30.259158 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.259174 kubelet[2704]: W0513 12:58:30.259169 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.261526 kubelet[2704]: E0513 12:58:30.259180 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.261526 kubelet[2704]: E0513 12:58:30.259683 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.261526 kubelet[2704]: W0513 12:58:30.259692 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.261526 kubelet[2704]: E0513 12:58:30.259714 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.261526 kubelet[2704]: E0513 12:58:30.259903 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.261526 kubelet[2704]: W0513 12:58:30.259910 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.261526 kubelet[2704]: E0513 12:58:30.259921 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.261526 kubelet[2704]: E0513 12:58:30.260268 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.261526 kubelet[2704]: W0513 12:58:30.260276 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.261526 kubelet[2704]: E0513 12:58:30.260294 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.261795 kubelet[2704]: E0513 12:58:30.260491 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.261795 kubelet[2704]: W0513 12:58:30.260497 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.261795 kubelet[2704]: E0513 12:58:30.260505 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:30.279160 kubelet[2704]: E0513 12:58:30.279102 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:30.279160 kubelet[2704]: W0513 12:58:30.279132 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:30.279160 kubelet[2704]: E0513 12:58:30.279156 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:30.279827 containerd[1594]: time="2025-05-13T12:58:30.279783969Z" level=info msg="connecting to shim b99f7b75180a4b8ad3f7ea735c25a7356f2e38d9947f6ca5f07f4d83a2af8826" address="unix:///run/containerd/s/9022632795a6096741db20c0c30de8d5146c3d569dd51a16305b646bdb3d3b12" namespace=k8s.io protocol=ttrpc version=3 May 13 12:58:30.280301 containerd[1594]: time="2025-05-13T12:58:30.280077657Z" level=info msg="connecting to shim b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe" address="unix:///run/containerd/s/786be93baebb7c9c5e268c4e62eb87fc755386413199055817f8baa37eda0865" namespace=k8s.io protocol=ttrpc version=3 May 13 12:58:30.319496 systemd[1]: Started cri-containerd-b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe.scope - libcontainer container b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe. May 13 12:58:30.323225 systemd[1]: Started cri-containerd-b99f7b75180a4b8ad3f7ea735c25a7356f2e38d9947f6ca5f07f4d83a2af8826.scope - libcontainer container b99f7b75180a4b8ad3f7ea735c25a7356f2e38d9947f6ca5f07f4d83a2af8826. 
May 13 12:58:30.601916 containerd[1594]: time="2025-05-13T12:58:30.601873853Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6cc66f7cbd-4plvb,Uid:4ea0ff14-d689-405f-b2d4-0c4987ad0e9a,Namespace:calico-system,Attempt:0,} returns sandbox id \"b99f7b75180a4b8ad3f7ea735c25a7356f2e38d9947f6ca5f07f4d83a2af8826\"" May 13 12:58:30.602901 containerd[1594]: time="2025-05-13T12:58:30.602858713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 13 12:58:30.799905 containerd[1594]: time="2025-05-13T12:58:30.799708610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-krrmx,Uid:6f5a9517-847c-4101-8b62-b95d8d1852c2,Namespace:calico-system,Attempt:0,} returns sandbox id \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\"" May 13 12:58:31.544366 kubelet[2704]: E0513 12:58:31.544324 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:58:33.544500 kubelet[2704]: E0513 12:58:33.544444 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:58:34.370936 containerd[1594]: time="2025-05-13T12:58:34.370884319Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:34.371782 containerd[1594]: time="2025-05-13T12:58:34.371756581Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 13 
12:58:34.373169 containerd[1594]: time="2025-05-13T12:58:34.373116596Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:34.375441 containerd[1594]: time="2025-05-13T12:58:34.375416292Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:34.376098 containerd[1594]: time="2025-05-13T12:58:34.376051284Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 3.77314438s" May 13 12:58:34.376128 containerd[1594]: time="2025-05-13T12:58:34.376100187Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 13 12:58:34.377128 containerd[1594]: time="2025-05-13T12:58:34.377107514Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 13 12:58:34.386637 containerd[1594]: time="2025-05-13T12:58:34.386585944Z" level=info msg="CreateContainer within sandbox \"b99f7b75180a4b8ad3f7ea735c25a7356f2e38d9947f6ca5f07f4d83a2af8826\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 13 12:58:34.395080 containerd[1594]: time="2025-05-13T12:58:34.395025246Z" level=info msg="Container daed6297227b87c1d351906f65d4dfb8c1e3db3ba3004e1f4787d18e5899f337: CDI devices from CRI Config.CDIDevices: []" May 13 12:58:34.403355 containerd[1594]: time="2025-05-13T12:58:34.403311418Z" level=info msg="CreateContainer within sandbox 
\"b99f7b75180a4b8ad3f7ea735c25a7356f2e38d9947f6ca5f07f4d83a2af8826\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"daed6297227b87c1d351906f65d4dfb8c1e3db3ba3004e1f4787d18e5899f337\"" May 13 12:58:34.403656 containerd[1594]: time="2025-05-13T12:58:34.403626264Z" level=info msg="StartContainer for \"daed6297227b87c1d351906f65d4dfb8c1e3db3ba3004e1f4787d18e5899f337\"" May 13 12:58:34.404598 containerd[1594]: time="2025-05-13T12:58:34.404571975Z" level=info msg="connecting to shim daed6297227b87c1d351906f65d4dfb8c1e3db3ba3004e1f4787d18e5899f337" address="unix:///run/containerd/s/9022632795a6096741db20c0c30de8d5146c3d569dd51a16305b646bdb3d3b12" protocol=ttrpc version=3 May 13 12:58:34.429656 systemd[1]: Started cri-containerd-daed6297227b87c1d351906f65d4dfb8c1e3db3ba3004e1f4787d18e5899f337.scope - libcontainer container daed6297227b87c1d351906f65d4dfb8c1e3db3ba3004e1f4787d18e5899f337. May 13 12:58:34.491269 containerd[1594]: time="2025-05-13T12:58:34.491204955Z" level=info msg="StartContainer for \"daed6297227b87c1d351906f65d4dfb8c1e3db3ba3004e1f4787d18e5899f337\" returns successfully" May 13 12:58:34.636813 kubelet[2704]: I0513 12:58:34.636668 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6cc66f7cbd-4plvb" podStartSLOduration=1.8623733150000001 podStartE2EDuration="5.63665525s" podCreationTimestamp="2025-05-13 12:58:29 +0000 UTC" firstStartedPulling="2025-05-13 12:58:30.602644156 +0000 UTC m=+17.133272143" lastFinishedPulling="2025-05-13 12:58:34.376926081 +0000 UTC m=+20.907554078" observedRunningTime="2025-05-13 12:58:34.636513561 +0000 UTC m=+21.167141578" watchObservedRunningTime="2025-05-13 12:58:34.63665525 +0000 UTC m=+21.167283257" May 13 12:58:34.689824 kubelet[2704]: E0513 12:58:34.689790 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.689824 kubelet[2704]: W0513 12:58:34.689815 2704 
driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.689984 kubelet[2704]: E0513 12:58:34.689836 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.690085 kubelet[2704]: E0513 12:58:34.690068 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.690085 kubelet[2704]: W0513 12:58:34.690077 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.690085 kubelet[2704]: E0513 12:58:34.690085 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.690298 kubelet[2704]: E0513 12:58:34.690245 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.690298 kubelet[2704]: W0513 12:58:34.690279 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.690298 kubelet[2704]: E0513 12:58:34.690295 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.690462 kubelet[2704]: E0513 12:58:34.690444 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.690462 kubelet[2704]: W0513 12:58:34.690454 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.690462 kubelet[2704]: E0513 12:58:34.690461 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.690634 kubelet[2704]: E0513 12:58:34.690615 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.690634 kubelet[2704]: W0513 12:58:34.690626 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.690634 kubelet[2704]: E0513 12:58:34.690635 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.690800 kubelet[2704]: E0513 12:58:34.690783 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.690800 kubelet[2704]: W0513 12:58:34.690792 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.690800 kubelet[2704]: E0513 12:58:34.690800 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.690952 kubelet[2704]: E0513 12:58:34.690936 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.690952 kubelet[2704]: W0513 12:58:34.690945 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.690952 kubelet[2704]: E0513 12:58:34.690953 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.691107 kubelet[2704]: E0513 12:58:34.691091 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.691107 kubelet[2704]: W0513 12:58:34.691100 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.691107 kubelet[2704]: E0513 12:58:34.691107 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.691301 kubelet[2704]: E0513 12:58:34.691274 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.691301 kubelet[2704]: W0513 12:58:34.691292 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.691359 kubelet[2704]: E0513 12:58:34.691315 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.691476 kubelet[2704]: E0513 12:58:34.691459 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.691476 kubelet[2704]: W0513 12:58:34.691468 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.691476 kubelet[2704]: E0513 12:58:34.691476 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.691631 kubelet[2704]: E0513 12:58:34.691614 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.691631 kubelet[2704]: W0513 12:58:34.691624 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.691631 kubelet[2704]: E0513 12:58:34.691632 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.691792 kubelet[2704]: E0513 12:58:34.691775 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.691792 kubelet[2704]: W0513 12:58:34.691785 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.691792 kubelet[2704]: E0513 12:58:34.691792 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.691948 kubelet[2704]: E0513 12:58:34.691932 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.691948 kubelet[2704]: W0513 12:58:34.691941 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.691948 kubelet[2704]: E0513 12:58:34.691948 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.692101 kubelet[2704]: E0513 12:58:34.692084 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.692101 kubelet[2704]: W0513 12:58:34.692094 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.692101 kubelet[2704]: E0513 12:58:34.692101 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.692271 kubelet[2704]: E0513 12:58:34.692236 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.692271 kubelet[2704]: W0513 12:58:34.692245 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.692271 kubelet[2704]: E0513 12:58:34.692272 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.790628 kubelet[2704]: E0513 12:58:34.790588 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.790628 kubelet[2704]: W0513 12:58:34.790615 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.790628 kubelet[2704]: E0513 12:58:34.790639 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.790914 kubelet[2704]: E0513 12:58:34.790894 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.790914 kubelet[2704]: W0513 12:58:34.790907 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.790985 kubelet[2704]: E0513 12:58:34.790923 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.791310 kubelet[2704]: E0513 12:58:34.791267 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.791310 kubelet[2704]: W0513 12:58:34.791304 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.791401 kubelet[2704]: E0513 12:58:34.791337 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.791557 kubelet[2704]: E0513 12:58:34.791541 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.791557 kubelet[2704]: W0513 12:58:34.791551 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.791626 kubelet[2704]: E0513 12:58:34.791566 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.791825 kubelet[2704]: E0513 12:58:34.791782 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.791825 kubelet[2704]: W0513 12:58:34.791798 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.791825 kubelet[2704]: E0513 12:58:34.791816 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.792054 kubelet[2704]: E0513 12:58:34.792034 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.792054 kubelet[2704]: W0513 12:58:34.792047 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.792120 kubelet[2704]: E0513 12:58:34.792063 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.792267 kubelet[2704]: E0513 12:58:34.792234 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.792267 kubelet[2704]: W0513 12:58:34.792247 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.792351 kubelet[2704]: E0513 12:58:34.792291 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.792756 kubelet[2704]: E0513 12:58:34.792702 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.792756 kubelet[2704]: W0513 12:58:34.792736 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.792937 kubelet[2704]: E0513 12:58:34.792774 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.793010 kubelet[2704]: E0513 12:58:34.792980 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.793010 kubelet[2704]: W0513 12:58:34.792995 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.793090 kubelet[2704]: E0513 12:58:34.793014 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.793239 kubelet[2704]: E0513 12:58:34.793217 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.793239 kubelet[2704]: W0513 12:58:34.793234 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.793355 kubelet[2704]: E0513 12:58:34.793272 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.793529 kubelet[2704]: E0513 12:58:34.793511 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.793529 kubelet[2704]: W0513 12:58:34.793525 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.793675 kubelet[2704]: E0513 12:58:34.793543 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.793740 kubelet[2704]: E0513 12:58:34.793718 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.793740 kubelet[2704]: W0513 12:58:34.793734 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.793804 kubelet[2704]: E0513 12:58:34.793750 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.793951 kubelet[2704]: E0513 12:58:34.793921 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.793951 kubelet[2704]: W0513 12:58:34.793937 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.794026 kubelet[2704]: E0513 12:58:34.793971 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.794146 kubelet[2704]: E0513 12:58:34.794127 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.794146 kubelet[2704]: W0513 12:58:34.794140 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.794214 kubelet[2704]: E0513 12:58:34.794151 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.794386 kubelet[2704]: E0513 12:58:34.794358 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.794386 kubelet[2704]: W0513 12:58:34.794374 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.794386 kubelet[2704]: E0513 12:58:34.794385 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.794593 kubelet[2704]: E0513 12:58:34.794574 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.794593 kubelet[2704]: W0513 12:58:34.794587 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.794657 kubelet[2704]: E0513 12:58:34.794598 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:34.794822 kubelet[2704]: E0513 12:58:34.794802 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.794822 kubelet[2704]: W0513 12:58:34.794815 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.794896 kubelet[2704]: E0513 12:58:34.794827 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:34.795248 kubelet[2704]: E0513 12:58:34.795220 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:34.795248 kubelet[2704]: W0513 12:58:34.795234 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:34.795360 kubelet[2704]: E0513 12:58:34.795245 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.544650 kubelet[2704]: E0513 12:58:35.544580 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:58:35.627497 kubelet[2704]: I0513 12:58:35.627462 2704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 12:58:35.696137 kubelet[2704]: E0513 12:58:35.696099 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.696137 kubelet[2704]: W0513 12:58:35.696124 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.696137 kubelet[2704]: E0513 12:58:35.696149 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.696716 kubelet[2704]: E0513 12:58:35.696389 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.696716 kubelet[2704]: W0513 12:58:35.696398 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.696716 kubelet[2704]: E0513 12:58:35.696408 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.696716 kubelet[2704]: E0513 12:58:35.696635 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.696716 kubelet[2704]: W0513 12:58:35.696646 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.696716 kubelet[2704]: E0513 12:58:35.696657 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.696901 kubelet[2704]: E0513 12:58:35.696839 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.696901 kubelet[2704]: W0513 12:58:35.696849 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.696901 kubelet[2704]: E0513 12:58:35.696860 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.697062 kubelet[2704]: E0513 12:58:35.697045 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.697062 kubelet[2704]: W0513 12:58:35.697057 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.697127 kubelet[2704]: E0513 12:58:35.697067 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.697295 kubelet[2704]: E0513 12:58:35.697241 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.697295 kubelet[2704]: W0513 12:58:35.697268 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.697295 kubelet[2704]: E0513 12:58:35.697287 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.697470 kubelet[2704]: E0513 12:58:35.697452 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.697470 kubelet[2704]: W0513 12:58:35.697463 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.697538 kubelet[2704]: E0513 12:58:35.697472 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.697660 kubelet[2704]: E0513 12:58:35.697643 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.697660 kubelet[2704]: W0513 12:58:35.697654 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.697727 kubelet[2704]: E0513 12:58:35.697663 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.697858 kubelet[2704]: E0513 12:58:35.697841 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.697858 kubelet[2704]: W0513 12:58:35.697853 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.697919 kubelet[2704]: E0513 12:58:35.697862 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.698046 kubelet[2704]: E0513 12:58:35.698029 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.698046 kubelet[2704]: W0513 12:58:35.698041 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.698105 kubelet[2704]: E0513 12:58:35.698051 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.698243 kubelet[2704]: E0513 12:58:35.698226 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.698243 kubelet[2704]: W0513 12:58:35.698237 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.698333 kubelet[2704]: E0513 12:58:35.698247 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.698480 kubelet[2704]: E0513 12:58:35.698462 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.698480 kubelet[2704]: W0513 12:58:35.698475 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.698546 kubelet[2704]: E0513 12:58:35.698484 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.698681 kubelet[2704]: E0513 12:58:35.698663 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.698681 kubelet[2704]: W0513 12:58:35.698674 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.698738 kubelet[2704]: E0513 12:58:35.698685 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.698883 kubelet[2704]: E0513 12:58:35.698866 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.698883 kubelet[2704]: W0513 12:58:35.698877 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.698946 kubelet[2704]: E0513 12:58:35.698888 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.699083 kubelet[2704]: E0513 12:58:35.699066 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.699083 kubelet[2704]: W0513 12:58:35.699078 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.699142 kubelet[2704]: E0513 12:58:35.699088 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.796695 kubelet[2704]: E0513 12:58:35.796591 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.796695 kubelet[2704]: W0513 12:58:35.796612 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.796695 kubelet[2704]: E0513 12:58:35.796632 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.796932 kubelet[2704]: E0513 12:58:35.796889 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.796932 kubelet[2704]: W0513 12:58:35.796916 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.797076 kubelet[2704]: E0513 12:58:35.796951 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.797909 kubelet[2704]: E0513 12:58:35.797876 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.797909 kubelet[2704]: W0513 12:58:35.797891 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.798090 kubelet[2704]: E0513 12:58:35.797933 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.798171 kubelet[2704]: E0513 12:58:35.798144 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.798171 kubelet[2704]: W0513 12:58:35.798164 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.798291 kubelet[2704]: E0513 12:58:35.798228 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.798451 kubelet[2704]: E0513 12:58:35.798426 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.798451 kubelet[2704]: W0513 12:58:35.798441 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.798505 kubelet[2704]: E0513 12:58:35.798476 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.798662 kubelet[2704]: E0513 12:58:35.798646 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.798662 kubelet[2704]: W0513 12:58:35.798660 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.798716 kubelet[2704]: E0513 12:58:35.798692 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.798860 kubelet[2704]: E0513 12:58:35.798837 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.798860 kubelet[2704]: W0513 12:58:35.798850 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.798914 kubelet[2704]: E0513 12:58:35.798863 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.799054 kubelet[2704]: E0513 12:58:35.799038 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.799054 kubelet[2704]: W0513 12:58:35.799050 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.799103 kubelet[2704]: E0513 12:58:35.799064 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.799366 kubelet[2704]: E0513 12:58:35.799350 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.799366 kubelet[2704]: W0513 12:58:35.799363 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.799430 kubelet[2704]: E0513 12:58:35.799379 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.799594 kubelet[2704]: E0513 12:58:35.799568 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.799594 kubelet[2704]: W0513 12:58:35.799584 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.799644 kubelet[2704]: E0513 12:58:35.799604 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.799769 kubelet[2704]: E0513 12:58:35.799756 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.799769 kubelet[2704]: W0513 12:58:35.799765 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.799816 kubelet[2704]: E0513 12:58:35.799777 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.799989 kubelet[2704]: E0513 12:58:35.799970 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.800019 kubelet[2704]: W0513 12:58:35.799988 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.800041 kubelet[2704]: E0513 12:58:35.800025 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.800176 kubelet[2704]: E0513 12:58:35.800159 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.800176 kubelet[2704]: W0513 12:58:35.800172 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.800230 kubelet[2704]: E0513 12:58:35.800203 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.800376 kubelet[2704]: E0513 12:58:35.800360 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.800376 kubelet[2704]: W0513 12:58:35.800370 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.800477 kubelet[2704]: E0513 12:58:35.800385 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.800572 kubelet[2704]: E0513 12:58:35.800552 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.800572 kubelet[2704]: W0513 12:58:35.800563 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.800640 kubelet[2704]: E0513 12:58:35.800579 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.800818 kubelet[2704]: E0513 12:58:35.800792 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.800818 kubelet[2704]: W0513 12:58:35.800808 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.800882 kubelet[2704]: E0513 12:58:35.800823 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:35.801034 kubelet[2704]: E0513 12:58:35.801016 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.801034 kubelet[2704]: W0513 12:58:35.801027 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.801102 kubelet[2704]: E0513 12:58:35.801035 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 12:58:35.801323 kubelet[2704]: E0513 12:58:35.801309 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 12:58:35.801323 kubelet[2704]: W0513 12:58:35.801319 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 12:58:35.801388 kubelet[2704]: E0513 12:58:35.801326 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 12:58:36.225502 containerd[1594]: time="2025-05-13T12:58:36.225350769Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:36.257351 containerd[1594]: time="2025-05-13T12:58:36.257247966Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 13 12:58:36.296331 containerd[1594]: time="2025-05-13T12:58:36.296279405Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:36.331341 containerd[1594]: time="2025-05-13T12:58:36.331291683Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:36.331753 containerd[1594]: time="2025-05-13T12:58:36.331720324Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 1.954585418s" May 13 12:58:36.331784 containerd[1594]: time="2025-05-13T12:58:36.331759128Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 13 12:58:36.333646 containerd[1594]: time="2025-05-13T12:58:36.333575584Z" level=info msg="CreateContainer within sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 13 12:58:36.532988 containerd[1594]: time="2025-05-13T12:58:36.532933770Z" level=info msg="Container 6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5: CDI devices from CRI Config.CDIDevices: []" May 13 12:58:36.692070 containerd[1594]: time="2025-05-13T12:58:36.691821785Z" level=info msg="CreateContainer within sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5\"" May 13 12:58:36.692510 containerd[1594]: time="2025-05-13T12:58:36.692480612Z" level=info msg="StartContainer for \"6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5\"" May 13 12:58:36.694355 containerd[1594]: time="2025-05-13T12:58:36.694320852Z" level=info msg="connecting to shim 6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5" address="unix:///run/containerd/s/786be93baebb7c9c5e268c4e62eb87fc755386413199055817f8baa37eda0865" protocol=ttrpc version=3 May 13 12:58:36.721451 systemd[1]: Started cri-containerd-6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5.scope - libcontainer container 
6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5. May 13 12:58:36.810884 systemd[1]: cri-containerd-6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5.scope: Deactivated successfully. May 13 12:58:36.811299 systemd[1]: cri-containerd-6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5.scope: Consumed 41ms CPU time, 8.1M memory peak, 6.3M written to disk. May 13 12:58:36.812595 containerd[1594]: time="2025-05-13T12:58:36.812563715Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5\" id:\"6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5\" pid:3431 exited_at:{seconds:1747141116 nanos:812033662}" May 13 12:58:37.004967 containerd[1594]: time="2025-05-13T12:58:37.004893697Z" level=info msg="received exit event container_id:\"6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5\" id:\"6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5\" pid:3431 exited_at:{seconds:1747141116 nanos:812033662}" May 13 12:58:37.006524 containerd[1594]: time="2025-05-13T12:58:37.006501797Z" level=info msg="StartContainer for \"6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5\" returns successfully" May 13 12:58:37.029246 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5-rootfs.mount: Deactivated successfully. 
May 13 12:58:37.544615 kubelet[2704]: E0513 12:58:37.544551 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:58:38.636839 containerd[1594]: time="2025-05-13T12:58:38.636793225Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 13 12:58:39.544576 kubelet[2704]: E0513 12:58:39.544535 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:58:41.544732 kubelet[2704]: E0513 12:58:41.544680 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:58:42.457192 kubelet[2704]: I0513 12:58:42.457128 2704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 12:58:43.544338 kubelet[2704]: E0513 12:58:43.544300 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:58:44.865576 systemd[1]: Started sshd@7-10.0.0.121:22-10.0.0.1:55214.service - OpenSSH per-connection server daemon (10.0.0.1:55214). 
May 13 12:58:44.916733 sshd[3479]: Accepted publickey for core from 10.0.0.1 port 55214 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:58:44.919505 sshd-session[3479]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:58:44.926927 systemd-logind[1578]: New session 8 of user core. May 13 12:58:44.939392 systemd[1]: Started session-8.scope - Session 8 of User core. May 13 12:58:45.545184 kubelet[2704]: E0513 12:58:45.545146 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:58:45.725179 sshd[3481]: Connection closed by 10.0.0.1 port 55214 May 13 12:58:45.725528 sshd-session[3479]: pam_unix(sshd:session): session closed for user core May 13 12:58:45.729537 systemd[1]: sshd@7-10.0.0.121:22-10.0.0.1:55214.service: Deactivated successfully. May 13 12:58:45.731420 systemd[1]: session-8.scope: Deactivated successfully. May 13 12:58:45.732291 systemd-logind[1578]: Session 8 logged out. Waiting for processes to exit. May 13 12:58:45.733658 systemd-logind[1578]: Removed session 8. 
May 13 12:58:45.759768 containerd[1594]: time="2025-05-13T12:58:45.759709339Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:45.763441 containerd[1594]: time="2025-05-13T12:58:45.763277945Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 13 12:58:45.764653 containerd[1594]: time="2025-05-13T12:58:45.764623990Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:45.766999 containerd[1594]: time="2025-05-13T12:58:45.766947930Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:45.767467 containerd[1594]: time="2025-05-13T12:58:45.767443385Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 7.130608782s" May 13 12:58:45.767467 containerd[1594]: time="2025-05-13T12:58:45.767471067Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 13 12:58:45.769016 containerd[1594]: time="2025-05-13T12:58:45.768988586Z" level=info msg="CreateContainer within sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 13 12:58:45.779117 containerd[1594]: time="2025-05-13T12:58:45.779048325Z" level=info msg="Container 
197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f: CDI devices from CRI Config.CDIDevices: []" May 13 12:58:45.788713 containerd[1594]: time="2025-05-13T12:58:45.788680898Z" level=info msg="CreateContainer within sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f\"" May 13 12:58:45.789244 containerd[1594]: time="2025-05-13T12:58:45.789217670Z" level=info msg="StartContainer for \"197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f\"" May 13 12:58:45.790750 containerd[1594]: time="2025-05-13T12:58:45.790727875Z" level=info msg="connecting to shim 197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f" address="unix:///run/containerd/s/786be93baebb7c9c5e268c4e62eb87fc755386413199055817f8baa37eda0865" protocol=ttrpc version=3 May 13 12:58:45.818394 systemd[1]: Started cri-containerd-197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f.scope - libcontainer container 197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f. May 13 12:58:45.856979 containerd[1594]: time="2025-05-13T12:58:45.856938224Z" level=info msg="StartContainer for \"197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f\" returns successfully" May 13 12:58:46.813605 containerd[1594]: time="2025-05-13T12:58:46.813539122Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 12:58:46.816927 systemd[1]: cri-containerd-197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f.scope: Deactivated successfully. 
May 13 12:58:46.817359 systemd[1]: cri-containerd-197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f.scope: Consumed 531ms CPU time, 160.5M memory peak, 8K read from disk, 154M written to disk. May 13 12:58:46.818776 containerd[1594]: time="2025-05-13T12:58:46.818732326Z" level=info msg="received exit event container_id:\"197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f\" id:\"197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f\" pid:3512 exited_at:{seconds:1747141126 nanos:818530225}" May 13 12:58:46.818901 containerd[1594]: time="2025-05-13T12:58:46.818829589Z" level=info msg="TaskExit event in podsandbox handler container_id:\"197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f\" id:\"197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f\" pid:3512 exited_at:{seconds:1747141126 nanos:818530225}" May 13 12:58:46.842276 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f-rootfs.mount: Deactivated successfully. May 13 12:58:46.888240 kubelet[2704]: I0513 12:58:46.887919 2704 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 13 12:58:47.400368 systemd[1]: Created slice kubepods-besteffort-podc5c7ee4c_244c_48f3_844f_9864efd99cd6.slice - libcontainer container kubepods-besteffort-podc5c7ee4c_244c_48f3_844f_9864efd99cd6.slice. May 13 12:58:47.422611 systemd[1]: Created slice kubepods-burstable-pode7d6f99e_b52c_4124_8898_cdb6a2a240c9.slice - libcontainer container kubepods-burstable-pode7d6f99e_b52c_4124_8898_cdb6a2a240c9.slice. May 13 12:58:47.429958 systemd[1]: Created slice kubepods-besteffort-pod509974ac_8c85_45a5_a90c_6e89c9a54779.slice - libcontainer container kubepods-besteffort-pod509974ac_8c85_45a5_a90c_6e89c9a54779.slice. 
May 13 12:58:47.436066 systemd[1]: Created slice kubepods-besteffort-pod13643b3d_f618_4901_ac95_9d9e3c0fb895.slice - libcontainer container kubepods-besteffort-pod13643b3d_f618_4901_ac95_9d9e3c0fb895.slice. May 13 12:58:47.440279 systemd[1]: Created slice kubepods-besteffort-poda9d392a7_21ed_46f8_a204_24615b7b6d08.slice - libcontainer container kubepods-besteffort-poda9d392a7_21ed_46f8_a204_24615b7b6d08.slice. May 13 12:58:47.444906 systemd[1]: Created slice kubepods-burstable-pod9ba775bd_6563_42d6_824b_2dbfad12b002.slice - libcontainer container kubepods-burstable-pod9ba775bd_6563_42d6_824b_2dbfad12b002.slice. May 13 12:58:47.550680 systemd[1]: Created slice kubepods-besteffort-pod7dd68750_995a_4166_a528_aef7a2785014.slice - libcontainer container kubepods-besteffort-pod7dd68750_995a_4166_a528_aef7a2785014.slice. May 13 12:58:47.553028 containerd[1594]: time="2025-05-13T12:58:47.552986089Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-q7cjm,Uid:7dd68750-995a-4166-a528-aef7a2785014,Namespace:calico-system,Attempt:0,}" May 13 12:58:47.579315 kubelet[2704]: I0513 12:58:47.579271 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2289\" (UniqueName: \"kubernetes.io/projected/c5c7ee4c-244c-48f3-844f-9864efd99cd6-kube-api-access-l2289\") pod \"calico-kube-controllers-647bb7c459-594xm\" (UID: \"c5c7ee4c-244c-48f3-844f-9864efd99cd6\") " pod="calico-system/calico-kube-controllers-647bb7c459-594xm" May 13 12:58:47.579402 kubelet[2704]: I0513 12:58:47.579322 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b59sr\" (UniqueName: \"kubernetes.io/projected/13643b3d-f618-4901-ac95-9d9e3c0fb895-kube-api-access-b59sr\") pod \"calico-apiserver-7854cf7bc7-llgzn\" (UID: \"13643b3d-f618-4901-ac95-9d9e3c0fb895\") " pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" May 13 12:58:47.579402 kubelet[2704]: I0513 
12:58:47.579341 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c6jv\" (UniqueName: \"kubernetes.io/projected/a9d392a7-21ed-46f8-a204-24615b7b6d08-kube-api-access-4c6jv\") pod \"calico-apiserver-5d4cb57867-c2wmg\" (UID: \"a9d392a7-21ed-46f8-a204-24615b7b6d08\") " pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" May 13 12:58:47.579402 kubelet[2704]: I0513 12:58:47.579355 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/509974ac-8c85-45a5-a90c-6e89c9a54779-calico-apiserver-certs\") pod \"calico-apiserver-5d4cb57867-2r646\" (UID: \"509974ac-8c85-45a5-a90c-6e89c9a54779\") " pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" May 13 12:58:47.579402 kubelet[2704]: I0513 12:58:47.579369 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmkxd\" (UniqueName: \"kubernetes.io/projected/9ba775bd-6563-42d6-824b-2dbfad12b002-kube-api-access-xmkxd\") pod \"coredns-6f6b679f8f-cn4wk\" (UID: \"9ba775bd-6563-42d6-824b-2dbfad12b002\") " pod="kube-system/coredns-6f6b679f8f-cn4wk" May 13 12:58:47.579402 kubelet[2704]: I0513 12:58:47.579386 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7d6f99e-b52c-4124-8898-cdb6a2a240c9-config-volume\") pod \"coredns-6f6b679f8f-xs6bj\" (UID: \"e7d6f99e-b52c-4124-8898-cdb6a2a240c9\") " pod="kube-system/coredns-6f6b679f8f-xs6bj" May 13 12:58:47.579630 kubelet[2704]: I0513 12:58:47.579403 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a9d392a7-21ed-46f8-a204-24615b7b6d08-calico-apiserver-certs\") pod \"calico-apiserver-5d4cb57867-c2wmg\" (UID: 
\"a9d392a7-21ed-46f8-a204-24615b7b6d08\") " pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" May 13 12:58:47.579630 kubelet[2704]: I0513 12:58:47.579417 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5c7ee4c-244c-48f3-844f-9864efd99cd6-tigera-ca-bundle\") pod \"calico-kube-controllers-647bb7c459-594xm\" (UID: \"c5c7ee4c-244c-48f3-844f-9864efd99cd6\") " pod="calico-system/calico-kube-controllers-647bb7c459-594xm" May 13 12:58:47.579630 kubelet[2704]: I0513 12:58:47.579430 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gl5b\" (UniqueName: \"kubernetes.io/projected/509974ac-8c85-45a5-a90c-6e89c9a54779-kube-api-access-2gl5b\") pod \"calico-apiserver-5d4cb57867-2r646\" (UID: \"509974ac-8c85-45a5-a90c-6e89c9a54779\") " pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" May 13 12:58:47.579630 kubelet[2704]: I0513 12:58:47.579445 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ba775bd-6563-42d6-824b-2dbfad12b002-config-volume\") pod \"coredns-6f6b679f8f-cn4wk\" (UID: \"9ba775bd-6563-42d6-824b-2dbfad12b002\") " pod="kube-system/coredns-6f6b679f8f-cn4wk" May 13 12:58:47.579630 kubelet[2704]: I0513 12:58:47.579461 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/13643b3d-f618-4901-ac95-9d9e3c0fb895-calico-apiserver-certs\") pod \"calico-apiserver-7854cf7bc7-llgzn\" (UID: \"13643b3d-f618-4901-ac95-9d9e3c0fb895\") " pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" May 13 12:58:47.579751 kubelet[2704]: I0513 12:58:47.579522 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfzqf\" 
(UniqueName: \"kubernetes.io/projected/e7d6f99e-b52c-4124-8898-cdb6a2a240c9-kube-api-access-kfzqf\") pod \"coredns-6f6b679f8f-xs6bj\" (UID: \"e7d6f99e-b52c-4124-8898-cdb6a2a240c9\") " pod="kube-system/coredns-6f6b679f8f-xs6bj" May 13 12:58:47.937739 containerd[1594]: time="2025-05-13T12:58:47.937673930Z" level=error msg="Failed to destroy network for sandbox \"5f66aa0e711b2616b4fd11c450a544de6e952ce6d98781fbb6d445cdd617fe06\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:47.939940 systemd[1]: run-netns-cni\x2d9ce77120\x2d343c\x2d715d\x2d77ef\x2d0e52517707db.mount: Deactivated successfully. May 13 12:58:47.989420 containerd[1594]: time="2025-05-13T12:58:47.989343178Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-q7cjm,Uid:7dd68750-995a-4166-a528-aef7a2785014,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f66aa0e711b2616b4fd11c450a544de6e952ce6d98781fbb6d445cdd617fe06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:47.990174 kubelet[2704]: E0513 12:58:47.990114 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f66aa0e711b2616b4fd11c450a544de6e952ce6d98781fbb6d445cdd617fe06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:47.990541 kubelet[2704]: E0513 12:58:47.990187 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"5f66aa0e711b2616b4fd11c450a544de6e952ce6d98781fbb6d445cdd617fe06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-q7cjm" May 13 12:58:47.990541 kubelet[2704]: E0513 12:58:47.990205 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f66aa0e711b2616b4fd11c450a544de6e952ce6d98781fbb6d445cdd617fe06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-q7cjm" May 13 12:58:47.990541 kubelet[2704]: E0513 12:58:47.990251 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-q7cjm_calico-system(7dd68750-995a-4166-a528-aef7a2785014)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-q7cjm_calico-system(7dd68750-995a-4166-a528-aef7a2785014)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5f66aa0e711b2616b4fd11c450a544de6e952ce6d98781fbb6d445cdd617fe06\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:58:48.003483 containerd[1594]: time="2025-05-13T12:58:48.003433436Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-647bb7c459-594xm,Uid:c5c7ee4c-244c-48f3-844f-9864efd99cd6,Namespace:calico-system,Attempt:0,}" May 13 12:58:48.028513 containerd[1594]: time="2025-05-13T12:58:48.028472064Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-6f6b679f8f-xs6bj,Uid:e7d6f99e-b52c-4124-8898-cdb6a2a240c9,Namespace:kube-system,Attempt:0,}" May 13 12:58:48.033713 containerd[1594]: time="2025-05-13T12:58:48.033669030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-2r646,Uid:509974ac-8c85-45a5-a90c-6e89c9a54779,Namespace:calico-apiserver,Attempt:0,}" May 13 12:58:48.039146 containerd[1594]: time="2025-05-13T12:58:48.039113502Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7854cf7bc7-llgzn,Uid:13643b3d-f618-4901-ac95-9d9e3c0fb895,Namespace:calico-apiserver,Attempt:0,}" May 13 12:58:48.043715 containerd[1594]: time="2025-05-13T12:58:48.043692855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-c2wmg,Uid:a9d392a7-21ed-46f8-a204-24615b7b6d08,Namespace:calico-apiserver,Attempt:0,}" May 13 12:58:48.048234 containerd[1594]: time="2025-05-13T12:58:48.048206705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-cn4wk,Uid:9ba775bd-6563-42d6-824b-2dbfad12b002,Namespace:kube-system,Attempt:0,}" May 13 12:58:48.220917 containerd[1594]: time="2025-05-13T12:58:48.219397510Z" level=error msg="Failed to destroy network for sandbox \"9e350f9cdc5ba4694c5061d677a0c2a44ecbeb9edfe1650db331224b52588728\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.221184 containerd[1594]: time="2025-05-13T12:58:48.221142887Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-647bb7c459-594xm,Uid:c5c7ee4c-244c-48f3-844f-9864efd99cd6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e350f9cdc5ba4694c5061d677a0c2a44ecbeb9edfe1650db331224b52588728\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.221437 kubelet[2704]: E0513 12:58:48.221394 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e350f9cdc5ba4694c5061d677a0c2a44ecbeb9edfe1650db331224b52588728\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.221535 kubelet[2704]: E0513 12:58:48.221458 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e350f9cdc5ba4694c5061d677a0c2a44ecbeb9edfe1650db331224b52588728\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" May 13 12:58:48.221535 kubelet[2704]: E0513 12:58:48.221479 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e350f9cdc5ba4694c5061d677a0c2a44ecbeb9edfe1650db331224b52588728\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" May 13 12:58:48.221616 kubelet[2704]: E0513 12:58:48.221523 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-647bb7c459-594xm_calico-system(c5c7ee4c-244c-48f3-844f-9864efd99cd6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-647bb7c459-594xm_calico-system(c5c7ee4c-244c-48f3-844f-9864efd99cd6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9e350f9cdc5ba4694c5061d677a0c2a44ecbeb9edfe1650db331224b52588728\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" podUID="c5c7ee4c-244c-48f3-844f-9864efd99cd6" May 13 12:58:48.231463 containerd[1594]: time="2025-05-13T12:58:48.231397176Z" level=error msg="Failed to destroy network for sandbox \"e385cd989443a35d6a62633a68d112a111d3f55e8e3a85a43a3c3521412b97c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.233725 containerd[1594]: time="2025-05-13T12:58:48.233665356Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-2r646,Uid:509974ac-8c85-45a5-a90c-6e89c9a54779,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e385cd989443a35d6a62633a68d112a111d3f55e8e3a85a43a3c3521412b97c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.233921 kubelet[2704]: E0513 12:58:48.233866 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e385cd989443a35d6a62633a68d112a111d3f55e8e3a85a43a3c3521412b97c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.233990 kubelet[2704]: 
E0513 12:58:48.233924 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e385cd989443a35d6a62633a68d112a111d3f55e8e3a85a43a3c3521412b97c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" May 13 12:58:48.233990 kubelet[2704]: E0513 12:58:48.233942 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e385cd989443a35d6a62633a68d112a111d3f55e8e3a85a43a3c3521412b97c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" May 13 12:58:48.233990 kubelet[2704]: E0513 12:58:48.233979 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d4cb57867-2r646_calico-apiserver(509974ac-8c85-45a5-a90c-6e89c9a54779)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d4cb57867-2r646_calico-apiserver(509974ac-8c85-45a5-a90c-6e89c9a54779)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e385cd989443a35d6a62633a68d112a111d3f55e8e3a85a43a3c3521412b97c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" podUID="509974ac-8c85-45a5-a90c-6e89c9a54779" May 13 12:58:48.241285 containerd[1594]: time="2025-05-13T12:58:48.241198852Z" level=error msg="Failed to destroy network for sandbox 
\"898199076ef53f6ef2deb801e5459f2cf5285911e909745645ec10e6369f7b61\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.243247 containerd[1594]: time="2025-05-13T12:58:48.243190843Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xs6bj,Uid:e7d6f99e-b52c-4124-8898-cdb6a2a240c9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"898199076ef53f6ef2deb801e5459f2cf5285911e909745645ec10e6369f7b61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.243428 kubelet[2704]: E0513 12:58:48.243382 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"898199076ef53f6ef2deb801e5459f2cf5285911e909745645ec10e6369f7b61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.243428 kubelet[2704]: E0513 12:58:48.243427 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"898199076ef53f6ef2deb801e5459f2cf5285911e909745645ec10e6369f7b61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xs6bj" May 13 12:58:48.243515 kubelet[2704]: E0513 12:58:48.243444 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"898199076ef53f6ef2deb801e5459f2cf5285911e909745645ec10e6369f7b61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xs6bj" May 13 12:58:48.243515 kubelet[2704]: E0513 12:58:48.243476 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-xs6bj_kube-system(e7d6f99e-b52c-4124-8898-cdb6a2a240c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-xs6bj_kube-system(e7d6f99e-b52c-4124-8898-cdb6a2a240c9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"898199076ef53f6ef2deb801e5459f2cf5285911e909745645ec10e6369f7b61\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-xs6bj" podUID="e7d6f99e-b52c-4124-8898-cdb6a2a240c9" May 13 12:58:48.255938 containerd[1594]: time="2025-05-13T12:58:48.255786030Z" level=error msg="Failed to destroy network for sandbox \"b0402b5c9cbb3ee49fca61c179e2a897c4b0fb1e0b48c1a8107524b273c6c273\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.258134 containerd[1594]: time="2025-05-13T12:58:48.258091782Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-c2wmg,Uid:a9d392a7-21ed-46f8-a204-24615b7b6d08,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b0402b5c9cbb3ee49fca61c179e2a897c4b0fb1e0b48c1a8107524b273c6c273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.258986 kubelet[2704]: E0513 12:58:48.258933 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b0402b5c9cbb3ee49fca61c179e2a897c4b0fb1e0b48c1a8107524b273c6c273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.259066 kubelet[2704]: E0513 12:58:48.258999 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b0402b5c9cbb3ee49fca61c179e2a897c4b0fb1e0b48c1a8107524b273c6c273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" May 13 12:58:48.259066 kubelet[2704]: E0513 12:58:48.259023 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b0402b5c9cbb3ee49fca61c179e2a897c4b0fb1e0b48c1a8107524b273c6c273\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" May 13 12:58:48.259139 kubelet[2704]: E0513 12:58:48.259071 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d4cb57867-c2wmg_calico-apiserver(a9d392a7-21ed-46f8-a204-24615b7b6d08)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d4cb57867-c2wmg_calico-apiserver(a9d392a7-21ed-46f8-a204-24615b7b6d08)\\\": rpc error: code = Unknown desc = failed to setup network for 
sandbox \\\"b0402b5c9cbb3ee49fca61c179e2a897c4b0fb1e0b48c1a8107524b273c6c273\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" podUID="a9d392a7-21ed-46f8-a204-24615b7b6d08" May 13 12:58:48.260560 containerd[1594]: time="2025-05-13T12:58:48.260510216Z" level=error msg="Failed to destroy network for sandbox \"17225b92b97c6dbba263a011fd771cd8c46ddbf840bc06537bd64f1f3931c328\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.260686 containerd[1594]: time="2025-05-13T12:58:48.260649618Z" level=error msg="Failed to destroy network for sandbox \"46603c62c06d88ddf29ad9fe8f8adbc1a6ff90957f3e63f49aa3e56bc9e80811\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.261828 containerd[1594]: time="2025-05-13T12:58:48.261730394Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7854cf7bc7-llgzn,Uid:13643b3d-f618-4901-ac95-9d9e3c0fb895,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"17225b92b97c6dbba263a011fd771cd8c46ddbf840bc06537bd64f1f3931c328\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.262077 kubelet[2704]: E0513 12:58:48.262026 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"17225b92b97c6dbba263a011fd771cd8c46ddbf840bc06537bd64f1f3931c328\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.262149 kubelet[2704]: E0513 12:58:48.262124 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17225b92b97c6dbba263a011fd771cd8c46ddbf840bc06537bd64f1f3931c328\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" May 13 12:58:48.262178 kubelet[2704]: E0513 12:58:48.262146 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17225b92b97c6dbba263a011fd771cd8c46ddbf840bc06537bd64f1f3931c328\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" May 13 12:58:48.262223 kubelet[2704]: E0513 12:58:48.262198 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7854cf7bc7-llgzn_calico-apiserver(13643b3d-f618-4901-ac95-9d9e3c0fb895)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7854cf7bc7-llgzn_calico-apiserver(13643b3d-f618-4901-ac95-9d9e3c0fb895)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"17225b92b97c6dbba263a011fd771cd8c46ddbf840bc06537bd64f1f3931c328\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" podUID="13643b3d-f618-4901-ac95-9d9e3c0fb895" May 13 12:58:48.262835 containerd[1594]: time="2025-05-13T12:58:48.262793705Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-cn4wk,Uid:9ba775bd-6563-42d6-824b-2dbfad12b002,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"46603c62c06d88ddf29ad9fe8f8adbc1a6ff90957f3e63f49aa3e56bc9e80811\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.262993 kubelet[2704]: E0513 12:58:48.262894 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46603c62c06d88ddf29ad9fe8f8adbc1a6ff90957f3e63f49aa3e56bc9e80811\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:48.262993 kubelet[2704]: E0513 12:58:48.262919 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46603c62c06d88ddf29ad9fe8f8adbc1a6ff90957f3e63f49aa3e56bc9e80811\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-cn4wk" May 13 12:58:48.262993 kubelet[2704]: E0513 12:58:48.262933 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46603c62c06d88ddf29ad9fe8f8adbc1a6ff90957f3e63f49aa3e56bc9e80811\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-cn4wk" May 13 12:58:48.263087 kubelet[2704]: E0513 12:58:48.262988 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-cn4wk_kube-system(9ba775bd-6563-42d6-824b-2dbfad12b002)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-cn4wk_kube-system(9ba775bd-6563-42d6-824b-2dbfad12b002)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"46603c62c06d88ddf29ad9fe8f8adbc1a6ff90957f3e63f49aa3e56bc9e80811\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-cn4wk" podUID="9ba775bd-6563-42d6-824b-2dbfad12b002" May 13 12:58:48.658472 containerd[1594]: time="2025-05-13T12:58:48.658437209Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 13 12:58:48.859560 systemd[1]: run-netns-cni\x2da7575da7\x2d3b3b\x2d0425\x2d1dad\x2d8a7867d167e2.mount: Deactivated successfully. May 13 12:58:50.740903 systemd[1]: Started sshd@8-10.0.0.121:22-10.0.0.1:58234.service - OpenSSH per-connection server daemon (10.0.0.1:58234). May 13 12:58:50.800680 sshd[3810]: Accepted publickey for core from 10.0.0.1 port 58234 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:58:50.802453 sshd-session[3810]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:58:50.808106 systemd-logind[1578]: New session 9 of user core. May 13 12:58:50.818510 systemd[1]: Started session-9.scope - Session 9 of User core. 
May 13 12:58:50.927638 sshd[3812]: Connection closed by 10.0.0.1 port 58234 May 13 12:58:50.927974 sshd-session[3810]: pam_unix(sshd:session): session closed for user core May 13 12:58:50.932390 systemd[1]: sshd@8-10.0.0.121:22-10.0.0.1:58234.service: Deactivated successfully. May 13 12:58:50.934494 systemd[1]: session-9.scope: Deactivated successfully. May 13 12:58:50.935458 systemd-logind[1578]: Session 9 logged out. Waiting for processes to exit. May 13 12:58:50.936911 systemd-logind[1578]: Removed session 9. May 13 12:58:55.321614 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1073170839.mount: Deactivated successfully. May 13 12:58:55.944078 systemd[1]: Started sshd@9-10.0.0.121:22-10.0.0.1:58236.service - OpenSSH per-connection server daemon (10.0.0.1:58236). May 13 12:58:55.998416 sshd[3833]: Accepted publickey for core from 10.0.0.1 port 58236 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:58:55.999832 sshd-session[3833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:58:56.004150 systemd-logind[1578]: New session 10 of user core. May 13 12:58:56.010389 systemd[1]: Started session-10.scope - Session 10 of User core. May 13 12:58:56.115855 sshd[3835]: Connection closed by 10.0.0.1 port 58236 May 13 12:58:56.116164 sshd-session[3833]: pam_unix(sshd:session): session closed for user core May 13 12:58:56.120415 systemd[1]: sshd@9-10.0.0.121:22-10.0.0.1:58236.service: Deactivated successfully. May 13 12:58:56.122411 systemd[1]: session-10.scope: Deactivated successfully. May 13 12:58:56.123215 systemd-logind[1578]: Session 10 logged out. Waiting for processes to exit. May 13 12:58:56.124463 systemd-logind[1578]: Removed session 10. 
May 13 12:58:56.736867 containerd[1594]: time="2025-05-13T12:58:56.736803152Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:56.792976 containerd[1594]: time="2025-05-13T12:58:56.792937919Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 13 12:58:56.830424 containerd[1594]: time="2025-05-13T12:58:56.830380443Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:56.866278 containerd[1594]: time="2025-05-13T12:58:56.866230665Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:58:56.866891 containerd[1594]: time="2025-05-13T12:58:56.866826204Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 8.208351335s" May 13 12:58:56.866891 containerd[1594]: time="2025-05-13T12:58:56.866877942Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 13 12:58:56.875711 containerd[1594]: time="2025-05-13T12:58:56.875665282Z" level=info msg="CreateContainer within sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 13 12:58:56.928941 containerd[1594]: time="2025-05-13T12:58:56.928882053Z" level=info msg="Container 
536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633: CDI devices from CRI Config.CDIDevices: []" May 13 12:58:56.940814 containerd[1594]: time="2025-05-13T12:58:56.940758562Z" level=info msg="CreateContainer within sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633\"" May 13 12:58:56.941403 containerd[1594]: time="2025-05-13T12:58:56.941363680Z" level=info msg="StartContainer for \"536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633\"" May 13 12:58:56.942839 containerd[1594]: time="2025-05-13T12:58:56.942810750Z" level=info msg="connecting to shim 536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633" address="unix:///run/containerd/s/786be93baebb7c9c5e268c4e62eb87fc755386413199055817f8baa37eda0865" protocol=ttrpc version=3 May 13 12:58:56.974435 systemd[1]: Started cri-containerd-536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633.scope - libcontainer container 536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633. May 13 12:58:57.087024 containerd[1594]: time="2025-05-13T12:58:57.086973884Z" level=info msg="StartContainer for \"536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633\" returns successfully" May 13 12:58:57.105945 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 13 12:58:57.107173 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 13 12:58:57.129841 systemd[1]: cri-containerd-536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633.scope: Deactivated successfully. 
May 13 12:58:57.131338 containerd[1594]: time="2025-05-13T12:58:57.131289628Z" level=info msg="received exit event container_id:\"536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633\" id:\"536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633\" pid:3863 exit_status:1 exited_at:{seconds:1747141137 nanos:130617404}" May 13 12:58:57.131512 containerd[1594]: time="2025-05-13T12:58:57.131369999Z" level=info msg="TaskExit event in podsandbox handler container_id:\"536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633\" id:\"536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633\" pid:3863 exit_status:1 exited_at:{seconds:1747141137 nanos:130617404}" May 13 12:58:57.207675 kubelet[2704]: I0513 12:58:57.207604 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-krrmx" podStartSLOduration=2.142775224 podStartE2EDuration="28.207585128s" podCreationTimestamp="2025-05-13 12:58:29 +0000 UTC" firstStartedPulling="2025-05-13 12:58:30.802960124 +0000 UTC m=+17.333588121" lastFinishedPulling="2025-05-13 12:58:56.867770028 +0000 UTC m=+43.398398025" observedRunningTime="2025-05-13 12:58:57.207108392 +0000 UTC m=+43.737736409" watchObservedRunningTime="2025-05-13 12:58:57.207585128 +0000 UTC m=+43.738213125" May 13 12:58:57.872715 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633-rootfs.mount: Deactivated successfully. 
May 13 12:58:58.052060 containerd[1594]: time="2025-05-13T12:58:58.052002390Z" level=error msg="ExecSync for \"536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633\" failed" error="rpc error: code = NotFound desc = failed to exec in container: failed to create exec \"9bae893447f89eaf6b2f38406bd559aece30b1d91413958426376fccfc18cd84\": task 536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633 not found" May 13 12:58:58.052532 kubelet[2704]: E0513 12:58:58.052222 2704 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = failed to exec in container: failed to create exec \"9bae893447f89eaf6b2f38406bd559aece30b1d91413958426376fccfc18cd84\": task 536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633 not found" containerID="536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 13 12:58:58.052630 containerd[1594]: time="2025-05-13T12:58:58.052555510Z" level=error msg="ExecSync for \"536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 13 12:58:58.052675 kubelet[2704]: E0513 12:58:58.052635 2704 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 13 12:58:58.052812 containerd[1594]: time="2025-05-13T12:58:58.052790512Z" level=error msg="ExecSync for \"536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 13 12:58:58.052892 kubelet[2704]: E0513 12:58:58.052857 2704 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 13 12:58:58.199061 kubelet[2704]: I0513 12:58:58.198944 2704 scope.go:117] "RemoveContainer" containerID="536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633" May 13 12:58:58.201092 containerd[1594]: time="2025-05-13T12:58:58.201041095Z" level=info msg="CreateContainer within sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" for container &ContainerMetadata{Name:calico-node,Attempt:1,}" May 13 12:58:58.327636 containerd[1594]: time="2025-05-13T12:58:58.327575172Z" level=info msg="Container 082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921: CDI devices from CRI Config.CDIDevices: []" May 13 12:58:58.339936 containerd[1594]: time="2025-05-13T12:58:58.339879078Z" level=info msg="CreateContainer within sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" for &ContainerMetadata{Name:calico-node,Attempt:1,} returns container id \"082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921\"" May 13 12:58:58.340785 containerd[1594]: time="2025-05-13T12:58:58.340731319Z" level=info msg="StartContainer for \"082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921\"" May 13 12:58:58.342762 containerd[1594]: time="2025-05-13T12:58:58.342730206Z" level=info msg="connecting to shim 082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921" address="unix:///run/containerd/s/786be93baebb7c9c5e268c4e62eb87fc755386413199055817f8baa37eda0865" protocol=ttrpc version=3 May 13 12:58:58.370405 systemd[1]: Started cri-containerd-082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921.scope - libcontainer container 082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921. 
May 13 12:58:58.418445 containerd[1594]: time="2025-05-13T12:58:58.418393308Z" level=info msg="StartContainer for \"082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921\" returns successfully" May 13 12:58:58.471998 systemd[1]: cri-containerd-082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921.scope: Deactivated successfully. May 13 12:58:58.473140 containerd[1594]: time="2025-05-13T12:58:58.473008170Z" level=info msg="received exit event container_id:\"082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921\" id:\"082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921\" pid:3920 exit_status:1 exited_at:{seconds:1747141138 nanos:472686685}" May 13 12:58:58.473534 containerd[1594]: time="2025-05-13T12:58:58.473234274Z" level=info msg="TaskExit event in podsandbox handler container_id:\"082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921\" id:\"082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921\" pid:3920 exit_status:1 exited_at:{seconds:1747141138 nanos:472686685}" May 13 12:58:58.499202 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921-rootfs.mount: Deactivated successfully. 
May 13 12:58:58.545875 containerd[1594]: time="2025-05-13T12:58:58.545816175Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-cn4wk,Uid:9ba775bd-6563-42d6-824b-2dbfad12b002,Namespace:kube-system,Attempt:0,}" May 13 12:58:58.610125 containerd[1594]: time="2025-05-13T12:58:58.610056587Z" level=error msg="Failed to destroy network for sandbox \"2378849413ebe16386c82e36451c9089428b16f97ed269cade6774c96b0e7348\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:58.671188 containerd[1594]: time="2025-05-13T12:58:58.671118407Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-cn4wk,Uid:9ba775bd-6563-42d6-824b-2dbfad12b002,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2378849413ebe16386c82e36451c9089428b16f97ed269cade6774c96b0e7348\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:58.671458 kubelet[2704]: E0513 12:58:58.671405 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2378849413ebe16386c82e36451c9089428b16f97ed269cade6774c96b0e7348\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:58.671809 kubelet[2704]: E0513 12:58:58.671476 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2378849413ebe16386c82e36451c9089428b16f97ed269cade6774c96b0e7348\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-cn4wk" May 13 12:58:58.671809 kubelet[2704]: E0513 12:58:58.671500 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2378849413ebe16386c82e36451c9089428b16f97ed269cade6774c96b0e7348\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-cn4wk" May 13 12:58:58.672567 kubelet[2704]: E0513 12:58:58.672515 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-cn4wk_kube-system(9ba775bd-6563-42d6-824b-2dbfad12b002)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-cn4wk_kube-system(9ba775bd-6563-42d6-824b-2dbfad12b002)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2378849413ebe16386c82e36451c9089428b16f97ed269cade6774c96b0e7348\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-cn4wk" podUID="9ba775bd-6563-42d6-824b-2dbfad12b002" May 13 12:58:59.204650 kubelet[2704]: I0513 12:58:59.204615 2704 scope.go:117] "RemoveContainer" containerID="536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633" May 13 12:58:59.204981 kubelet[2704]: I0513 12:58:59.204948 2704 scope.go:117] "RemoveContainer" containerID="082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921" May 13 12:58:59.205097 kubelet[2704]: E0513 12:58:59.205073 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-node\" with CrashLoopBackOff: \"back-off 10s restarting 
failed container=calico-node pod=calico-node-krrmx_calico-system(6f5a9517-847c-4101-8b62-b95d8d1852c2)\"" pod="calico-system/calico-node-krrmx" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" May 13 12:58:59.207332 containerd[1594]: time="2025-05-13T12:58:59.207290980Z" level=info msg="RemoveContainer for \"536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633\"" May 13 12:58:59.224616 containerd[1594]: time="2025-05-13T12:58:59.224548472Z" level=info msg="RemoveContainer for \"536d0a1fca76c20e7bde50bf89c47b30d3270aff42e9315a89ae72343094a633\" returns successfully" May 13 12:58:59.545379 containerd[1594]: time="2025-05-13T12:58:59.545297958Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-2r646,Uid:509974ac-8c85-45a5-a90c-6e89c9a54779,Namespace:calico-apiserver,Attempt:0,}" May 13 12:58:59.545533 containerd[1594]: time="2025-05-13T12:58:59.545416471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-q7cjm,Uid:7dd68750-995a-4166-a528-aef7a2785014,Namespace:calico-system,Attempt:0,}" May 13 12:58:59.709667 containerd[1594]: time="2025-05-13T12:58:59.709611363Z" level=error msg="Failed to destroy network for sandbox \"adba1b5e06193d420803d3c5ac53d0f06a8ec9eb7fbcd899a478a97b5e97f0fb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:58:59.711959 systemd[1]: run-netns-cni\x2dea2099ce\x2db1ec\x2d8846\x2da832\x2d134053af784b.mount: Deactivated successfully. 
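The CrashLoopBackOff entries above report "back-off 10s restarting failed container=calico-node"; kubelet's restart back-off starts at 10s and doubles per crash up to a 5-minute cap. A minimal sketch of that schedule (assuming the standard doubling policy, not kubelet's exact implementation):

```python
def crashloop_backoffs(initial=10, cap=300, n=6):
    """Kubelet-style restart back-off: the delay doubles after each crash
    until it reaches the cap (all values in seconds)."""
    delays, d = [], initial
    for _ in range(n):
        delays.append(min(d, cap))
        d *= 2
    return delays

# crashloop_backoffs() -> [10, 20, 40, 80, 160, 300]
```

This is why the log shows the same "back-off 10s" message on consecutive sync attempts: the pod worker skips the restart until the current delay has elapsed.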
May 13 12:59:00.142215 containerd[1594]: time="2025-05-13T12:59:00.142153530Z" level=error msg="Failed to destroy network for sandbox \"e0b6d4b40f242c04e1eda4a34781eff9cf587ce794557c84aca4c1d63dc982ce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:00.144453 systemd[1]: run-netns-cni\x2d9a4514d1\x2dbb45\x2d6992\x2d1b97\x2da04ab10819ca.mount: Deactivated successfully. May 13 12:59:00.164516 containerd[1594]: time="2025-05-13T12:59:00.164449526Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-2r646,Uid:509974ac-8c85-45a5-a90c-6e89c9a54779,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"adba1b5e06193d420803d3c5ac53d0f06a8ec9eb7fbcd899a478a97b5e97f0fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:00.164745 kubelet[2704]: E0513 12:59:00.164696 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"adba1b5e06193d420803d3c5ac53d0f06a8ec9eb7fbcd899a478a97b5e97f0fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:00.165128 kubelet[2704]: E0513 12:59:00.164757 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"adba1b5e06193d420803d3c5ac53d0f06a8ec9eb7fbcd899a478a97b5e97f0fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" May 13 12:59:00.165128 kubelet[2704]: E0513 12:59:00.164792 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"adba1b5e06193d420803d3c5ac53d0f06a8ec9eb7fbcd899a478a97b5e97f0fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" May 13 12:59:00.165128 kubelet[2704]: E0513 12:59:00.164837 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d4cb57867-2r646_calico-apiserver(509974ac-8c85-45a5-a90c-6e89c9a54779)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d4cb57867-2r646_calico-apiserver(509974ac-8c85-45a5-a90c-6e89c9a54779)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"adba1b5e06193d420803d3c5ac53d0f06a8ec9eb7fbcd899a478a97b5e97f0fb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" podUID="509974ac-8c85-45a5-a90c-6e89c9a54779" May 13 12:59:00.192945 containerd[1594]: time="2025-05-13T12:59:00.192888525Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-q7cjm,Uid:7dd68750-995a-4166-a528-aef7a2785014,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0b6d4b40f242c04e1eda4a34781eff9cf587ce794557c84aca4c1d63dc982ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 
13 12:59:00.193167 kubelet[2704]: E0513 12:59:00.193113 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0b6d4b40f242c04e1eda4a34781eff9cf587ce794557c84aca4c1d63dc982ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:00.193205 kubelet[2704]: E0513 12:59:00.193176 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0b6d4b40f242c04e1eda4a34781eff9cf587ce794557c84aca4c1d63dc982ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-q7cjm" May 13 12:59:00.193205 kubelet[2704]: E0513 12:59:00.193194 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0b6d4b40f242c04e1eda4a34781eff9cf587ce794557c84aca4c1d63dc982ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-q7cjm" May 13 12:59:00.193278 kubelet[2704]: E0513 12:59:00.193236 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-q7cjm_calico-system(7dd68750-995a-4166-a528-aef7a2785014)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-q7cjm_calico-system(7dd68750-995a-4166-a528-aef7a2785014)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e0b6d4b40f242c04e1eda4a34781eff9cf587ce794557c84aca4c1d63dc982ce\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:59:00.211829 kubelet[2704]: I0513 12:59:00.211802 2704 scope.go:117] "RemoveContainer" containerID="082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921" May 13 12:59:00.211945 kubelet[2704]: E0513 12:59:00.211927 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-node\" with CrashLoopBackOff: \"back-off 10s restarting failed container=calico-node pod=calico-node-krrmx_calico-system(6f5a9517-847c-4101-8b62-b95d8d1852c2)\"" pod="calico-system/calico-node-krrmx" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" May 13 12:59:01.132213 systemd[1]: Started sshd@10-10.0.0.121:22-10.0.0.1:53882.service - OpenSSH per-connection server daemon (10.0.0.1:53882). May 13 12:59:01.187307 sshd[4068]: Accepted publickey for core from 10.0.0.1 port 53882 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:59:01.189567 sshd-session[4068]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:59:01.204444 systemd-logind[1578]: New session 11 of user core. May 13 12:59:01.208517 systemd[1]: Started session-11.scope - Session 11 of User core. May 13 12:59:01.333126 sshd[4070]: Connection closed by 10.0.0.1 port 53882 May 13 12:59:01.333446 sshd-session[4068]: pam_unix(sshd:session): session closed for user core May 13 12:59:01.345110 systemd[1]: sshd@10-10.0.0.121:22-10.0.0.1:53882.service: Deactivated successfully. May 13 12:59:01.347041 systemd[1]: session-11.scope: Deactivated successfully. May 13 12:59:01.347910 systemd-logind[1578]: Session 11 logged out. Waiting for processes to exit. May 13 12:59:01.350939 systemd[1]: Started sshd@11-10.0.0.121:22-10.0.0.1:53894.service - OpenSSH per-connection server daemon (10.0.0.1:53894). 
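Every sandbox add and delete above fails with the same root cause: the calico CNI plugin stats `/var/lib/calico/nodename`, which calico/node writes only once it is running with that host path mounted; while calico-node is in CrashLoopBackOff, the file never appears. A minimal sketch of that existence check (hypothetical helper, run against a scratch directory rather than a live node):

```python
import os

def nodename_check(calico_dir):
    """Mimic the CNI plugin's stat of <calico_dir>/nodename: return the
    node name if calico/node has written it, else fail with the same
    guidance the log entries carry."""
    path = os.path.join(calico_dir, "nodename")
    if not os.path.exists(path):
        raise FileNotFoundError(
            f"stat {path}: no such file or directory: check that the "
            f"calico/node container is running and has mounted {calico_dir}/")
    with open(path) as f:
        return f.read().strip()
```

Once calico-node starts cleanly and writes the file, the queued sandbox retries in this log would begin to succeed.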
May 13 12:59:01.351834 systemd-logind[1578]: Removed session 11. May 13 12:59:01.400994 sshd[4085]: Accepted publickey for core from 10.0.0.1 port 53894 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:59:01.402985 sshd-session[4085]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:59:01.408693 systemd-logind[1578]: New session 12 of user core. May 13 12:59:01.417440 systemd[1]: Started session-12.scope - Session 12 of User core. May 13 12:59:01.632914 sshd[4087]: Connection closed by 10.0.0.1 port 53894 May 13 12:59:01.633307 sshd-session[4085]: pam_unix(sshd:session): session closed for user core May 13 12:59:01.644127 systemd[1]: sshd@11-10.0.0.121:22-10.0.0.1:53894.service: Deactivated successfully. May 13 12:59:01.646030 systemd[1]: session-12.scope: Deactivated successfully. May 13 12:59:01.646869 systemd-logind[1578]: Session 12 logged out. Waiting for processes to exit. May 13 12:59:01.651926 systemd[1]: Started sshd@12-10.0.0.121:22-10.0.0.1:53908.service - OpenSSH per-connection server daemon (10.0.0.1:53908). May 13 12:59:01.653554 systemd-logind[1578]: Removed session 12. May 13 12:59:01.698823 sshd[4099]: Accepted publickey for core from 10.0.0.1 port 53908 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:59:01.700127 sshd-session[4099]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:59:01.704615 systemd-logind[1578]: New session 13 of user core. May 13 12:59:01.719406 systemd[1]: Started session-13.scope - Session 13 of User core. May 13 12:59:01.835752 sshd[4101]: Connection closed by 10.0.0.1 port 53908 May 13 12:59:01.836065 sshd-session[4099]: pam_unix(sshd:session): session closed for user core May 13 12:59:01.840460 systemd[1]: sshd@12-10.0.0.121:22-10.0.0.1:53908.service: Deactivated successfully. May 13 12:59:01.842314 systemd[1]: session-13.scope: Deactivated successfully. 
May 13 12:59:01.843177 systemd-logind[1578]: Session 13 logged out. Waiting for processes to exit. May 13 12:59:01.844659 systemd-logind[1578]: Removed session 13. May 13 12:59:02.546031 containerd[1594]: time="2025-05-13T12:59:02.545746590Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xs6bj,Uid:e7d6f99e-b52c-4124-8898-cdb6a2a240c9,Namespace:kube-system,Attempt:0,}" May 13 12:59:02.546031 containerd[1594]: time="2025-05-13T12:59:02.545901161Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7854cf7bc7-llgzn,Uid:13643b3d-f618-4901-ac95-9d9e3c0fb895,Namespace:calico-apiserver,Attempt:0,}" May 13 12:59:02.546031 containerd[1594]: time="2025-05-13T12:59:02.545899077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-647bb7c459-594xm,Uid:c5c7ee4c-244c-48f3-844f-9864efd99cd6,Namespace:calico-system,Attempt:0,}" May 13 12:59:02.546510 containerd[1594]: time="2025-05-13T12:59:02.545901983Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-c2wmg,Uid:a9d392a7-21ed-46f8-a204-24615b7b6d08,Namespace:calico-apiserver,Attempt:0,}" May 13 12:59:02.626885 containerd[1594]: time="2025-05-13T12:59:02.626819200Z" level=error msg="Failed to destroy network for sandbox \"7a3661783a9815696fb339d918e99e6edfbe7205e536e5c133a7008da5d543c6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:02.630019 systemd[1]: run-netns-cni\x2db3e715c3\x2deffd\x2d5cec\x2dc900\x2d1df784ed8345.mount: Deactivated successfully. 
May 13 12:59:02.630803 containerd[1594]: time="2025-05-13T12:59:02.630203788Z" level=error msg="Failed to destroy network for sandbox \"63142233c3694656eeb15b2d4f67e8c50d20b2de7406dde16e560b4c49f614f5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:02.634200 systemd[1]: run-netns-cni\x2dadee8484\x2d1f49\x2dea37\x2db6c3\x2d0fd81f85824e.mount: Deactivated successfully. May 13 12:59:02.634849 containerd[1594]: time="2025-05-13T12:59:02.634776338Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xs6bj,Uid:e7d6f99e-b52c-4124-8898-cdb6a2a240c9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a3661783a9815696fb339d918e99e6edfbe7205e536e5c133a7008da5d543c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:02.635368 kubelet[2704]: E0513 12:59:02.635317 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a3661783a9815696fb339d918e99e6edfbe7205e536e5c133a7008da5d543c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:02.636200 kubelet[2704]: E0513 12:59:02.635818 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a3661783a9815696fb339d918e99e6edfbe7205e536e5c133a7008da5d543c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-6f6b679f8f-xs6bj" May 13 12:59:02.636200 kubelet[2704]: E0513 12:59:02.635850 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a3661783a9815696fb339d918e99e6edfbe7205e536e5c133a7008da5d543c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xs6bj" May 13 12:59:02.636200 kubelet[2704]: E0513 12:59:02.635913 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-xs6bj_kube-system(e7d6f99e-b52c-4124-8898-cdb6a2a240c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-xs6bj_kube-system(e7d6f99e-b52c-4124-8898-cdb6a2a240c9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7a3661783a9815696fb339d918e99e6edfbe7205e536e5c133a7008da5d543c6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-xs6bj" podUID="e7d6f99e-b52c-4124-8898-cdb6a2a240c9" May 13 12:59:02.636491 containerd[1594]: time="2025-05-13T12:59:02.636411981Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-647bb7c459-594xm,Uid:c5c7ee4c-244c-48f3-844f-9864efd99cd6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"63142233c3694656eeb15b2d4f67e8c50d20b2de7406dde16e560b4c49f614f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:02.637989 kubelet[2704]: E0513 12:59:02.637235 2704 
log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63142233c3694656eeb15b2d4f67e8c50d20b2de7406dde16e560b4c49f614f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:02.638051 kubelet[2704]: E0513 12:59:02.638006 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63142233c3694656eeb15b2d4f67e8c50d20b2de7406dde16e560b4c49f614f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" May 13 12:59:02.638051 kubelet[2704]: E0513 12:59:02.638036 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63142233c3694656eeb15b2d4f67e8c50d20b2de7406dde16e560b4c49f614f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" May 13 12:59:02.638487 kubelet[2704]: E0513 12:59:02.638394 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-647bb7c459-594xm_calico-system(c5c7ee4c-244c-48f3-844f-9864efd99cd6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-647bb7c459-594xm_calico-system(c5c7ee4c-244c-48f3-844f-9864efd99cd6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"63142233c3694656eeb15b2d4f67e8c50d20b2de7406dde16e560b4c49f614f5\\\": plugin type=\\\"calico\\\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" podUID="c5c7ee4c-244c-48f3-844f-9864efd99cd6" May 13 12:59:02.646245 containerd[1594]: time="2025-05-13T12:59:02.646194569Z" level=error msg="Failed to destroy network for sandbox \"d477d75e9cb72b686ffdf74938022f7310f6d0537bce0ae9a5f0271fec462715\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:02.649042 containerd[1594]: time="2025-05-13T12:59:02.648986023Z" level=error msg="Failed to destroy network for sandbox \"0ff466cfbd1243f12d3a99bfe68de5db3b3ac4ccf9c8c1c4b4f528a10f00a1d4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:02.649127 containerd[1594]: time="2025-05-13T12:59:02.649035808Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7854cf7bc7-llgzn,Uid:13643b3d-f618-4901-ac95-9d9e3c0fb895,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d477d75e9cb72b686ffdf74938022f7310f6d0537bce0ae9a5f0271fec462715\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:02.649485 kubelet[2704]: E0513 12:59:02.649442 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d477d75e9cb72b686ffdf74938022f7310f6d0537bce0ae9a5f0271fec462715\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:02.649565 kubelet[2704]: E0513 12:59:02.649503 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d477d75e9cb72b686ffdf74938022f7310f6d0537bce0ae9a5f0271fec462715\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" May 13 12:59:02.649565 kubelet[2704]: E0513 12:59:02.649523 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d477d75e9cb72b686ffdf74938022f7310f6d0537bce0ae9a5f0271fec462715\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" May 13 12:59:02.649638 kubelet[2704]: E0513 12:59:02.649602 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7854cf7bc7-llgzn_calico-apiserver(13643b3d-f618-4901-ac95-9d9e3c0fb895)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7854cf7bc7-llgzn_calico-apiserver(13643b3d-f618-4901-ac95-9d9e3c0fb895)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d477d75e9cb72b686ffdf74938022f7310f6d0537bce0ae9a5f0271fec462715\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" podUID="13643b3d-f618-4901-ac95-9d9e3c0fb895" May 13 12:59:02.649818 systemd[1]: 
run-netns-cni\x2d69a21efb\x2d26e4\x2d111b\x2d83cf\x2d05f4c13fc6f4.mount: Deactivated successfully. May 13 12:59:02.650677 containerd[1594]: time="2025-05-13T12:59:02.650642225Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-c2wmg,Uid:a9d392a7-21ed-46f8-a204-24615b7b6d08,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ff466cfbd1243f12d3a99bfe68de5db3b3ac4ccf9c8c1c4b4f528a10f00a1d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:02.650953 kubelet[2704]: E0513 12:59:02.650912 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ff466cfbd1243f12d3a99bfe68de5db3b3ac4ccf9c8c1c4b4f528a10f00a1d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:02.651073 kubelet[2704]: E0513 12:59:02.650968 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ff466cfbd1243f12d3a99bfe68de5db3b3ac4ccf9c8c1c4b4f528a10f00a1d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" May 13 12:59:02.651073 kubelet[2704]: E0513 12:59:02.650994 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ff466cfbd1243f12d3a99bfe68de5db3b3ac4ccf9c8c1c4b4f528a10f00a1d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" May 13 12:59:02.651073 kubelet[2704]: E0513 12:59:02.651044 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d4cb57867-c2wmg_calico-apiserver(a9d392a7-21ed-46f8-a204-24615b7b6d08)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d4cb57867-c2wmg_calico-apiserver(a9d392a7-21ed-46f8-a204-24615b7b6d08)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0ff466cfbd1243f12d3a99bfe68de5db3b3ac4ccf9c8c1c4b4f528a10f00a1d4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" podUID="a9d392a7-21ed-46f8-a204-24615b7b6d08" May 13 12:59:03.551099 systemd[1]: run-netns-cni\x2d3626cea5\x2dd9c6\x2db506\x2d8b74\x2dd0b11a8af739.mount: Deactivated successfully. May 13 12:59:06.848075 systemd[1]: Started sshd@13-10.0.0.121:22-10.0.0.1:53912.service - OpenSSH per-connection server daemon (10.0.0.1:53912). May 13 12:59:06.910093 sshd[4262]: Accepted publickey for core from 10.0.0.1 port 53912 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:59:06.911875 sshd-session[4262]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:59:06.917068 systemd-logind[1578]: New session 14 of user core. May 13 12:59:06.926440 systemd[1]: Started session-14.scope - Session 14 of User core. May 13 12:59:07.040670 sshd[4264]: Connection closed by 10.0.0.1 port 53912 May 13 12:59:07.041055 sshd-session[4262]: pam_unix(sshd:session): session closed for user core May 13 12:59:07.046054 systemd[1]: sshd@13-10.0.0.121:22-10.0.0.1:53912.service: Deactivated successfully. 
May 13 12:59:07.048151 systemd[1]: session-14.scope: Deactivated successfully. May 13 12:59:07.049207 systemd-logind[1578]: Session 14 logged out. Waiting for processes to exit. May 13 12:59:07.050568 systemd-logind[1578]: Removed session 14. May 13 12:59:10.547288 containerd[1594]: time="2025-05-13T12:59:10.546500930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-q7cjm,Uid:7dd68750-995a-4166-a528-aef7a2785014,Namespace:calico-system,Attempt:0,}" May 13 12:59:10.594858 containerd[1594]: time="2025-05-13T12:59:10.594800180Z" level=error msg="Failed to destroy network for sandbox \"52578a6dcca8251d48a13e07436bda45df381c0797cd4258e4b581b941dbfc80\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:10.596427 containerd[1594]: time="2025-05-13T12:59:10.596360670Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-q7cjm,Uid:7dd68750-995a-4166-a528-aef7a2785014,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"52578a6dcca8251d48a13e07436bda45df381c0797cd4258e4b581b941dbfc80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:10.596713 kubelet[2704]: E0513 12:59:10.596668 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52578a6dcca8251d48a13e07436bda45df381c0797cd4258e4b581b941dbfc80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:10.597167 kubelet[2704]: E0513 12:59:10.596750 2704 kuberuntime_sandbox.go:72] "Failed to create 
sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52578a6dcca8251d48a13e07436bda45df381c0797cd4258e4b581b941dbfc80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-q7cjm" May 13 12:59:10.597167 kubelet[2704]: E0513 12:59:10.596775 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52578a6dcca8251d48a13e07436bda45df381c0797cd4258e4b581b941dbfc80\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-q7cjm" May 13 12:59:10.597167 kubelet[2704]: E0513 12:59:10.596826 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-q7cjm_calico-system(7dd68750-995a-4166-a528-aef7a2785014)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-q7cjm_calico-system(7dd68750-995a-4166-a528-aef7a2785014)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"52578a6dcca8251d48a13e07436bda45df381c0797cd4258e4b581b941dbfc80\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:59:10.597003 systemd[1]: run-netns-cni\x2de25b4eb0\x2db35f\x2deddb\x2d0834\x2d8f58547dd2e8.mount: Deactivated successfully. May 13 12:59:12.052737 systemd[1]: Started sshd@14-10.0.0.121:22-10.0.0.1:55548.service - OpenSSH per-connection server daemon (10.0.0.1:55548). 
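The `run-netns-cni\x2d...mount` units that systemd deactivates after each failed sandbox are escaped mount-unit names: systemd encodes `/` as `-` and escapes literal `-` (and other bytes) in path components as `\xNN`. A small decoder for recovering the original netns name from such a unit (a sketch of the `\xNN` unescaping only, not full `systemd-escape` semantics):

```python
def systemd_unescape(name):
    """Undo systemd's \\xNN escaping in a unit-name component, e.g.
    run-netns-cni\\x2dea2099ce... -> run-netns-cni-ea2099ce..."""
    out, i = [], 0
    while i < len(name):
        # a literal backslash, 'x', and two hex digits encode one byte
        if name.startswith("\\x", i) and i + 4 <= len(name):
            out.append(chr(int(name[i + 2:i + 4], 16)))
            i += 4
        else:
            out.append(name[i])
            i += 1
    return "".join(out)
```

Applied to the units in this log, the decoded names match the CNI-created network namespaces under `/var/run/netns/` (e.g. `cni-ea2099ce-b1ec-8846-a832-134053af784b`), confirming these entries are just cleanup of the namespaces left behind by the failed sandbox setups.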
May 13 12:59:12.104747 sshd[4315]: Accepted publickey for core from 10.0.0.1 port 55548 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:59:12.106095 sshd-session[4315]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:59:12.110326 systemd-logind[1578]: New session 15 of user core. May 13 12:59:12.118383 systemd[1]: Started session-15.scope - Session 15 of User core. May 13 12:59:12.221399 sshd[4317]: Connection closed by 10.0.0.1 port 55548 May 13 12:59:12.221683 sshd-session[4315]: pam_unix(sshd:session): session closed for user core May 13 12:59:12.225961 systemd[1]: sshd@14-10.0.0.121:22-10.0.0.1:55548.service: Deactivated successfully. May 13 12:59:12.228243 systemd[1]: session-15.scope: Deactivated successfully. May 13 12:59:12.229573 systemd-logind[1578]: Session 15 logged out. Waiting for processes to exit. May 13 12:59:12.230918 systemd-logind[1578]: Removed session 15. May 13 12:59:12.544531 kubelet[2704]: I0513 12:59:12.544504 2704 scope.go:117] "RemoveContainer" containerID="082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921" May 13 12:59:12.544927 containerd[1594]: time="2025-05-13T12:59:12.544681015Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-cn4wk,Uid:9ba775bd-6563-42d6-824b-2dbfad12b002,Namespace:kube-system,Attempt:0,}" May 13 12:59:12.546165 containerd[1594]: time="2025-05-13T12:59:12.546121279Z" level=info msg="CreateContainer within sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" for container &ContainerMetadata{Name:calico-node,Attempt:2,}" May 13 12:59:12.605816 containerd[1594]: time="2025-05-13T12:59:12.605762231Z" level=error msg="Failed to destroy network for sandbox \"f00d1c8f3cc3dde8bc1d915b15a5e1fb3a2476ab0b2236e7d7f8a14df4013533\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" May 13 12:59:12.607910 systemd[1]: run-netns-cni\x2d254a657c\x2d3f21\x2d1eea\x2d8952\x2d56be1c6aa4fc.mount: Deactivated successfully. May 13 12:59:12.730871 containerd[1594]: time="2025-05-13T12:59:12.730810140Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-cn4wk,Uid:9ba775bd-6563-42d6-824b-2dbfad12b002,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f00d1c8f3cc3dde8bc1d915b15a5e1fb3a2476ab0b2236e7d7f8a14df4013533\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:12.731120 kubelet[2704]: E0513 12:59:12.731068 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f00d1c8f3cc3dde8bc1d915b15a5e1fb3a2476ab0b2236e7d7f8a14df4013533\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:12.731169 kubelet[2704]: E0513 12:59:12.731138 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f00d1c8f3cc3dde8bc1d915b15a5e1fb3a2476ab0b2236e7d7f8a14df4013533\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-cn4wk" May 13 12:59:12.731169 kubelet[2704]: E0513 12:59:12.731159 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f00d1c8f3cc3dde8bc1d915b15a5e1fb3a2476ab0b2236e7d7f8a14df4013533\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-cn4wk" May 13 12:59:12.731230 kubelet[2704]: E0513 12:59:12.731201 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-cn4wk_kube-system(9ba775bd-6563-42d6-824b-2dbfad12b002)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-cn4wk_kube-system(9ba775bd-6563-42d6-824b-2dbfad12b002)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f00d1c8f3cc3dde8bc1d915b15a5e1fb3a2476ab0b2236e7d7f8a14df4013533\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-cn4wk" podUID="9ba775bd-6563-42d6-824b-2dbfad12b002" May 13 12:59:12.869461 containerd[1594]: time="2025-05-13T12:59:12.869350346Z" level=info msg="Container 521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de: CDI devices from CRI Config.CDIDevices: []" May 13 12:59:12.870920 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1087371543.mount: Deactivated successfully. 
May 13 12:59:13.037951 containerd[1594]: time="2025-05-13T12:59:13.037905705Z" level=info msg="CreateContainer within sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" for &ContainerMetadata{Name:calico-node,Attempt:2,} returns container id \"521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de\"" May 13 12:59:13.038686 containerd[1594]: time="2025-05-13T12:59:13.038628412Z" level=info msg="StartContainer for \"521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de\"" May 13 12:59:13.039896 containerd[1594]: time="2025-05-13T12:59:13.039867819Z" level=info msg="connecting to shim 521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de" address="unix:///run/containerd/s/786be93baebb7c9c5e268c4e62eb87fc755386413199055817f8baa37eda0865" protocol=ttrpc version=3 May 13 12:59:13.062392 systemd[1]: Started cri-containerd-521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de.scope - libcontainer container 521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de. May 13 12:59:13.106535 containerd[1594]: time="2025-05-13T12:59:13.106496502Z" level=info msg="StartContainer for \"521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de\" returns successfully" May 13 12:59:13.151809 systemd[1]: cri-containerd-521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de.scope: Deactivated successfully. 
May 13 12:59:13.152801 containerd[1594]: time="2025-05-13T12:59:13.152761859Z" level=info msg="TaskExit event in podsandbox handler container_id:\"521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de\" id:\"521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de\" pid:4379 exit_status:1 exited_at:{seconds:1747141153 nanos:152448972}" May 13 12:59:13.152801 containerd[1594]: time="2025-05-13T12:59:13.152767790Z" level=info msg="received exit event container_id:\"521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de\" id:\"521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de\" pid:4379 exit_status:1 exited_at:{seconds:1747141153 nanos:152448972}" May 13 12:59:13.266196 kubelet[2704]: I0513 12:59:13.266162 2704 scope.go:117] "RemoveContainer" containerID="082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921" May 13 12:59:13.266603 kubelet[2704]: I0513 12:59:13.266587 2704 scope.go:117] "RemoveContainer" containerID="521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de" May 13 12:59:13.266759 kubelet[2704]: E0513 12:59:13.266720 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-node\" with CrashLoopBackOff: \"back-off 20s restarting failed container=calico-node pod=calico-node-krrmx_calico-system(6f5a9517-847c-4101-8b62-b95d8d1852c2)\"" pod="calico-system/calico-node-krrmx" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" May 13 12:59:13.269509 containerd[1594]: time="2025-05-13T12:59:13.269469068Z" level=info msg="RemoveContainer for \"082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921\"" May 13 12:59:13.277865 containerd[1594]: time="2025-05-13T12:59:13.277825556Z" level=info msg="RemoveContainer for \"082a78a41e782fce7495c94a20fe6b2760c89291b007ced90e1c5b4851d28921\" returns successfully" May 13 12:59:13.545594 containerd[1594]: time="2025-05-13T12:59:13.545544259Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-2r646,Uid:509974ac-8c85-45a5-a90c-6e89c9a54779,Namespace:calico-apiserver,Attempt:0,}" May 13 12:59:13.551841 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de-rootfs.mount: Deactivated successfully. May 13 12:59:13.599176 containerd[1594]: time="2025-05-13T12:59:13.599117314Z" level=error msg="Failed to destroy network for sandbox \"a9f269b7e02dcbe2072c89809ce623cdda4bc9824a55701b5504a215c71bba47\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:13.601359 containerd[1594]: time="2025-05-13T12:59:13.601220272Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-2r646,Uid:509974ac-8c85-45a5-a90c-6e89c9a54779,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9f269b7e02dcbe2072c89809ce623cdda4bc9824a55701b5504a215c71bba47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:13.601587 kubelet[2704]: E0513 12:59:13.601546 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9f269b7e02dcbe2072c89809ce623cdda4bc9824a55701b5504a215c71bba47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:13.602119 kubelet[2704]: E0513 12:59:13.601614 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"a9f269b7e02dcbe2072c89809ce623cdda4bc9824a55701b5504a215c71bba47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" May 13 12:59:13.602119 kubelet[2704]: E0513 12:59:13.601636 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9f269b7e02dcbe2072c89809ce623cdda4bc9824a55701b5504a215c71bba47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" May 13 12:59:13.602119 kubelet[2704]: E0513 12:59:13.601684 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d4cb57867-2r646_calico-apiserver(509974ac-8c85-45a5-a90c-6e89c9a54779)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d4cb57867-2r646_calico-apiserver(509974ac-8c85-45a5-a90c-6e89c9a54779)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a9f269b7e02dcbe2072c89809ce623cdda4bc9824a55701b5504a215c71bba47\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" podUID="509974ac-8c85-45a5-a90c-6e89c9a54779" May 13 12:59:13.602023 systemd[1]: run-netns-cni\x2d78f3eaf0\x2dbd4a\x2dbda1\x2d9527\x2d6304c8d1e98f.mount: Deactivated successfully. 
May 13 12:59:14.544735 containerd[1594]: time="2025-05-13T12:59:14.544682861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-647bb7c459-594xm,Uid:c5c7ee4c-244c-48f3-844f-9864efd99cd6,Namespace:calico-system,Attempt:0,}" May 13 12:59:14.544852 containerd[1594]: time="2025-05-13T12:59:14.544682941Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xs6bj,Uid:e7d6f99e-b52c-4124-8898-cdb6a2a240c9,Namespace:kube-system,Attempt:0,}" May 13 12:59:14.600732 containerd[1594]: time="2025-05-13T12:59:14.600668961Z" level=error msg="Failed to destroy network for sandbox \"80673da72dc3a342feaf8170f9b72cf433f21eb05390210c86f69990604e3235\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:14.602597 containerd[1594]: time="2025-05-13T12:59:14.602560331Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-647bb7c459-594xm,Uid:c5c7ee4c-244c-48f3-844f-9864efd99cd6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"80673da72dc3a342feaf8170f9b72cf433f21eb05390210c86f69990604e3235\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:14.602802 kubelet[2704]: E0513 12:59:14.602765 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80673da72dc3a342feaf8170f9b72cf433f21eb05390210c86f69990604e3235\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:14.603096 kubelet[2704]: E0513 12:59:14.602824 2704 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80673da72dc3a342feaf8170f9b72cf433f21eb05390210c86f69990604e3235\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" May 13 12:59:14.603096 kubelet[2704]: E0513 12:59:14.602845 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80673da72dc3a342feaf8170f9b72cf433f21eb05390210c86f69990604e3235\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" May 13 12:59:14.603096 kubelet[2704]: E0513 12:59:14.602889 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-647bb7c459-594xm_calico-system(c5c7ee4c-244c-48f3-844f-9864efd99cd6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-647bb7c459-594xm_calico-system(c5c7ee4c-244c-48f3-844f-9864efd99cd6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"80673da72dc3a342feaf8170f9b72cf433f21eb05390210c86f69990604e3235\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" podUID="c5c7ee4c-244c-48f3-844f-9864efd99cd6" May 13 12:59:14.603034 systemd[1]: run-netns-cni\x2ddde65544\x2dd301\x2d8fc6\x2d6090\x2d773478ef5ef8.mount: Deactivated successfully. 
May 13 12:59:14.609344 containerd[1594]: time="2025-05-13T12:59:14.609298028Z" level=error msg="Failed to destroy network for sandbox \"da982ff75d32e8413904e68217c768c194d3ac3c4b34d272a1e5d42818b44cbf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:14.610760 containerd[1594]: time="2025-05-13T12:59:14.610721390Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xs6bj,Uid:e7d6f99e-b52c-4124-8898-cdb6a2a240c9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"da982ff75d32e8413904e68217c768c194d3ac3c4b34d272a1e5d42818b44cbf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:14.610985 kubelet[2704]: E0513 12:59:14.610930 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da982ff75d32e8413904e68217c768c194d3ac3c4b34d272a1e5d42818b44cbf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:14.611049 kubelet[2704]: E0513 12:59:14.611010 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da982ff75d32e8413904e68217c768c194d3ac3c4b34d272a1e5d42818b44cbf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xs6bj" May 13 12:59:14.611049 kubelet[2704]: E0513 12:59:14.611034 2704 kuberuntime_manager.go:1168] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da982ff75d32e8413904e68217c768c194d3ac3c4b34d272a1e5d42818b44cbf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xs6bj" May 13 12:59:14.611124 kubelet[2704]: E0513 12:59:14.611085 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-xs6bj_kube-system(e7d6f99e-b52c-4124-8898-cdb6a2a240c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-xs6bj_kube-system(e7d6f99e-b52c-4124-8898-cdb6a2a240c9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"da982ff75d32e8413904e68217c768c194d3ac3c4b34d272a1e5d42818b44cbf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-xs6bj" podUID="e7d6f99e-b52c-4124-8898-cdb6a2a240c9" May 13 12:59:14.612285 systemd[1]: run-netns-cni\x2d0be8e532\x2d211c\x2d898f\x2dd45d\x2d7ff03c549b1d.mount: Deactivated successfully. 
May 13 12:59:15.545752 containerd[1594]: time="2025-05-13T12:59:15.545683921Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7854cf7bc7-llgzn,Uid:13643b3d-f618-4901-ac95-9d9e3c0fb895,Namespace:calico-apiserver,Attempt:0,}" May 13 12:59:15.598999 containerd[1594]: time="2025-05-13T12:59:15.598935545Z" level=error msg="Failed to destroy network for sandbox \"0e9a17099901ad9e455a2002ab6a00320d7dae91a07b3033245e2f7fd6ee4fc9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:15.600445 containerd[1594]: time="2025-05-13T12:59:15.600396887Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7854cf7bc7-llgzn,Uid:13643b3d-f618-4901-ac95-9d9e3c0fb895,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e9a17099901ad9e455a2002ab6a00320d7dae91a07b3033245e2f7fd6ee4fc9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:15.600684 kubelet[2704]: E0513 12:59:15.600634 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e9a17099901ad9e455a2002ab6a00320d7dae91a07b3033245e2f7fd6ee4fc9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:15.600829 kubelet[2704]: E0513 12:59:15.600703 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e9a17099901ad9e455a2002ab6a00320d7dae91a07b3033245e2f7fd6ee4fc9\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" May 13 12:59:15.600829 kubelet[2704]: E0513 12:59:15.600724 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e9a17099901ad9e455a2002ab6a00320d7dae91a07b3033245e2f7fd6ee4fc9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" May 13 12:59:15.600829 kubelet[2704]: E0513 12:59:15.600767 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7854cf7bc7-llgzn_calico-apiserver(13643b3d-f618-4901-ac95-9d9e3c0fb895)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7854cf7bc7-llgzn_calico-apiserver(13643b3d-f618-4901-ac95-9d9e3c0fb895)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0e9a17099901ad9e455a2002ab6a00320d7dae91a07b3033245e2f7fd6ee4fc9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" podUID="13643b3d-f618-4901-ac95-9d9e3c0fb895" May 13 12:59:15.601344 systemd[1]: run-netns-cni\x2d526c6862\x2d3b36\x2d76a2\x2d9d9f\x2d4cc6cd3ae4eb.mount: Deactivated successfully. 
May 13 12:59:16.545276 containerd[1594]: time="2025-05-13T12:59:16.545216895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-c2wmg,Uid:a9d392a7-21ed-46f8-a204-24615b7b6d08,Namespace:calico-apiserver,Attempt:0,}" May 13 12:59:16.595144 containerd[1594]: time="2025-05-13T12:59:16.595090262Z" level=error msg="Failed to destroy network for sandbox \"4b2515d56781a542753f491e28e42dd07a3dc4d953f3e0b16e4ba15a1aeac849\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:16.596567 containerd[1594]: time="2025-05-13T12:59:16.596535184Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-c2wmg,Uid:a9d392a7-21ed-46f8-a204-24615b7b6d08,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b2515d56781a542753f491e28e42dd07a3dc4d953f3e0b16e4ba15a1aeac849\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:16.596808 kubelet[2704]: E0513 12:59:16.596754 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b2515d56781a542753f491e28e42dd07a3dc4d953f3e0b16e4ba15a1aeac849\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:16.597081 kubelet[2704]: E0513 12:59:16.596818 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b2515d56781a542753f491e28e42dd07a3dc4d953f3e0b16e4ba15a1aeac849\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" May 13 12:59:16.597081 kubelet[2704]: E0513 12:59:16.596837 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b2515d56781a542753f491e28e42dd07a3dc4d953f3e0b16e4ba15a1aeac849\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" May 13 12:59:16.597081 kubelet[2704]: E0513 12:59:16.596899 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d4cb57867-c2wmg_calico-apiserver(a9d392a7-21ed-46f8-a204-24615b7b6d08)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d4cb57867-c2wmg_calico-apiserver(a9d392a7-21ed-46f8-a204-24615b7b6d08)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4b2515d56781a542753f491e28e42dd07a3dc4d953f3e0b16e4ba15a1aeac849\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" podUID="a9d392a7-21ed-46f8-a204-24615b7b6d08" May 13 12:59:16.597422 systemd[1]: run-netns-cni\x2d1032751e\x2de543\x2dfa3c\x2d080d\x2d6ac150c341f4.mount: Deactivated successfully. May 13 12:59:17.245243 systemd[1]: Started sshd@15-10.0.0.121:22-10.0.0.1:55560.service - OpenSSH per-connection server daemon (10.0.0.1:55560). 
May 13 12:59:17.300116 sshd[4605]: Accepted publickey for core from 10.0.0.1 port 55560 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:59:17.301574 sshd-session[4605]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:59:17.305831 systemd-logind[1578]: New session 16 of user core. May 13 12:59:17.314391 systemd[1]: Started session-16.scope - Session 16 of User core. May 13 12:59:17.420248 sshd[4607]: Connection closed by 10.0.0.1 port 55560 May 13 12:59:17.420587 sshd-session[4605]: pam_unix(sshd:session): session closed for user core May 13 12:59:17.423512 systemd[1]: sshd@15-10.0.0.121:22-10.0.0.1:55560.service: Deactivated successfully. May 13 12:59:17.425465 systemd[1]: session-16.scope: Deactivated successfully. May 13 12:59:17.427041 systemd-logind[1578]: Session 16 logged out. Waiting for processes to exit. May 13 12:59:17.428468 systemd-logind[1578]: Removed session 16. May 13 12:59:20.484335 kubelet[2704]: I0513 12:59:20.484287 2704 scope.go:117] "RemoveContainer" containerID="521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de" May 13 12:59:20.484737 kubelet[2704]: E0513 12:59:20.484425 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-node\" with CrashLoopBackOff: \"back-off 20s restarting failed container=calico-node pod=calico-node-krrmx_calico-system(6f5a9517-847c-4101-8b62-b95d8d1852c2)\"" pod="calico-system/calico-node-krrmx" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" May 13 12:59:22.432135 systemd[1]: Started sshd@16-10.0.0.121:22-10.0.0.1:40090.service - OpenSSH per-connection server daemon (10.0.0.1:40090). 
May 13 12:59:22.471242 sshd[4622]: Accepted publickey for core from 10.0.0.1 port 40090 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:59:22.472959 sshd-session[4622]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:59:22.477318 systemd-logind[1578]: New session 17 of user core. May 13 12:59:22.486456 systemd[1]: Started session-17.scope - Session 17 of User core. May 13 12:59:22.545330 containerd[1594]: time="2025-05-13T12:59:22.545283909Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-q7cjm,Uid:7dd68750-995a-4166-a528-aef7a2785014,Namespace:calico-system,Attempt:0,}" May 13 12:59:22.604418 sshd[4624]: Connection closed by 10.0.0.1 port 40090 May 13 12:59:22.604714 sshd-session[4622]: pam_unix(sshd:session): session closed for user core May 13 12:59:22.609003 systemd[1]: sshd@16-10.0.0.121:22-10.0.0.1:40090.service: Deactivated successfully. May 13 12:59:22.609130 containerd[1594]: time="2025-05-13T12:59:22.609090243Z" level=error msg="Failed to destroy network for sandbox \"ab9c89f19b449c0a98a75aa90a9b0ff9d9331670d7757822d12f5f1705a009c9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:22.610782 containerd[1594]: time="2025-05-13T12:59:22.610719360Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-q7cjm,Uid:7dd68750-995a-4166-a528-aef7a2785014,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab9c89f19b449c0a98a75aa90a9b0ff9d9331670d7757822d12f5f1705a009c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:22.611050 kubelet[2704]: E0513 12:59:22.610996 2704 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab9c89f19b449c0a98a75aa90a9b0ff9d9331670d7757822d12f5f1705a009c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:22.611430 kubelet[2704]: E0513 12:59:22.611076 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab9c89f19b449c0a98a75aa90a9b0ff9d9331670d7757822d12f5f1705a009c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-q7cjm" May 13 12:59:22.611430 kubelet[2704]: E0513 12:59:22.611095 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab9c89f19b449c0a98a75aa90a9b0ff9d9331670d7757822d12f5f1705a009c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-q7cjm" May 13 12:59:22.611430 kubelet[2704]: E0513 12:59:22.611134 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-q7cjm_calico-system(7dd68750-995a-4166-a528-aef7a2785014)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-q7cjm_calico-system(7dd68750-995a-4166-a528-aef7a2785014)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ab9c89f19b449c0a98a75aa90a9b0ff9d9331670d7757822d12f5f1705a009c9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-q7cjm" podUID="7dd68750-995a-4166-a528-aef7a2785014" May 13 12:59:22.612639 systemd[1]: run-netns-cni\x2d8ecb65ee\x2dd81f\x2dd44b\x2dc993\x2de110504cbdba.mount: Deactivated successfully. May 13 12:59:22.613623 systemd[1]: session-17.scope: Deactivated successfully. May 13 12:59:22.614352 systemd-logind[1578]: Session 17 logged out. Waiting for processes to exit. May 13 12:59:22.615495 systemd-logind[1578]: Removed session 17. May 13 12:59:26.544902 containerd[1594]: time="2025-05-13T12:59:26.544855707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-647bb7c459-594xm,Uid:c5c7ee4c-244c-48f3-844f-9864efd99cd6,Namespace:calico-system,Attempt:0,}" May 13 12:59:26.589606 containerd[1594]: time="2025-05-13T12:59:26.589539580Z" level=error msg="Failed to destroy network for sandbox \"a03fd35c032cc93ba7a70672b7d8a9d32cf01fc922cd38c7199c74f2df6e2c9b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:26.591027 containerd[1594]: time="2025-05-13T12:59:26.590979170Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-647bb7c459-594xm,Uid:c5c7ee4c-244c-48f3-844f-9864efd99cd6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a03fd35c032cc93ba7a70672b7d8a9d32cf01fc922cd38c7199c74f2df6e2c9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:26.591243 kubelet[2704]: E0513 12:59:26.591201 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"a03fd35c032cc93ba7a70672b7d8a9d32cf01fc922cd38c7199c74f2df6e2c9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:26.591785 kubelet[2704]: E0513 12:59:26.591383 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a03fd35c032cc93ba7a70672b7d8a9d32cf01fc922cd38c7199c74f2df6e2c9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" May 13 12:59:26.591785 kubelet[2704]: E0513 12:59:26.591402 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a03fd35c032cc93ba7a70672b7d8a9d32cf01fc922cd38c7199c74f2df6e2c9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" May 13 12:59:26.591785 kubelet[2704]: E0513 12:59:26.591445 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-647bb7c459-594xm_calico-system(c5c7ee4c-244c-48f3-844f-9864efd99cd6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-647bb7c459-594xm_calico-system(c5c7ee4c-244c-48f3-844f-9864efd99cd6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a03fd35c032cc93ba7a70672b7d8a9d32cf01fc922cd38c7199c74f2df6e2c9b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" podUID="c5c7ee4c-244c-48f3-844f-9864efd99cd6" May 13 12:59:26.592024 systemd[1]: run-netns-cni\x2d5ed00ee7\x2d4dd1\x2dce67\x2d8bb3\x2d81e033d9c221.mount: Deactivated successfully. May 13 12:59:27.544993 containerd[1594]: time="2025-05-13T12:59:27.544698319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-2r646,Uid:509974ac-8c85-45a5-a90c-6e89c9a54779,Namespace:calico-apiserver,Attempt:0,}" May 13 12:59:27.544993 containerd[1594]: time="2025-05-13T12:59:27.544888857Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-cn4wk,Uid:9ba775bd-6563-42d6-824b-2dbfad12b002,Namespace:kube-system,Attempt:0,}" May 13 12:59:27.599043 containerd[1594]: time="2025-05-13T12:59:27.598984634Z" level=error msg="Failed to destroy network for sandbox \"e837cb9d797b292e13878cc2e98bc8c5673277a68eb5202f8118eb7d372666ec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:27.601292 containerd[1594]: time="2025-05-13T12:59:27.600908576Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-2r646,Uid:509974ac-8c85-45a5-a90c-6e89c9a54779,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e837cb9d797b292e13878cc2e98bc8c5673277a68eb5202f8118eb7d372666ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:27.601404 kubelet[2704]: E0513 12:59:27.601322 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e837cb9d797b292e13878cc2e98bc8c5673277a68eb5202f8118eb7d372666ec\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:27.601723 kubelet[2704]: E0513 12:59:27.601419 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e837cb9d797b292e13878cc2e98bc8c5673277a68eb5202f8118eb7d372666ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" May 13 12:59:27.601723 kubelet[2704]: E0513 12:59:27.601473 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e837cb9d797b292e13878cc2e98bc8c5673277a68eb5202f8118eb7d372666ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" May 13 12:59:27.601798 containerd[1594]: time="2025-05-13T12:59:27.601736043Z" level=error msg="Failed to destroy network for sandbox \"43799683fbba3812e0b34c62420822d993bb69705bba2a7ed3abe94c6bd8333f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:27.601864 systemd[1]: run-netns-cni\x2d268eb630\x2d48c7\x2d9c53\x2d0afd\x2de668b45310e1.mount: Deactivated successfully. 
May 13 12:59:27.602150 kubelet[2704]: E0513 12:59:27.601569 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d4cb57867-2r646_calico-apiserver(509974ac-8c85-45a5-a90c-6e89c9a54779)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d4cb57867-2r646_calico-apiserver(509974ac-8c85-45a5-a90c-6e89c9a54779)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e837cb9d797b292e13878cc2e98bc8c5673277a68eb5202f8118eb7d372666ec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" podUID="509974ac-8c85-45a5-a90c-6e89c9a54779" May 13 12:59:27.604669 containerd[1594]: time="2025-05-13T12:59:27.604446613Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-cn4wk,Uid:9ba775bd-6563-42d6-824b-2dbfad12b002,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"43799683fbba3812e0b34c62420822d993bb69705bba2a7ed3abe94c6bd8333f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:27.604558 systemd[1]: run-netns-cni\x2d6f558c15\x2d2b52\x2d3c7d\x2deaef\x2dda838639a854.mount: Deactivated successfully. 
May 13 12:59:27.604861 kubelet[2704]: E0513 12:59:27.604666 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43799683fbba3812e0b34c62420822d993bb69705bba2a7ed3abe94c6bd8333f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:27.604861 kubelet[2704]: E0513 12:59:27.604708 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43799683fbba3812e0b34c62420822d993bb69705bba2a7ed3abe94c6bd8333f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-cn4wk" May 13 12:59:27.604861 kubelet[2704]: E0513 12:59:27.604727 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43799683fbba3812e0b34c62420822d993bb69705bba2a7ed3abe94c6bd8333f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-cn4wk" May 13 12:59:27.604964 kubelet[2704]: E0513 12:59:27.604763 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-cn4wk_kube-system(9ba775bd-6563-42d6-824b-2dbfad12b002)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-cn4wk_kube-system(9ba775bd-6563-42d6-824b-2dbfad12b002)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"43799683fbba3812e0b34c62420822d993bb69705bba2a7ed3abe94c6bd8333f\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-cn4wk" podUID="9ba775bd-6563-42d6-824b-2dbfad12b002" May 13 12:59:27.616634 systemd[1]: Started sshd@17-10.0.0.121:22-10.0.0.1:40094.service - OpenSSH per-connection server daemon (10.0.0.1:40094). May 13 12:59:27.652451 sshd[4793]: Accepted publickey for core from 10.0.0.1 port 40094 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:59:27.653654 sshd-session[4793]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:59:27.657591 systemd-logind[1578]: New session 18 of user core. May 13 12:59:27.668372 systemd[1]: Started session-18.scope - Session 18 of User core. May 13 12:59:27.771527 sshd[4795]: Connection closed by 10.0.0.1 port 40094 May 13 12:59:27.771888 sshd-session[4793]: pam_unix(sshd:session): session closed for user core May 13 12:59:27.775883 systemd[1]: sshd@17-10.0.0.121:22-10.0.0.1:40094.service: Deactivated successfully. May 13 12:59:27.777630 systemd[1]: session-18.scope: Deactivated successfully. May 13 12:59:27.778342 systemd-logind[1578]: Session 18 logged out. Waiting for processes to exit. May 13 12:59:27.779523 systemd-logind[1578]: Removed session 18. 
May 13 12:59:28.545134 containerd[1594]: time="2025-05-13T12:59:28.545090234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xs6bj,Uid:e7d6f99e-b52c-4124-8898-cdb6a2a240c9,Namespace:kube-system,Attempt:0,}" May 13 12:59:28.545565 containerd[1594]: time="2025-05-13T12:59:28.545090264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-c2wmg,Uid:a9d392a7-21ed-46f8-a204-24615b7b6d08,Namespace:calico-apiserver,Attempt:0,}" May 13 12:59:28.599132 containerd[1594]: time="2025-05-13T12:59:28.599085488Z" level=error msg="Failed to destroy network for sandbox \"4177a642253c5c97164aa85d55cc959b0b1189bd4fbfbb35210d12a4c1f80cbc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:28.599741 containerd[1594]: time="2025-05-13T12:59:28.599636641Z" level=error msg="Failed to destroy network for sandbox \"d642a25b37917b8004ad73e5fb7d8b20f7183e6b101c560c1b1cfb2f257521e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:28.601360 containerd[1594]: time="2025-05-13T12:59:28.601304647Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xs6bj,Uid:e7d6f99e-b52c-4124-8898-cdb6a2a240c9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4177a642253c5c97164aa85d55cc959b0b1189bd4fbfbb35210d12a4c1f80cbc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:28.601763 systemd[1]: run-netns-cni\x2d2af2ba5a\x2d87e7\x2d6bda\x2db794\x2d994e7bfcb3a4.mount: Deactivated successfully. 
May 13 12:59:28.601963 kubelet[2704]: E0513 12:59:28.601915 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4177a642253c5c97164aa85d55cc959b0b1189bd4fbfbb35210d12a4c1f80cbc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:28.602288 kubelet[2704]: E0513 12:59:28.601991 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4177a642253c5c97164aa85d55cc959b0b1189bd4fbfbb35210d12a4c1f80cbc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xs6bj" May 13 12:59:28.602288 kubelet[2704]: E0513 12:59:28.602010 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4177a642253c5c97164aa85d55cc959b0b1189bd4fbfbb35210d12a4c1f80cbc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xs6bj" May 13 12:59:28.602353 kubelet[2704]: E0513 12:59:28.602148 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-xs6bj_kube-system(e7d6f99e-b52c-4124-8898-cdb6a2a240c9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-xs6bj_kube-system(e7d6f99e-b52c-4124-8898-cdb6a2a240c9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4177a642253c5c97164aa85d55cc959b0b1189bd4fbfbb35210d12a4c1f80cbc\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-xs6bj" podUID="e7d6f99e-b52c-4124-8898-cdb6a2a240c9" May 13 12:59:28.602474 containerd[1594]: time="2025-05-13T12:59:28.602432652Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-c2wmg,Uid:a9d392a7-21ed-46f8-a204-24615b7b6d08,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d642a25b37917b8004ad73e5fb7d8b20f7183e6b101c560c1b1cfb2f257521e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:28.602616 systemd[1]: run-netns-cni\x2d67b6ba18\x2d19c6\x2dfd33\x2de3d5\x2d2f70482b768a.mount: Deactivated successfully. May 13 12:59:28.602803 kubelet[2704]: E0513 12:59:28.602628 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d642a25b37917b8004ad73e5fb7d8b20f7183e6b101c560c1b1cfb2f257521e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:28.602803 kubelet[2704]: E0513 12:59:28.602656 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d642a25b37917b8004ad73e5fb7d8b20f7183e6b101c560c1b1cfb2f257521e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" May 13 12:59:28.602803 kubelet[2704]: E0513 12:59:28.602671 2704 kuberuntime_manager.go:1168] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d642a25b37917b8004ad73e5fb7d8b20f7183e6b101c560c1b1cfb2f257521e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" May 13 12:59:28.602910 kubelet[2704]: E0513 12:59:28.602701 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d4cb57867-c2wmg_calico-apiserver(a9d392a7-21ed-46f8-a204-24615b7b6d08)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d4cb57867-c2wmg_calico-apiserver(a9d392a7-21ed-46f8-a204-24615b7b6d08)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d642a25b37917b8004ad73e5fb7d8b20f7183e6b101c560c1b1cfb2f257521e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" podUID="a9d392a7-21ed-46f8-a204-24615b7b6d08" May 13 12:59:29.545770 containerd[1594]: time="2025-05-13T12:59:29.545714090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7854cf7bc7-llgzn,Uid:13643b3d-f618-4901-ac95-9d9e3c0fb895,Namespace:calico-apiserver,Attempt:0,}" May 13 12:59:29.608229 containerd[1594]: time="2025-05-13T12:59:29.608163026Z" level=error msg="Failed to destroy network for sandbox \"da69882fe272488b012f7fa269a86adbffe8b36c8e12164717f90741453024a5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:29.609525 containerd[1594]: time="2025-05-13T12:59:29.609485504Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7854cf7bc7-llgzn,Uid:13643b3d-f618-4901-ac95-9d9e3c0fb895,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"da69882fe272488b012f7fa269a86adbffe8b36c8e12164717f90741453024a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:29.610170 kubelet[2704]: E0513 12:59:29.609730 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da69882fe272488b012f7fa269a86adbffe8b36c8e12164717f90741453024a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 12:59:29.610170 kubelet[2704]: E0513 12:59:29.609801 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da69882fe272488b012f7fa269a86adbffe8b36c8e12164717f90741453024a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" May 13 12:59:29.610170 kubelet[2704]: E0513 12:59:29.609826 2704 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da69882fe272488b012f7fa269a86adbffe8b36c8e12164717f90741453024a5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" May 13 12:59:29.610644 kubelet[2704]: E0513 
12:59:29.609886 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7854cf7bc7-llgzn_calico-apiserver(13643b3d-f618-4901-ac95-9d9e3c0fb895)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7854cf7bc7-llgzn_calico-apiserver(13643b3d-f618-4901-ac95-9d9e3c0fb895)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"da69882fe272488b012f7fa269a86adbffe8b36c8e12164717f90741453024a5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" podUID="13643b3d-f618-4901-ac95-9d9e3c0fb895" May 13 12:59:29.611527 systemd[1]: run-netns-cni\x2df4b8278b\x2d9ab8\x2d89d4\x2dc594\x2db10a3ba0eefa.mount: Deactivated successfully. May 13 12:59:30.135086 containerd[1594]: time="2025-05-13T12:59:30.133631388Z" level=info msg="StopPodSandbox for \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\"" May 13 12:59:30.143283 containerd[1594]: time="2025-05-13T12:59:30.141868317Z" level=info msg="Container to stop \"6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 13 12:59:30.143283 containerd[1594]: time="2025-05-13T12:59:30.141920377Z" level=info msg="Container to stop \"197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 13 12:59:30.143283 containerd[1594]: time="2025-05-13T12:59:30.141930446Z" level=info msg="Container to stop \"521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 13 12:59:30.164819 systemd[1]: cri-containerd-b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe.scope: 
Deactivated successfully. May 13 12:59:30.168695 containerd[1594]: time="2025-05-13T12:59:30.167186501Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" id:\"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" pid:3276 exit_status:137 exited_at:{seconds:1747141170 nanos:166672250}" May 13 12:59:30.203888 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe-rootfs.mount: Deactivated successfully. May 13 12:59:30.215071 containerd[1594]: time="2025-05-13T12:59:30.215024481Z" level=info msg="shim disconnected" id=b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe namespace=k8s.io May 13 12:59:30.215452 containerd[1594]: time="2025-05-13T12:59:30.215061021Z" level=warning msg="cleaning up after shim disconnected" id=b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe namespace=k8s.io May 13 12:59:30.220615 containerd[1594]: time="2025-05-13T12:59:30.215449038Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 13 12:59:30.253130 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe-shm.mount: Deactivated successfully. 
May 13 12:59:30.278594 containerd[1594]: time="2025-05-13T12:59:30.278517542Z" level=info msg="TearDown network for sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" successfully" May 13 12:59:30.278594 containerd[1594]: time="2025-05-13T12:59:30.278569362Z" level=info msg="StopPodSandbox for \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" returns successfully" May 13 12:59:30.291391 containerd[1594]: time="2025-05-13T12:59:30.291341963Z" level=info msg="received exit event sandbox_id:\"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" exit_status:137 exited_at:{seconds:1747141170 nanos:166672250}" May 13 12:59:30.301192 kubelet[2704]: I0513 12:59:30.301167 2704 scope.go:117] "RemoveContainer" containerID="521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de" May 13 12:59:30.303830 containerd[1594]: time="2025-05-13T12:59:30.303790528Z" level=info msg="RemoveContainer for \"521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de\"" May 13 12:59:30.409613 kubelet[2704]: I0513 12:59:30.409478 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-lib-modules\") pod \"6f5a9517-847c-4101-8b62-b95d8d1852c2\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " May 13 12:59:30.409613 kubelet[2704]: I0513 12:59:30.409521 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-cni-net-dir\") pod \"6f5a9517-847c-4101-8b62-b95d8d1852c2\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " May 13 12:59:30.409613 kubelet[2704]: I0513 12:59:30.409537 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-flexvol-driver-host\") pod 
\"6f5a9517-847c-4101-8b62-b95d8d1852c2\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " May 13 12:59:30.409613 kubelet[2704]: I0513 12:59:30.409562 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-var-run-calico\") pod \"6f5a9517-847c-4101-8b62-b95d8d1852c2\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " May 13 12:59:30.409613 kubelet[2704]: I0513 12:59:30.409584 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f5a9517-847c-4101-8b62-b95d8d1852c2-tigera-ca-bundle\") pod \"6f5a9517-847c-4101-8b62-b95d8d1852c2\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " May 13 12:59:30.409613 kubelet[2704]: I0513 12:59:30.409598 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-policysync\") pod \"6f5a9517-847c-4101-8b62-b95d8d1852c2\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " May 13 12:59:30.409840 kubelet[2704]: I0513 12:59:30.409614 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-cni-log-dir\") pod \"6f5a9517-847c-4101-8b62-b95d8d1852c2\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " May 13 12:59:30.409840 kubelet[2704]: I0513 12:59:30.409631 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-cni-bin-dir\") pod \"6f5a9517-847c-4101-8b62-b95d8d1852c2\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " May 13 12:59:30.409840 kubelet[2704]: I0513 12:59:30.409643 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" 
(UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-xtables-lock\") pod \"6f5a9517-847c-4101-8b62-b95d8d1852c2\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " May 13 12:59:30.409840 kubelet[2704]: I0513 12:59:30.409632 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "6f5a9517-847c-4101-8b62-b95d8d1852c2" (UID: "6f5a9517-847c-4101-8b62-b95d8d1852c2"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 13 12:59:30.409840 kubelet[2704]: I0513 12:59:30.409654 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "6f5a9517-847c-4101-8b62-b95d8d1852c2" (UID: "6f5a9517-847c-4101-8b62-b95d8d1852c2"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 13 12:59:30.409960 kubelet[2704]: I0513 12:59:30.409687 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "6f5a9517-847c-4101-8b62-b95d8d1852c2" (UID: "6f5a9517-847c-4101-8b62-b95d8d1852c2"). InnerVolumeSpecName "var-lib-calico". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" May 13 12:59:30.409960 kubelet[2704]: I0513 12:59:30.409656 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-var-lib-calico\") pod \"6f5a9517-847c-4101-8b62-b95d8d1852c2\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " May 13 12:59:30.409960 kubelet[2704]: I0513 12:59:30.409694 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "6f5a9517-847c-4101-8b62-b95d8d1852c2" (UID: "6f5a9517-847c-4101-8b62-b95d8d1852c2"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 13 12:59:30.409960 kubelet[2704]: I0513 12:59:30.409715 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "6f5a9517-847c-4101-8b62-b95d8d1852c2" (UID: "6f5a9517-847c-4101-8b62-b95d8d1852c2"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 13 12:59:30.409960 kubelet[2704]: I0513 12:59:30.409718 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "6f5a9517-847c-4101-8b62-b95d8d1852c2" (UID: "6f5a9517-847c-4101-8b62-b95d8d1852c2"). InnerVolumeSpecName "cni-log-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" May 13 12:59:30.410120 kubelet[2704]: I0513 12:59:30.409733 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "6f5a9517-847c-4101-8b62-b95d8d1852c2" (UID: "6f5a9517-847c-4101-8b62-b95d8d1852c2"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 13 12:59:30.410120 kubelet[2704]: I0513 12:59:30.409738 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6f5a9517-847c-4101-8b62-b95d8d1852c2-node-certs\") pod \"6f5a9517-847c-4101-8b62-b95d8d1852c2\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " May 13 12:59:30.410120 kubelet[2704]: I0513 12:59:30.409749 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "6f5a9517-847c-4101-8b62-b95d8d1852c2" (UID: "6f5a9517-847c-4101-8b62-b95d8d1852c2"). InnerVolumeSpecName "xtables-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" May 13 12:59:30.410120 kubelet[2704]: I0513 12:59:30.409765 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsjq8\" (UniqueName: \"kubernetes.io/projected/6f5a9517-847c-4101-8b62-b95d8d1852c2-kube-api-access-dsjq8\") pod \"6f5a9517-847c-4101-8b62-b95d8d1852c2\" (UID: \"6f5a9517-847c-4101-8b62-b95d8d1852c2\") " May 13 12:59:30.410120 kubelet[2704]: I0513 12:59:30.409841 2704 reconciler_common.go:288] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-cni-net-dir\") on node \"localhost\" DevicePath \"\"" May 13 12:59:30.410120 kubelet[2704]: I0513 12:59:30.409852 2704 reconciler_common.go:288] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-flexvol-driver-host\") on node \"localhost\" DevicePath \"\"" May 13 12:59:30.410274 kubelet[2704]: I0513 12:59:30.409860 2704 reconciler_common.go:288] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-var-run-calico\") on node \"localhost\" DevicePath \"\"" May 13 12:59:30.410274 kubelet[2704]: I0513 12:59:30.409870 2704 reconciler_common.go:288] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-cni-log-dir\") on node \"localhost\" DevicePath \"\"" May 13 12:59:30.410274 kubelet[2704]: I0513 12:59:30.409878 2704 reconciler_common.go:288] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-cni-bin-dir\") on node \"localhost\" DevicePath \"\"" May 13 12:59:30.410274 kubelet[2704]: I0513 12:59:30.409885 2704 reconciler_common.go:288] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-xtables-lock\") on 
node \"localhost\" DevicePath \"\"" May 13 12:59:30.410274 kubelet[2704]: I0513 12:59:30.409892 2704 reconciler_common.go:288] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-var-lib-calico\") on node \"localhost\" DevicePath \"\"" May 13 12:59:30.410274 kubelet[2704]: I0513 12:59:30.409899 2704 reconciler_common.go:288] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-lib-modules\") on node \"localhost\" DevicePath \"\"" May 13 12:59:30.410274 kubelet[2704]: I0513 12:59:30.409768 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-policysync" (OuterVolumeSpecName: "policysync") pod "6f5a9517-847c-4101-8b62-b95d8d1852c2" (UID: "6f5a9517-847c-4101-8b62-b95d8d1852c2"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 13 12:59:30.413404 kubelet[2704]: I0513 12:59:30.413374 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f5a9517-847c-4101-8b62-b95d8d1852c2-kube-api-access-dsjq8" (OuterVolumeSpecName: "kube-api-access-dsjq8") pod "6f5a9517-847c-4101-8b62-b95d8d1852c2" (UID: "6f5a9517-847c-4101-8b62-b95d8d1852c2"). InnerVolumeSpecName "kube-api-access-dsjq8". PluginName "kubernetes.io/projected", VolumeGidValue "" May 13 12:59:30.413787 kubelet[2704]: I0513 12:59:30.413756 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f5a9517-847c-4101-8b62-b95d8d1852c2-node-certs" (OuterVolumeSpecName: "node-certs") pod "6f5a9517-847c-4101-8b62-b95d8d1852c2" (UID: "6f5a9517-847c-4101-8b62-b95d8d1852c2"). InnerVolumeSpecName "node-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" May 13 12:59:30.415105 systemd[1]: var-lib-kubelet-pods-6f5a9517\x2d847c\x2d4101\x2d8b62\x2db95d8d1852c2-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddsjq8.mount: Deactivated successfully. May 13 12:59:30.415211 systemd[1]: var-lib-kubelet-pods-6f5a9517\x2d847c\x2d4101\x2d8b62\x2db95d8d1852c2-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. May 13 12:59:30.415581 kubelet[2704]: I0513 12:59:30.415538 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5a9517-847c-4101-8b62-b95d8d1852c2-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "6f5a9517-847c-4101-8b62-b95d8d1852c2" (UID: "6f5a9517-847c-4101-8b62-b95d8d1852c2"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" May 13 12:59:30.450318 containerd[1594]: time="2025-05-13T12:59:30.450240062Z" level=info msg="RemoveContainer for \"521169d77e5d780607da38ff6134bc048efc6b446945036941caf9f3bdc143de\" returns successfully" May 13 12:59:30.450541 kubelet[2704]: I0513 12:59:30.450503 2704 scope.go:117] "RemoveContainer" containerID="197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f" May 13 12:59:30.452907 containerd[1594]: time="2025-05-13T12:59:30.452871058Z" level=info msg="RemoveContainer for \"197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f\"" May 13 12:59:30.510360 kubelet[2704]: I0513 12:59:30.510312 2704 reconciler_common.go:288] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6f5a9517-847c-4101-8b62-b95d8d1852c2-node-certs\") on node \"localhost\" DevicePath \"\"" May 13 12:59:30.510360 kubelet[2704]: I0513 12:59:30.510345 2704 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-dsjq8\" (UniqueName: \"kubernetes.io/projected/6f5a9517-847c-4101-8b62-b95d8d1852c2-kube-api-access-dsjq8\") on node \"localhost\" DevicePath \"\"" 
May 13 12:59:30.510360 kubelet[2704]: I0513 12:59:30.510354 2704 reconciler_common.go:288] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f5a9517-847c-4101-8b62-b95d8d1852c2-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" May 13 12:59:30.510360 kubelet[2704]: I0513 12:59:30.510362 2704 reconciler_common.go:288] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6f5a9517-847c-4101-8b62-b95d8d1852c2-policysync\") on node \"localhost\" DevicePath \"\"" May 13 12:59:30.550717 systemd[1]: var-lib-kubelet-pods-6f5a9517\x2d847c\x2d4101\x2d8b62\x2db95d8d1852c2-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. May 13 12:59:30.567566 containerd[1594]: time="2025-05-13T12:59:30.567498496Z" level=info msg="RemoveContainer for \"197c96edc18114dfb076b30b1959104b85d6f72a760061c596cf7342ac842c1f\" returns successfully" May 13 12:59:30.568034 kubelet[2704]: I0513 12:59:30.567825 2704 scope.go:117] "RemoveContainer" containerID="6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5" May 13 12:59:30.570236 containerd[1594]: time="2025-05-13T12:59:30.570203223Z" level=info msg="RemoveContainer for \"6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5\"" May 13 12:59:30.575657 containerd[1594]: time="2025-05-13T12:59:30.575539366Z" level=info msg="RemoveContainer for \"6fb5120910f2779f021ff356279c3d2cd48b7db390a43a0625dafc0c295b5bb5\" returns successfully" May 13 12:59:30.578869 kubelet[2704]: E0513 12:59:30.578803 2704 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" containerName="install-cni" May 13 12:59:30.578869 kubelet[2704]: E0513 12:59:30.578831 2704 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" containerName="calico-node" May 13 12:59:30.578869 kubelet[2704]: E0513 12:59:30.578840 2704 cpu_manager.go:395] 
"RemoveStaleState: removing container" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" containerName="flexvol-driver" May 13 12:59:30.578869 kubelet[2704]: E0513 12:59:30.578847 2704 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" containerName="calico-node" May 13 12:59:30.578869 kubelet[2704]: E0513 12:59:30.578854 2704 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" containerName="calico-node" May 13 12:59:30.579101 kubelet[2704]: I0513 12:59:30.578887 2704 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" containerName="calico-node" May 13 12:59:30.579101 kubelet[2704]: I0513 12:59:30.578895 2704 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" containerName="calico-node" May 13 12:59:30.579101 kubelet[2704]: I0513 12:59:30.578902 2704 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" containerName="calico-node" May 13 12:59:30.587288 systemd[1]: Created slice kubepods-besteffort-podfacf8cdb_14b3_4ac7_8915_a783a46e9f79.slice - libcontainer container kubepods-besteffort-podfacf8cdb_14b3_4ac7_8915_a783a46e9f79.slice. May 13 12:59:30.617983 systemd[1]: Removed slice kubepods-besteffort-pod6f5a9517_847c_4101_8b62_b95d8d1852c2.slice - libcontainer container kubepods-besteffort-pod6f5a9517_847c_4101_8b62_b95d8d1852c2.slice. May 13 12:59:30.618485 systemd[1]: kubepods-besteffort-pod6f5a9517_847c_4101_8b62_b95d8d1852c2.slice: Consumed 823ms CPU time, 164.1M memory peak, 8K read from disk, 160.4M written to disk. 
May 13 12:59:30.712215 kubelet[2704]: I0513 12:59:30.711628 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/facf8cdb-14b3-4ac7-8915-a783a46e9f79-lib-modules\") pod \"calico-node-52thz\" (UID: \"facf8cdb-14b3-4ac7-8915-a783a46e9f79\") " pod="calico-system/calico-node-52thz" May 13 12:59:30.712215 kubelet[2704]: I0513 12:59:30.711669 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/facf8cdb-14b3-4ac7-8915-a783a46e9f79-var-run-calico\") pod \"calico-node-52thz\" (UID: \"facf8cdb-14b3-4ac7-8915-a783a46e9f79\") " pod="calico-system/calico-node-52thz" May 13 12:59:30.712215 kubelet[2704]: I0513 12:59:30.711688 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/facf8cdb-14b3-4ac7-8915-a783a46e9f79-xtables-lock\") pod \"calico-node-52thz\" (UID: \"facf8cdb-14b3-4ac7-8915-a783a46e9f79\") " pod="calico-system/calico-node-52thz" May 13 12:59:30.712215 kubelet[2704]: I0513 12:59:30.711704 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/facf8cdb-14b3-4ac7-8915-a783a46e9f79-flexvol-driver-host\") pod \"calico-node-52thz\" (UID: \"facf8cdb-14b3-4ac7-8915-a783a46e9f79\") " pod="calico-system/calico-node-52thz" May 13 12:59:30.712215 kubelet[2704]: I0513 12:59:30.711737 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/facf8cdb-14b3-4ac7-8915-a783a46e9f79-cni-net-dir\") pod \"calico-node-52thz\" (UID: \"facf8cdb-14b3-4ac7-8915-a783a46e9f79\") " pod="calico-system/calico-node-52thz" May 13 12:59:30.712707 kubelet[2704]: I0513 12:59:30.711765 2704 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/facf8cdb-14b3-4ac7-8915-a783a46e9f79-node-certs\") pod \"calico-node-52thz\" (UID: \"facf8cdb-14b3-4ac7-8915-a783a46e9f79\") " pod="calico-system/calico-node-52thz" May 13 12:59:30.712707 kubelet[2704]: I0513 12:59:30.711778 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/facf8cdb-14b3-4ac7-8915-a783a46e9f79-var-lib-calico\") pod \"calico-node-52thz\" (UID: \"facf8cdb-14b3-4ac7-8915-a783a46e9f79\") " pod="calico-system/calico-node-52thz" May 13 12:59:30.712707 kubelet[2704]: I0513 12:59:30.711793 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/facf8cdb-14b3-4ac7-8915-a783a46e9f79-policysync\") pod \"calico-node-52thz\" (UID: \"facf8cdb-14b3-4ac7-8915-a783a46e9f79\") " pod="calico-system/calico-node-52thz" May 13 12:59:30.712707 kubelet[2704]: I0513 12:59:30.711805 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/facf8cdb-14b3-4ac7-8915-a783a46e9f79-cni-log-dir\") pod \"calico-node-52thz\" (UID: \"facf8cdb-14b3-4ac7-8915-a783a46e9f79\") " pod="calico-system/calico-node-52thz" May 13 12:59:30.712707 kubelet[2704]: I0513 12:59:30.711833 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsv4w\" (UniqueName: \"kubernetes.io/projected/facf8cdb-14b3-4ac7-8915-a783a46e9f79-kube-api-access-lsv4w\") pod \"calico-node-52thz\" (UID: \"facf8cdb-14b3-4ac7-8915-a783a46e9f79\") " pod="calico-system/calico-node-52thz" May 13 12:59:30.712814 kubelet[2704]: I0513 12:59:30.711848 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/facf8cdb-14b3-4ac7-8915-a783a46e9f79-cni-bin-dir\") pod \"calico-node-52thz\" (UID: \"facf8cdb-14b3-4ac7-8915-a783a46e9f79\") " pod="calico-system/calico-node-52thz" May 13 12:59:30.712814 kubelet[2704]: I0513 12:59:30.711864 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/facf8cdb-14b3-4ac7-8915-a783a46e9f79-tigera-ca-bundle\") pod \"calico-node-52thz\" (UID: \"facf8cdb-14b3-4ac7-8915-a783a46e9f79\") " pod="calico-system/calico-node-52thz" May 13 12:59:30.892094 containerd[1594]: time="2025-05-13T12:59:30.892057435Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-52thz,Uid:facf8cdb-14b3-4ac7-8915-a783a46e9f79,Namespace:calico-system,Attempt:0,}" May 13 12:59:31.004331 containerd[1594]: time="2025-05-13T12:59:31.004209653Z" level=info msg="connecting to shim 043a979e04aa439cf134b92b394db623853a14ea78e783664305ec693cd35786" address="unix:///run/containerd/s/3b0e41f64997389cb9855275c6765965eb11f634747ff26e11641821f6e0c986" namespace=k8s.io protocol=ttrpc version=3 May 13 12:59:31.034394 systemd[1]: Started cri-containerd-043a979e04aa439cf134b92b394db623853a14ea78e783664305ec693cd35786.scope - libcontainer container 043a979e04aa439cf134b92b394db623853a14ea78e783664305ec693cd35786. 
May 13 12:59:31.060879 containerd[1594]: time="2025-05-13T12:59:31.060832553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-52thz,Uid:facf8cdb-14b3-4ac7-8915-a783a46e9f79,Namespace:calico-system,Attempt:0,} returns sandbox id \"043a979e04aa439cf134b92b394db623853a14ea78e783664305ec693cd35786\"" May 13 12:59:31.063603 containerd[1594]: time="2025-05-13T12:59:31.063574618Z" level=info msg="CreateContainer within sandbox \"043a979e04aa439cf134b92b394db623853a14ea78e783664305ec693cd35786\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 13 12:59:31.071161 containerd[1594]: time="2025-05-13T12:59:31.071132278Z" level=info msg="Container 911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73: CDI devices from CRI Config.CDIDevices: []" May 13 12:59:31.080427 containerd[1594]: time="2025-05-13T12:59:31.080389679Z" level=info msg="CreateContainer within sandbox \"043a979e04aa439cf134b92b394db623853a14ea78e783664305ec693cd35786\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73\"" May 13 12:59:31.081527 containerd[1594]: time="2025-05-13T12:59:31.081507480Z" level=info msg="StartContainer for \"911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73\"" May 13 12:59:31.083182 containerd[1594]: time="2025-05-13T12:59:31.083145181Z" level=info msg="connecting to shim 911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73" address="unix:///run/containerd/s/3b0e41f64997389cb9855275c6765965eb11f634747ff26e11641821f6e0c986" protocol=ttrpc version=3 May 13 12:59:31.103382 systemd[1]: Started cri-containerd-911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73.scope - libcontainer container 911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73. 
May 13 12:59:31.148782 containerd[1594]: time="2025-05-13T12:59:31.148745241Z" level=info msg="StartContainer for \"911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73\" returns successfully" May 13 12:59:31.168736 systemd[1]: cri-containerd-911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73.scope: Deactivated successfully. May 13 12:59:31.169181 systemd[1]: cri-containerd-911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73.scope: Consumed 40ms CPU time, 16.2M memory peak, 7.9M read from disk, 6.3M written to disk. May 13 12:59:31.170479 containerd[1594]: time="2025-05-13T12:59:31.170436363Z" level=info msg="received exit event container_id:\"911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73\" id:\"911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73\" pid:5022 exited_at:{seconds:1747141171 nanos:170155904}" May 13 12:59:31.170817 containerd[1594]: time="2025-05-13T12:59:31.170772970Z" level=info msg="TaskExit event in podsandbox handler container_id:\"911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73\" id:\"911f717084e403c57f27cd10c240007a575a54d86b2fb51787afac771010cc73\" pid:5022 exited_at:{seconds:1747141171 nanos:170155904}" May 13 12:59:31.307946 containerd[1594]: time="2025-05-13T12:59:31.307797297Z" level=info msg="CreateContainer within sandbox \"043a979e04aa439cf134b92b394db623853a14ea78e783664305ec693cd35786\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 13 12:59:31.314688 containerd[1594]: time="2025-05-13T12:59:31.314652266Z" level=info msg="Container 6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b: CDI devices from CRI Config.CDIDevices: []" May 13 12:59:31.324284 containerd[1594]: time="2025-05-13T12:59:31.324219392Z" level=info msg="CreateContainer within sandbox \"043a979e04aa439cf134b92b394db623853a14ea78e783664305ec693cd35786\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id 
\"6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b\"" May 13 12:59:31.324885 containerd[1594]: time="2025-05-13T12:59:31.324849525Z" level=info msg="StartContainer for \"6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b\"" May 13 12:59:31.326594 containerd[1594]: time="2025-05-13T12:59:31.326548904Z" level=info msg="connecting to shim 6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b" address="unix:///run/containerd/s/3b0e41f64997389cb9855275c6765965eb11f634747ff26e11641821f6e0c986" protocol=ttrpc version=3 May 13 12:59:31.352530 systemd[1]: Started cri-containerd-6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b.scope - libcontainer container 6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b. May 13 12:59:31.404759 containerd[1594]: time="2025-05-13T12:59:31.404712666Z" level=info msg="StartContainer for \"6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b\" returns successfully" May 13 12:59:31.547069 kubelet[2704]: I0513 12:59:31.547014 2704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f5a9517-847c-4101-8b62-b95d8d1852c2" path="/var/lib/kubelet/pods/6f5a9517-847c-4101-8b62-b95d8d1852c2/volumes" May 13 12:59:31.764754 containerd[1594]: time="2025-05-13T12:59:31.764702925Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: failed to load CNI config list file /etc/cni/net.d/10-calico.conflist: error parsing configuration list: unexpected end of JSON input: invalid cni config: failed to load cni config" May 13 12:59:31.767298 systemd[1]: cri-containerd-6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b.scope: Deactivated successfully. May 13 12:59:31.767651 systemd[1]: cri-containerd-6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b.scope: Consumed 676ms CPU time, 115.4M memory peak, 99.2M read from disk. 
May 13 12:59:31.768436 containerd[1594]: time="2025-05-13T12:59:31.768412071Z" level=info msg="received exit event container_id:\"6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b\" id:\"6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b\" pid:5075 exited_at:{seconds:1747141171 nanos:768175075}" May 13 12:59:31.768480 containerd[1594]: time="2025-05-13T12:59:31.768460584Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b\" id:\"6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b\" pid:5075 exited_at:{seconds:1747141171 nanos:768175075}" May 13 12:59:31.789599 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6df4a6d516138e8edc431fcde0b1b11e773863c37bd22498e269be0e68b2a56b-rootfs.mount: Deactivated successfully. May 13 12:59:32.319746 containerd[1594]: time="2025-05-13T12:59:32.318753486Z" level=info msg="CreateContainer within sandbox \"043a979e04aa439cf134b92b394db623853a14ea78e783664305ec693cd35786\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 13 12:59:32.330216 containerd[1594]: time="2025-05-13T12:59:32.330173864Z" level=info msg="Container fc083029df11f7ad84b28fc4c5f2751db22e7691f20c8534b4c21033ab5fff8a: CDI devices from CRI Config.CDIDevices: []" May 13 12:59:32.339482 containerd[1594]: time="2025-05-13T12:59:32.339440701Z" level=info msg="CreateContainer within sandbox \"043a979e04aa439cf134b92b394db623853a14ea78e783664305ec693cd35786\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"fc083029df11f7ad84b28fc4c5f2751db22e7691f20c8534b4c21033ab5fff8a\"" May 13 12:59:32.340058 containerd[1594]: time="2025-05-13T12:59:32.340022199Z" level=info msg="StartContainer for \"fc083029df11f7ad84b28fc4c5f2751db22e7691f20c8534b4c21033ab5fff8a\"" May 13 12:59:32.341939 containerd[1594]: time="2025-05-13T12:59:32.341903787Z" level=info msg="connecting to shim 
fc083029df11f7ad84b28fc4c5f2751db22e7691f20c8534b4c21033ab5fff8a" address="unix:///run/containerd/s/3b0e41f64997389cb9855275c6765965eb11f634747ff26e11641821f6e0c986" protocol=ttrpc version=3 May 13 12:59:32.368422 systemd[1]: Started cri-containerd-fc083029df11f7ad84b28fc4c5f2751db22e7691f20c8534b4c21033ab5fff8a.scope - libcontainer container fc083029df11f7ad84b28fc4c5f2751db22e7691f20c8534b4c21033ab5fff8a. May 13 12:59:32.420161 containerd[1594]: time="2025-05-13T12:59:32.420108057Z" level=info msg="StartContainer for \"fc083029df11f7ad84b28fc4c5f2751db22e7691f20c8534b4c21033ab5fff8a\" returns successfully" May 13 12:59:32.794837 systemd[1]: Started sshd@18-10.0.0.121:22-10.0.0.1:60074.service - OpenSSH per-connection server daemon (10.0.0.1:60074). May 13 12:59:32.842386 sshd[5164]: Accepted publickey for core from 10.0.0.1 port 60074 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:59:32.844096 sshd-session[5164]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:59:32.848649 systemd-logind[1578]: New session 19 of user core. May 13 12:59:32.858384 systemd[1]: Started session-19.scope - Session 19 of User core. May 13 12:59:32.976053 sshd[5166]: Connection closed by 10.0.0.1 port 60074 May 13 12:59:32.976381 sshd-session[5164]: pam_unix(sshd:session): session closed for user core May 13 12:59:32.980813 systemd[1]: sshd@18-10.0.0.121:22-10.0.0.1:60074.service: Deactivated successfully. May 13 12:59:32.982635 systemd[1]: session-19.scope: Deactivated successfully. May 13 12:59:32.983431 systemd-logind[1578]: Session 19 logged out. Waiting for processes to exit. May 13 12:59:32.984631 systemd-logind[1578]: Removed session 19. 
May 13 12:59:33.331631 kubelet[2704]: I0513 12:59:33.330699 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-52thz" podStartSLOduration=3.330679475 podStartE2EDuration="3.330679475s" podCreationTimestamp="2025-05-13 12:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:59:33.329822639 +0000 UTC m=+79.860450636" watchObservedRunningTime="2025-05-13 12:59:33.330679475 +0000 UTC m=+79.861307462" May 13 12:59:33.380129 containerd[1594]: time="2025-05-13T12:59:33.380081176Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc083029df11f7ad84b28fc4c5f2751db22e7691f20c8534b4c21033ab5fff8a\" id:\"f5da2881e37440ee5ab58b393293d2c66ab5f4717e6cccc0b05518834f2e7bec\" pid:5191 exit_status:1 exited_at:{seconds:1747141173 nanos:379681598}" May 13 12:59:34.177706 systemd-networkd[1494]: vxlan.calico: Link UP May 13 12:59:34.177718 systemd-networkd[1494]: vxlan.calico: Gained carrier May 13 12:59:34.389648 containerd[1594]: time="2025-05-13T12:59:34.389599736Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc083029df11f7ad84b28fc4c5f2751db22e7691f20c8534b4c21033ab5fff8a\" id:\"53f729967dbd016be3e89c1eafe1d41ccc20854aca9cbb015c0e4050886c0bec\" pid:5386 exit_status:1 exited_at:{seconds:1747141174 nanos:389121588}" May 13 12:59:34.545461 containerd[1594]: time="2025-05-13T12:59:34.545412210Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-q7cjm,Uid:7dd68750-995a-4166-a528-aef7a2785014,Namespace:calico-system,Attempt:0,}" May 13 12:59:34.675669 systemd-networkd[1494]: cali66b03e5fc3d: Link UP May 13 12:59:34.676402 systemd-networkd[1494]: cali66b03e5fc3d: Gained carrier May 13 12:59:34.689298 containerd[1594]: 2025-05-13 12:59:34.579 [INFO][5428] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{localhost-k8s-csi--node--driver--q7cjm-eth0 csi-node-driver- calico-system 7dd68750-995a-4166-a528-aef7a2785014 588 0 2025-05-13 12:58:29 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5bcd8f69 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-q7cjm eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali66b03e5fc3d [] []}} ContainerID="3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" Namespace="calico-system" Pod="csi-node-driver-q7cjm" WorkloadEndpoint="localhost-k8s-csi--node--driver--q7cjm-" May 13 12:59:34.689298 containerd[1594]: 2025-05-13 12:59:34.579 [INFO][5428] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" Namespace="calico-system" Pod="csi-node-driver-q7cjm" WorkloadEndpoint="localhost-k8s-csi--node--driver--q7cjm-eth0" May 13 12:59:34.689298 containerd[1594]: 2025-05-13 12:59:34.638 [INFO][5443] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" HandleID="k8s-pod-network.3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" Workload="localhost-k8s-csi--node--driver--q7cjm-eth0" May 13 12:59:34.689478 containerd[1594]: 2025-05-13 12:59:34.648 [INFO][5443] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" HandleID="k8s-pod-network.3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" Workload="localhost-k8s-csi--node--driver--q7cjm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df3c0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", 
"pod":"csi-node-driver-q7cjm", "timestamp":"2025-05-13 12:59:34.638825122 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 12:59:34.689478 containerd[1594]: 2025-05-13 12:59:34.648 [INFO][5443] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:59:34.689478 containerd[1594]: 2025-05-13 12:59:34.648 [INFO][5443] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 12:59:34.689478 containerd[1594]: 2025-05-13 12:59:34.648 [INFO][5443] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 12:59:34.689478 containerd[1594]: 2025-05-13 12:59:34.650 [INFO][5443] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" host="localhost" May 13 12:59:34.689478 containerd[1594]: 2025-05-13 12:59:34.653 [INFO][5443] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 12:59:34.689478 containerd[1594]: 2025-05-13 12:59:34.657 [INFO][5443] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 12:59:34.689478 containerd[1594]: 2025-05-13 12:59:34.659 [INFO][5443] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 12:59:34.689478 containerd[1594]: 2025-05-13 12:59:34.660 [INFO][5443] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 12:59:34.689478 containerd[1594]: 2025-05-13 12:59:34.660 [INFO][5443] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" host="localhost" May 13 12:59:34.689729 containerd[1594]: 2025-05-13 12:59:34.662 [INFO][5443] ipam/ipam.go 1685: 
Creating new handle: k8s-pod-network.3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99 May 13 12:59:34.689729 containerd[1594]: 2025-05-13 12:59:34.665 [INFO][5443] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" host="localhost" May 13 12:59:34.689729 containerd[1594]: 2025-05-13 12:59:34.669 [INFO][5443] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" host="localhost" May 13 12:59:34.689729 containerd[1594]: 2025-05-13 12:59:34.669 [INFO][5443] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" host="localhost" May 13 12:59:34.689729 containerd[1594]: 2025-05-13 12:59:34.669 [INFO][5443] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 12:59:34.689729 containerd[1594]: 2025-05-13 12:59:34.669 [INFO][5443] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" HandleID="k8s-pod-network.3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" Workload="localhost-k8s-csi--node--driver--q7cjm-eth0" May 13 12:59:34.689844 containerd[1594]: 2025-05-13 12:59:34.672 [INFO][5428] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" Namespace="calico-system" Pod="csi-node-driver-q7cjm" WorkloadEndpoint="localhost-k8s-csi--node--driver--q7cjm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--q7cjm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7dd68750-995a-4166-a528-aef7a2785014", ResourceVersion:"588", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-q7cjm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"cali66b03e5fc3d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:59:34.689844 containerd[1594]: 2025-05-13 12:59:34.673 [INFO][5428] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" Namespace="calico-system" Pod="csi-node-driver-q7cjm" WorkloadEndpoint="localhost-k8s-csi--node--driver--q7cjm-eth0" May 13 12:59:34.689919 containerd[1594]: 2025-05-13 12:59:34.673 [INFO][5428] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali66b03e5fc3d ContainerID="3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" Namespace="calico-system" Pod="csi-node-driver-q7cjm" WorkloadEndpoint="localhost-k8s-csi--node--driver--q7cjm-eth0" May 13 12:59:34.689919 containerd[1594]: 2025-05-13 12:59:34.675 [INFO][5428] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" Namespace="calico-system" Pod="csi-node-driver-q7cjm" WorkloadEndpoint="localhost-k8s-csi--node--driver--q7cjm-eth0" May 13 12:59:34.689959 containerd[1594]: 2025-05-13 12:59:34.676 [INFO][5428] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" Namespace="calico-system" Pod="csi-node-driver-q7cjm" WorkloadEndpoint="localhost-k8s-csi--node--driver--q7cjm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--q7cjm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7dd68750-995a-4166-a528-aef7a2785014", ResourceVersion:"588", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 29, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99", Pod:"csi-node-driver-q7cjm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali66b03e5fc3d", MAC:"ee:6a:65:e7:ea:a3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:59:34.690010 containerd[1594]: 2025-05-13 12:59:34.685 [INFO][5428] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" Namespace="calico-system" Pod="csi-node-driver-q7cjm" WorkloadEndpoint="localhost-k8s-csi--node--driver--q7cjm-eth0" May 13 12:59:34.719067 containerd[1594]: time="2025-05-13T12:59:34.718985357Z" level=info msg="connecting to shim 3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99" address="unix:///run/containerd/s/571c343fffa05afb7782afa0fdf86c1e695cde65376ae3c1513580585a187290" namespace=k8s.io protocol=ttrpc version=3 May 13 12:59:34.758433 systemd[1]: Started cri-containerd-3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99.scope - libcontainer container 3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99. 
May 13 12:59:34.769928 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 12:59:34.784850 containerd[1594]: time="2025-05-13T12:59:34.784811989Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-q7cjm,Uid:7dd68750-995a-4166-a528-aef7a2785014,Namespace:calico-system,Attempt:0,} returns sandbox id \"3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99\"" May 13 12:59:34.786538 containerd[1594]: time="2025-05-13T12:59:34.786331146Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 13 12:59:35.506468 systemd-networkd[1494]: vxlan.calico: Gained IPv6LL May 13 12:59:36.402467 systemd-networkd[1494]: cali66b03e5fc3d: Gained IPv6LL May 13 12:59:36.637982 containerd[1594]: time="2025-05-13T12:59:36.637926606Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:59:36.638784 containerd[1594]: time="2025-05-13T12:59:36.638749234Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 13 12:59:36.639839 containerd[1594]: time="2025-05-13T12:59:36.639809156Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:59:36.641903 containerd[1594]: time="2025-05-13T12:59:36.641846284Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:59:36.642364 containerd[1594]: time="2025-05-13T12:59:36.642320573Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", 
repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 1.855961824s" May 13 12:59:36.642364 containerd[1594]: time="2025-05-13T12:59:36.642360740Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 13 12:59:36.644014 containerd[1594]: time="2025-05-13T12:59:36.643984554Z" level=info msg="CreateContainer within sandbox \"3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 13 12:59:36.671101 containerd[1594]: time="2025-05-13T12:59:36.671008314Z" level=info msg="Container 2ebb0394e996a31e500a9af2ba9e8ea126d3b25dfe78eb2c16f74e2f2d1125b3: CDI devices from CRI Config.CDIDevices: []" May 13 12:59:36.763635 containerd[1594]: time="2025-05-13T12:59:36.763599206Z" level=info msg="CreateContainer within sandbox \"3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"2ebb0394e996a31e500a9af2ba9e8ea126d3b25dfe78eb2c16f74e2f2d1125b3\"" May 13 12:59:36.764149 containerd[1594]: time="2025-05-13T12:59:36.764104576Z" level=info msg="StartContainer for \"2ebb0394e996a31e500a9af2ba9e8ea126d3b25dfe78eb2c16f74e2f2d1125b3\"" May 13 12:59:36.765739 containerd[1594]: time="2025-05-13T12:59:36.765685368Z" level=info msg="connecting to shim 2ebb0394e996a31e500a9af2ba9e8ea126d3b25dfe78eb2c16f74e2f2d1125b3" address="unix:///run/containerd/s/571c343fffa05afb7782afa0fdf86c1e695cde65376ae3c1513580585a187290" protocol=ttrpc version=3 May 13 12:59:36.790387 systemd[1]: Started cri-containerd-2ebb0394e996a31e500a9af2ba9e8ea126d3b25dfe78eb2c16f74e2f2d1125b3.scope - libcontainer container 2ebb0394e996a31e500a9af2ba9e8ea126d3b25dfe78eb2c16f74e2f2d1125b3. 
May 13 12:59:36.924582 containerd[1594]: time="2025-05-13T12:59:36.924456408Z" level=info msg="StartContainer for \"2ebb0394e996a31e500a9af2ba9e8ea126d3b25dfe78eb2c16f74e2f2d1125b3\" returns successfully" May 13 12:59:36.925832 containerd[1594]: time="2025-05-13T12:59:36.925791819Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 13 12:59:37.990371 systemd[1]: Started sshd@19-10.0.0.121:22-10.0.0.1:42906.service - OpenSSH per-connection server daemon (10.0.0.1:42906). May 13 12:59:38.047969 sshd[5552]: Accepted publickey for core from 10.0.0.1 port 42906 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:59:38.049556 sshd-session[5552]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:59:38.053885 systemd-logind[1578]: New session 20 of user core. May 13 12:59:38.061400 systemd[1]: Started session-20.scope - Session 20 of User core. May 13 12:59:38.177885 sshd[5554]: Connection closed by 10.0.0.1 port 42906 May 13 12:59:38.178217 sshd-session[5552]: pam_unix(sshd:session): session closed for user core May 13 12:59:38.182930 systemd[1]: sshd@19-10.0.0.121:22-10.0.0.1:42906.service: Deactivated successfully. May 13 12:59:38.184861 systemd[1]: session-20.scope: Deactivated successfully. May 13 12:59:38.185826 systemd-logind[1578]: Session 20 logged out. Waiting for processes to exit. May 13 12:59:38.186984 systemd-logind[1578]: Removed session 20. 
May 13 12:59:38.565058 containerd[1594]: time="2025-05-13T12:59:38.564987490Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:59:38.565822 containerd[1594]: time="2025-05-13T12:59:38.565760411Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 13 12:59:38.567000 containerd[1594]: time="2025-05-13T12:59:38.566959638Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:59:38.568888 containerd[1594]: time="2025-05-13T12:59:38.568846222Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:59:38.569355 containerd[1594]: time="2025-05-13T12:59:38.569314359Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 1.643487202s" May 13 12:59:38.569401 containerd[1594]: time="2025-05-13T12:59:38.569353814Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 13 12:59:38.571332 containerd[1594]: time="2025-05-13T12:59:38.571306475Z" level=info msg="CreateContainer within sandbox \"3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99\" for container 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 13 12:59:38.580842 containerd[1594]: time="2025-05-13T12:59:38.580363827Z" level=info msg="Container 906ba3e7077406cedbf9512d888fdc3a9f8ea797b93dd0ace4e2d22f91d041b7: CDI devices from CRI Config.CDIDevices: []" May 13 12:59:38.590187 containerd[1594]: time="2025-05-13T12:59:38.590148603Z" level=info msg="CreateContainer within sandbox \"3a25233487637cbd29d57aba7d48e4a4d6bcb403ec1c27e38b7bc3853c2bcd99\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"906ba3e7077406cedbf9512d888fdc3a9f8ea797b93dd0ace4e2d22f91d041b7\"" May 13 12:59:38.590572 containerd[1594]: time="2025-05-13T12:59:38.590548228Z" level=info msg="StartContainer for \"906ba3e7077406cedbf9512d888fdc3a9f8ea797b93dd0ace4e2d22f91d041b7\"" May 13 12:59:38.591850 containerd[1594]: time="2025-05-13T12:59:38.591818171Z" level=info msg="connecting to shim 906ba3e7077406cedbf9512d888fdc3a9f8ea797b93dd0ace4e2d22f91d041b7" address="unix:///run/containerd/s/571c343fffa05afb7782afa0fdf86c1e695cde65376ae3c1513580585a187290" protocol=ttrpc version=3 May 13 12:59:38.613384 systemd[1]: Started cri-containerd-906ba3e7077406cedbf9512d888fdc3a9f8ea797b93dd0ace4e2d22f91d041b7.scope - libcontainer container 906ba3e7077406cedbf9512d888fdc3a9f8ea797b93dd0ace4e2d22f91d041b7. 
May 13 12:59:38.652343 containerd[1594]: time="2025-05-13T12:59:38.652311052Z" level=info msg="StartContainer for \"906ba3e7077406cedbf9512d888fdc3a9f8ea797b93dd0ace4e2d22f91d041b7\" returns successfully" May 13 12:59:39.544809 containerd[1594]: time="2025-05-13T12:59:39.544710481Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-647bb7c459-594xm,Uid:c5c7ee4c-244c-48f3-844f-9864efd99cd6,Namespace:calico-system,Attempt:0,}" May 13 12:59:39.545010 containerd[1594]: time="2025-05-13T12:59:39.544979717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-2r646,Uid:509974ac-8c85-45a5-a90c-6e89c9a54779,Namespace:calico-apiserver,Attempt:0,}" May 13 12:59:39.545153 containerd[1594]: time="2025-05-13T12:59:39.545063708Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-c2wmg,Uid:a9d392a7-21ed-46f8-a204-24615b7b6d08,Namespace:calico-apiserver,Attempt:0,}" May 13 12:59:39.650117 kubelet[2704]: I0513 12:59:39.649910 2704 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 13 12:59:39.650117 kubelet[2704]: I0513 12:59:39.649947 2704 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 13 12:59:39.664238 systemd-networkd[1494]: cali838914800de: Link UP May 13 12:59:39.665119 systemd-networkd[1494]: cali838914800de: Gained carrier May 13 12:59:39.682903 kubelet[2704]: I0513 12:59:39.682814 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-q7cjm" podStartSLOduration=66.898865331 podStartE2EDuration="1m10.682795099s" podCreationTimestamp="2025-05-13 12:58:29 +0000 UTC" firstStartedPulling="2025-05-13 12:59:34.78606758 +0000 UTC m=+81.316695577" lastFinishedPulling="2025-05-13 12:59:38.569997347 +0000 
UTC m=+85.100625345" observedRunningTime="2025-05-13 12:59:39.347199989 +0000 UTC m=+85.877827986" watchObservedRunningTime="2025-05-13 12:59:39.682795099 +0000 UTC m=+86.213423086" May 13 12:59:39.683692 containerd[1594]: 2025-05-13 12:59:39.594 [INFO][5630] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0 calico-apiserver-5d4cb57867- calico-apiserver a9d392a7-21ed-46f8-a204-24615b7b6d08 751 0 2025-05-13 12:58:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5d4cb57867 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5d4cb57867-c2wmg eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali838914800de [] []}} ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-c2wmg" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-" May 13 12:59:39.683692 containerd[1594]: 2025-05-13 12:59:39.595 [INFO][5630] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-c2wmg" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" May 13 12:59:39.683692 containerd[1594]: 2025-05-13 12:59:39.626 [INFO][5658] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" HandleID="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Workload="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" May 13 12:59:39.684095 containerd[1594]: 2025-05-13 12:59:39.633 
[INFO][5658] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" HandleID="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Workload="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000485340), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5d4cb57867-c2wmg", "timestamp":"2025-05-13 12:59:39.626563439 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 12:59:39.684095 containerd[1594]: 2025-05-13 12:59:39.633 [INFO][5658] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:59:39.684095 containerd[1594]: 2025-05-13 12:59:39.634 [INFO][5658] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 12:59:39.684095 containerd[1594]: 2025-05-13 12:59:39.634 [INFO][5658] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 12:59:39.684095 containerd[1594]: 2025-05-13 12:59:39.636 [INFO][5658] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" host="localhost" May 13 12:59:39.684095 containerd[1594]: 2025-05-13 12:59:39.638 [INFO][5658] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 12:59:39.684095 containerd[1594]: 2025-05-13 12:59:39.642 [INFO][5658] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 12:59:39.684095 containerd[1594]: 2025-05-13 12:59:39.643 [INFO][5658] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 12:59:39.684095 containerd[1594]: 2025-05-13 12:59:39.646 [INFO][5658] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 12:59:39.684095 containerd[1594]: 2025-05-13 12:59:39.646 [INFO][5658] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" host="localhost" May 13 12:59:39.684368 containerd[1594]: 2025-05-13 12:59:39.647 [INFO][5658] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33 May 13 12:59:39.684368 containerd[1594]: 2025-05-13 12:59:39.650 [INFO][5658] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" host="localhost" May 13 12:59:39.684368 containerd[1594]: 2025-05-13 12:59:39.655 [INFO][5658] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" host="localhost" May 13 12:59:39.684368 containerd[1594]: 2025-05-13 12:59:39.655 [INFO][5658] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" host="localhost" May 13 12:59:39.684368 containerd[1594]: 2025-05-13 12:59:39.656 [INFO][5658] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 12:59:39.684368 containerd[1594]: 2025-05-13 12:59:39.656 [INFO][5658] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" HandleID="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Workload="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" May 13 12:59:39.684498 containerd[1594]: 2025-05-13 12:59:39.658 [INFO][5630] cni-plugin/k8s.go 386: Populated endpoint ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-c2wmg" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0", GenerateName:"calico-apiserver-5d4cb57867-", Namespace:"calico-apiserver", SelfLink:"", UID:"a9d392a7-21ed-46f8-a204-24615b7b6d08", ResourceVersion:"751", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d4cb57867", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5d4cb57867-c2wmg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali838914800de", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:59:39.684551 containerd[1594]: 2025-05-13 12:59:39.659 [INFO][5630] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-c2wmg" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" May 13 12:59:39.684551 containerd[1594]: 2025-05-13 12:59:39.659 [INFO][5630] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali838914800de ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-c2wmg" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" May 13 12:59:39.684551 containerd[1594]: 2025-05-13 12:59:39.665 [INFO][5630] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-c2wmg" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" May 13 12:59:39.684625 containerd[1594]: 2025-05-13 12:59:39.666 [INFO][5630] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-c2wmg" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0", GenerateName:"calico-apiserver-5d4cb57867-", Namespace:"calico-apiserver", SelfLink:"", UID:"a9d392a7-21ed-46f8-a204-24615b7b6d08", ResourceVersion:"751", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d4cb57867", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33", Pod:"calico-apiserver-5d4cb57867-c2wmg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali838914800de", MAC:"62:6a:ae:27:49:89", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:59:39.684685 containerd[1594]: 2025-05-13 12:59:39.680 [INFO][5630] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" 
Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-c2wmg" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" May 13 12:59:39.706875 containerd[1594]: time="2025-05-13T12:59:39.706837995Z" level=info msg="connecting to shim 8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" address="unix:///run/containerd/s/e85801ffcb58e3a7ed7643e587da5303ccdf6b4fbb1078f754fa18aaef93341f" namespace=k8s.io protocol=ttrpc version=3 May 13 12:59:39.739378 systemd[1]: Started cri-containerd-8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33.scope - libcontainer container 8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33. May 13 12:59:39.753753 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 12:59:39.764931 systemd-networkd[1494]: cali5316c1154e1: Link UP May 13 12:59:39.765653 systemd-networkd[1494]: cali5316c1154e1: Gained carrier May 13 12:59:39.778708 containerd[1594]: 2025-05-13 12:59:39.594 [INFO][5610] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0 calico-kube-controllers-647bb7c459- calico-system c5c7ee4c-244c-48f3-844f-9864efd99cd6 742 0 2025-05-13 12:58:30 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:647bb7c459 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-647bb7c459-594xm eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali5316c1154e1 [] []}} ContainerID="90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" Namespace="calico-system" Pod="calico-kube-controllers-647bb7c459-594xm" 
WorkloadEndpoint="localhost-k8s-calico--kube--controllers--647bb7c459--594xm-" May 13 12:59:39.778708 containerd[1594]: 2025-05-13 12:59:39.594 [INFO][5610] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" Namespace="calico-system" Pod="calico-kube-controllers-647bb7c459-594xm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0" May 13 12:59:39.778708 containerd[1594]: 2025-05-13 12:59:39.629 [INFO][5656] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" HandleID="k8s-pod-network.90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" Workload="localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0" May 13 12:59:39.778940 containerd[1594]: 2025-05-13 12:59:39.637 [INFO][5656] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" HandleID="k8s-pod-network.90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" Workload="localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000440170), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-647bb7c459-594xm", "timestamp":"2025-05-13 12:59:39.629477819 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 12:59:39.778940 containerd[1594]: 2025-05-13 12:59:39.637 [INFO][5656] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:59:39.778940 containerd[1594]: 2025-05-13 12:59:39.656 [INFO][5656] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 12:59:39.778940 containerd[1594]: 2025-05-13 12:59:39.656 [INFO][5656] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 12:59:39.778940 containerd[1594]: 2025-05-13 12:59:39.737 [INFO][5656] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" host="localhost" May 13 12:59:39.778940 containerd[1594]: 2025-05-13 12:59:39.740 [INFO][5656] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 12:59:39.778940 containerd[1594]: 2025-05-13 12:59:39.744 [INFO][5656] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 12:59:39.778940 containerd[1594]: 2025-05-13 12:59:39.746 [INFO][5656] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 12:59:39.778940 containerd[1594]: 2025-05-13 12:59:39.748 [INFO][5656] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 12:59:39.778940 containerd[1594]: 2025-05-13 12:59:39.748 [INFO][5656] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" host="localhost" May 13 12:59:39.779212 containerd[1594]: 2025-05-13 12:59:39.749 [INFO][5656] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1 May 13 12:59:39.779212 containerd[1594]: 2025-05-13 12:59:39.752 [INFO][5656] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" host="localhost" May 13 12:59:39.779212 containerd[1594]: 2025-05-13 12:59:39.757 [INFO][5656] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" host="localhost" May 13 12:59:39.779212 containerd[1594]: 2025-05-13 12:59:39.757 [INFO][5656] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" host="localhost" May 13 12:59:39.779212 containerd[1594]: 2025-05-13 12:59:39.757 [INFO][5656] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 12:59:39.779212 containerd[1594]: 2025-05-13 12:59:39.757 [INFO][5656] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" HandleID="k8s-pod-network.90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" Workload="localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0" May 13 12:59:39.779466 containerd[1594]: 2025-05-13 12:59:39.762 [INFO][5610] cni-plugin/k8s.go 386: Populated endpoint ContainerID="90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" Namespace="calico-system" Pod="calico-kube-controllers-647bb7c459-594xm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0", GenerateName:"calico-kube-controllers-647bb7c459-", Namespace:"calico-system", SelfLink:"", UID:"c5c7ee4c-244c-48f3-844f-9864efd99cd6", ResourceVersion:"742", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"647bb7c459", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-647bb7c459-594xm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali5316c1154e1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:59:39.779536 containerd[1594]: 2025-05-13 12:59:39.762 [INFO][5610] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" Namespace="calico-system" Pod="calico-kube-controllers-647bb7c459-594xm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0" May 13 12:59:39.779536 containerd[1594]: 2025-05-13 12:59:39.762 [INFO][5610] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5316c1154e1 ContainerID="90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" Namespace="calico-system" Pod="calico-kube-controllers-647bb7c459-594xm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0" May 13 12:59:39.779536 containerd[1594]: 2025-05-13 12:59:39.765 [INFO][5610] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" Namespace="calico-system" Pod="calico-kube-controllers-647bb7c459-594xm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0" May 13 12:59:39.779687 containerd[1594]: 2025-05-13 12:59:39.765 [INFO][5610] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" Namespace="calico-system" Pod="calico-kube-controllers-647bb7c459-594xm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0", GenerateName:"calico-kube-controllers-647bb7c459-", Namespace:"calico-system", SelfLink:"", UID:"c5c7ee4c-244c-48f3-844f-9864efd99cd6", ResourceVersion:"742", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"647bb7c459", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1", Pod:"calico-kube-controllers-647bb7c459-594xm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali5316c1154e1", MAC:"f2:db:c3:11:30:a2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:59:39.779764 containerd[1594]: 2025-05-13 12:59:39.775 [INFO][5610] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" Namespace="calico-system" Pod="calico-kube-controllers-647bb7c459-594xm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--647bb7c459--594xm-eth0" May 13 12:59:39.790127 containerd[1594]: time="2025-05-13T12:59:39.790079411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-c2wmg,Uid:a9d392a7-21ed-46f8-a204-24615b7b6d08,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\"" May 13 12:59:39.791618 containerd[1594]: time="2025-05-13T12:59:39.791579714Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 12:59:39.804559 containerd[1594]: time="2025-05-13T12:59:39.804442143Z" level=info msg="connecting to shim 90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1" address="unix:///run/containerd/s/9c68245aeddc83d336a98af2842e00c2ccc7513d49fa40d9bf6b05d873c6fc00" namespace=k8s.io protocol=ttrpc version=3 May 13 12:59:39.831406 systemd[1]: Started cri-containerd-90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1.scope - libcontainer container 90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1. 
May 13 12:59:39.845246 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 12:59:39.866463 systemd-networkd[1494]: cali8c8c4ae4fde: Link UP May 13 12:59:39.867451 systemd-networkd[1494]: cali8c8c4ae4fde: Gained carrier May 13 12:59:39.879983 containerd[1594]: 2025-05-13 12:59:39.600 [INFO][5622] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0 calico-apiserver-5d4cb57867- calico-apiserver 509974ac-8c85-45a5-a90c-6e89c9a54779 752 0 2025-05-13 12:58:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5d4cb57867 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5d4cb57867-2r646 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8c8c4ae4fde [] []}} ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-2r646" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--2r646-" May 13 12:59:39.879983 containerd[1594]: 2025-05-13 12:59:39.600 [INFO][5622] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-2r646" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" May 13 12:59:39.879983 containerd[1594]: 2025-05-13 12:59:39.634 [INFO][5671] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" HandleID="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" 
Workload="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" May 13 12:59:39.880173 containerd[1594]: 2025-05-13 12:59:39.641 [INFO][5671] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" HandleID="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Workload="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000606600), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5d4cb57867-2r646", "timestamp":"2025-05-13 12:59:39.634410965 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 12:59:39.880173 containerd[1594]: 2025-05-13 12:59:39.642 [INFO][5671] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:59:39.880173 containerd[1594]: 2025-05-13 12:59:39.757 [INFO][5671] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 12:59:39.880173 containerd[1594]: 2025-05-13 12:59:39.757 [INFO][5671] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 12:59:39.880173 containerd[1594]: 2025-05-13 12:59:39.838 [INFO][5671] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" host="localhost" May 13 12:59:39.880173 containerd[1594]: 2025-05-13 12:59:39.842 [INFO][5671] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 12:59:39.880173 containerd[1594]: 2025-05-13 12:59:39.846 [INFO][5671] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 12:59:39.880173 containerd[1594]: 2025-05-13 12:59:39.847 [INFO][5671] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 12:59:39.880173 containerd[1594]: 2025-05-13 12:59:39.849 [INFO][5671] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 12:59:39.880173 containerd[1594]: 2025-05-13 12:59:39.849 [INFO][5671] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" host="localhost" May 13 12:59:39.880407 containerd[1594]: 2025-05-13 12:59:39.850 [INFO][5671] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5 May 13 12:59:39.880407 containerd[1594]: 2025-05-13 12:59:39.852 [INFO][5671] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" host="localhost" May 13 12:59:39.880407 containerd[1594]: 2025-05-13 12:59:39.857 [INFO][5671] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" host="localhost" May 13 12:59:39.880407 containerd[1594]: 2025-05-13 12:59:39.857 [INFO][5671] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" host="localhost" May 13 12:59:39.880407 containerd[1594]: 2025-05-13 12:59:39.857 [INFO][5671] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 12:59:39.880407 containerd[1594]: 2025-05-13 12:59:39.857 [INFO][5671] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" HandleID="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Workload="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" May 13 12:59:39.880525 containerd[1594]: 2025-05-13 12:59:39.861 [INFO][5622] cni-plugin/k8s.go 386: Populated endpoint ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-2r646" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0", GenerateName:"calico-apiserver-5d4cb57867-", Namespace:"calico-apiserver", SelfLink:"", UID:"509974ac-8c85-45a5-a90c-6e89c9a54779", ResourceVersion:"752", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d4cb57867", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5d4cb57867-2r646", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8c8c4ae4fde", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:59:39.880580 containerd[1594]: 2025-05-13 12:59:39.863 [INFO][5622] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-2r646" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" May 13 12:59:39.880580 containerd[1594]: 2025-05-13 12:59:39.863 [INFO][5622] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8c8c4ae4fde ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-2r646" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" May 13 12:59:39.880580 containerd[1594]: 2025-05-13 12:59:39.867 [INFO][5622] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-2r646" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" May 13 12:59:39.880644 containerd[1594]: 2025-05-13 12:59:39.867 [INFO][5622] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-2r646" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0", GenerateName:"calico-apiserver-5d4cb57867-", Namespace:"calico-apiserver", SelfLink:"", UID:"509974ac-8c85-45a5-a90c-6e89c9a54779", ResourceVersion:"752", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d4cb57867", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5", Pod:"calico-apiserver-5d4cb57867-2r646", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8c8c4ae4fde", MAC:"ea:df:ed:bd:83:6c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:59:39.880691 containerd[1594]: 2025-05-13 12:59:39.875 [INFO][5622] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" 
Namespace="calico-apiserver" Pod="calico-apiserver-5d4cb57867-2r646" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" May 13 12:59:39.885714 containerd[1594]: time="2025-05-13T12:59:39.885664753Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-647bb7c459-594xm,Uid:c5c7ee4c-244c-48f3-844f-9864efd99cd6,Namespace:calico-system,Attempt:0,} returns sandbox id \"90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1\"" May 13 12:59:39.905755 containerd[1594]: time="2025-05-13T12:59:39.905719213Z" level=info msg="connecting to shim 97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" address="unix:///run/containerd/s/bfda057a6975cb06251878d9fef87207c9b38527568bf903d28ee137e747f220" namespace=k8s.io protocol=ttrpc version=3 May 13 12:59:39.940438 systemd[1]: Started cri-containerd-97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5.scope - libcontainer container 97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5. 
May 13 12:59:39.951585 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 12:59:39.982022 containerd[1594]: time="2025-05-13T12:59:39.981975324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d4cb57867-2r646,Uid:509974ac-8c85-45a5-a90c-6e89c9a54779,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\"" May 13 12:59:40.545318 containerd[1594]: time="2025-05-13T12:59:40.545233573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-cn4wk,Uid:9ba775bd-6563-42d6-824b-2dbfad12b002,Namespace:kube-system,Attempt:0,}" May 13 12:59:40.633897 systemd-networkd[1494]: cali38a96af2451: Link UP May 13 12:59:40.634371 systemd-networkd[1494]: cali38a96af2451: Gained carrier May 13 12:59:40.644585 containerd[1594]: 2025-05-13 12:59:40.578 [INFO][5865] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0 coredns-6f6b679f8f- kube-system 9ba775bd-6563-42d6-824b-2dbfad12b002 750 0 2025-05-13 12:58:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-cn4wk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali38a96af2451 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" Namespace="kube-system" Pod="coredns-6f6b679f8f-cn4wk" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--cn4wk-" May 13 12:59:40.644585 containerd[1594]: 2025-05-13 12:59:40.578 [INFO][5865] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" 
Namespace="kube-system" Pod="coredns-6f6b679f8f-cn4wk" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0" May 13 12:59:40.644585 containerd[1594]: 2025-05-13 12:59:40.603 [INFO][5880] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" HandleID="k8s-pod-network.b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" Workload="localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0" May 13 12:59:40.644887 containerd[1594]: 2025-05-13 12:59:40.610 [INFO][5880] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" HandleID="k8s-pod-network.b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" Workload="localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df680), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-cn4wk", "timestamp":"2025-05-13 12:59:40.603892703 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 12:59:40.644887 containerd[1594]: 2025-05-13 12:59:40.610 [INFO][5880] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:59:40.644887 containerd[1594]: 2025-05-13 12:59:40.610 [INFO][5880] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 12:59:40.644887 containerd[1594]: 2025-05-13 12:59:40.610 [INFO][5880] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 12:59:40.644887 containerd[1594]: 2025-05-13 12:59:40.611 [INFO][5880] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" host="localhost" May 13 12:59:40.644887 containerd[1594]: 2025-05-13 12:59:40.614 [INFO][5880] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 12:59:40.644887 containerd[1594]: 2025-05-13 12:59:40.618 [INFO][5880] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 12:59:40.644887 containerd[1594]: 2025-05-13 12:59:40.619 [INFO][5880] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 12:59:40.644887 containerd[1594]: 2025-05-13 12:59:40.621 [INFO][5880] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 12:59:40.644887 containerd[1594]: 2025-05-13 12:59:40.621 [INFO][5880] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" host="localhost" May 13 12:59:40.645196 containerd[1594]: 2025-05-13 12:59:40.622 [INFO][5880] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68 May 13 12:59:40.645196 containerd[1594]: 2025-05-13 12:59:40.624 [INFO][5880] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" host="localhost" May 13 12:59:40.645196 containerd[1594]: 2025-05-13 12:59:40.629 [INFO][5880] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" host="localhost" May 13 12:59:40.645196 containerd[1594]: 2025-05-13 12:59:40.629 [INFO][5880] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" host="localhost" May 13 12:59:40.645196 containerd[1594]: 2025-05-13 12:59:40.629 [INFO][5880] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 12:59:40.645196 containerd[1594]: 2025-05-13 12:59:40.629 [INFO][5880] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" HandleID="k8s-pod-network.b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" Workload="localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0" May 13 12:59:40.645438 containerd[1594]: 2025-05-13 12:59:40.632 [INFO][5865] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" Namespace="kube-system" Pod="coredns-6f6b679f8f-cn4wk" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"9ba775bd-6563-42d6-824b-2dbfad12b002", ResourceVersion:"750", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-cn4wk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali38a96af2451", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:59:40.645525 containerd[1594]: 2025-05-13 12:59:40.632 [INFO][5865] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" Namespace="kube-system" Pod="coredns-6f6b679f8f-cn4wk" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0" May 13 12:59:40.645525 containerd[1594]: 2025-05-13 12:59:40.632 [INFO][5865] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali38a96af2451 ContainerID="b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" Namespace="kube-system" Pod="coredns-6f6b679f8f-cn4wk" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0" May 13 12:59:40.645525 containerd[1594]: 2025-05-13 12:59:40.634 [INFO][5865] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" Namespace="kube-system" Pod="coredns-6f6b679f8f-cn4wk" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0" May 13 
12:59:40.645610 containerd[1594]: 2025-05-13 12:59:40.634 [INFO][5865] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" Namespace="kube-system" Pod="coredns-6f6b679f8f-cn4wk" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"9ba775bd-6563-42d6-824b-2dbfad12b002", ResourceVersion:"750", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68", Pod:"coredns-6f6b679f8f-cn4wk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali38a96af2451", MAC:"7e:03:49:11:6a:dc", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 12:59:40.645610 containerd[1594]: 2025-05-13 12:59:40.641 [INFO][5865] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" Namespace="kube-system" Pod="coredns-6f6b679f8f-cn4wk" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--cn4wk-eth0" May 13 12:59:40.667053 containerd[1594]: time="2025-05-13T12:59:40.667001337Z" level=info msg="connecting to shim b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68" address="unix:///run/containerd/s/1305308eabafff48f33b819bc5a75fd8a87acf7570e699a6394f8ad78d3ef5f1" namespace=k8s.io protocol=ttrpc version=3 May 13 12:59:40.701415 systemd[1]: Started cri-containerd-b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68.scope - libcontainer container b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68. 
May 13 12:59:40.715203 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 12:59:40.744074 containerd[1594]: time="2025-05-13T12:59:40.744030592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-cn4wk,Uid:9ba775bd-6563-42d6-824b-2dbfad12b002,Namespace:kube-system,Attempt:0,} returns sandbox id \"b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68\"" May 13 12:59:40.746009 containerd[1594]: time="2025-05-13T12:59:40.745982638Z" level=info msg="CreateContainer within sandbox \"b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 12:59:40.755632 containerd[1594]: time="2025-05-13T12:59:40.755546872Z" level=info msg="Container b846b1bbc7aa9f3395de68bc5c3b1469b97a513b9c5019d7ae8d1d57b953e5c0: CDI devices from CRI Config.CDIDevices: []" May 13 12:59:40.762002 containerd[1594]: time="2025-05-13T12:59:40.761965054Z" level=info msg="CreateContainer within sandbox \"b3df8259e9b1ef485688e52829774cdbcb7a2412400f39fe0e3fc8cecdcdde68\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b846b1bbc7aa9f3395de68bc5c3b1469b97a513b9c5019d7ae8d1d57b953e5c0\"" May 13 12:59:40.762427 containerd[1594]: time="2025-05-13T12:59:40.762403263Z" level=info msg="StartContainer for \"b846b1bbc7aa9f3395de68bc5c3b1469b97a513b9c5019d7ae8d1d57b953e5c0\"" May 13 12:59:40.763246 containerd[1594]: time="2025-05-13T12:59:40.763219946Z" level=info msg="connecting to shim b846b1bbc7aa9f3395de68bc5c3b1469b97a513b9c5019d7ae8d1d57b953e5c0" address="unix:///run/containerd/s/1305308eabafff48f33b819bc5a75fd8a87acf7570e699a6394f8ad78d3ef5f1" protocol=ttrpc version=3 May 13 12:59:40.785373 systemd[1]: Started cri-containerd-b846b1bbc7aa9f3395de68bc5c3b1469b97a513b9c5019d7ae8d1d57b953e5c0.scope - libcontainer container b846b1bbc7aa9f3395de68bc5c3b1469b97a513b9c5019d7ae8d1d57b953e5c0. 
May 13 12:59:40.813126 containerd[1594]: time="2025-05-13T12:59:40.813019841Z" level=info msg="StartContainer for \"b846b1bbc7aa9f3395de68bc5c3b1469b97a513b9c5019d7ae8d1d57b953e5c0\" returns successfully"
May 13 12:59:40.818419 systemd-networkd[1494]: cali5316c1154e1: Gained IPv6LL
May 13 12:59:40.946429 systemd-networkd[1494]: cali8c8c4ae4fde: Gained IPv6LL
May 13 12:59:41.074438 systemd-networkd[1494]: cali838914800de: Gained IPv6LL
May 13 12:59:41.420853 kubelet[2704]: I0513 12:59:41.420442 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-cn4wk" podStartSLOduration=82.420426447 podStartE2EDuration="1m22.420426447s" podCreationTimestamp="2025-05-13 12:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:59:41.399135341 +0000 UTC m=+87.929763358" watchObservedRunningTime="2025-05-13 12:59:41.420426447 +0000 UTC m=+87.951054434"
May 13 12:59:41.545572 containerd[1594]: time="2025-05-13T12:59:41.545499776Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xs6bj,Uid:e7d6f99e-b52c-4124-8898-cdb6a2a240c9,Namespace:kube-system,Attempt:0,}"
May 13 12:59:41.650459 systemd-networkd[1494]: cali53709d47b8c: Link UP
May 13 12:59:41.651346 systemd-networkd[1494]: cali53709d47b8c: Gained carrier
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.584 [INFO][5992] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0 coredns-6f6b679f8f- kube-system e7d6f99e-b52c-4124-8898-cdb6a2a240c9 748 0 2025-05-13 12:58:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-xs6bj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali53709d47b8c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" Namespace="kube-system" Pod="coredns-6f6b679f8f-xs6bj" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xs6bj-"
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.584 [INFO][5992] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" Namespace="kube-system" Pod="coredns-6f6b679f8f-xs6bj" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0"
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.610 [INFO][6007] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" HandleID="k8s-pod-network.fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" Workload="localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0"
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.621 [INFO][6007] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" HandleID="k8s-pod-network.fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" Workload="localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000393010), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-xs6bj", "timestamp":"2025-05-13 12:59:41.6104376 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.622 [INFO][6007] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.622 [INFO][6007] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.622 [INFO][6007] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.623 [INFO][6007] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" host="localhost"
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.627 [INFO][6007] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.630 [INFO][6007] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.632 [INFO][6007] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.634 [INFO][6007] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.634 [INFO][6007] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" host="localhost"
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.635 [INFO][6007] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.639 [INFO][6007] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" host="localhost"
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.644 [INFO][6007] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" host="localhost"
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.644 [INFO][6007] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" host="localhost"
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.644 [INFO][6007] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 12:59:41.663395 containerd[1594]: 2025-05-13 12:59:41.644 [INFO][6007] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" HandleID="k8s-pod-network.fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" Workload="localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0"
May 13 12:59:41.663929 containerd[1594]: 2025-05-13 12:59:41.647 [INFO][5992] cni-plugin/k8s.go 386: Populated endpoint ContainerID="fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" Namespace="kube-system" Pod="coredns-6f6b679f8f-xs6bj" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"e7d6f99e-b52c-4124-8898-cdb6a2a240c9", ResourceVersion:"748", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-xs6bj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali53709d47b8c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 12:59:41.663929 containerd[1594]: 2025-05-13 12:59:41.647 [INFO][5992] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" Namespace="kube-system" Pod="coredns-6f6b679f8f-xs6bj" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0"
May 13 12:59:41.663929 containerd[1594]: 2025-05-13 12:59:41.647 [INFO][5992] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali53709d47b8c ContainerID="fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" Namespace="kube-system" Pod="coredns-6f6b679f8f-xs6bj" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0"
May 13 12:59:41.663929 containerd[1594]: 2025-05-13 12:59:41.650 [INFO][5992] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" Namespace="kube-system" Pod="coredns-6f6b679f8f-xs6bj" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0"
May 13 12:59:41.663929 containerd[1594]: 2025-05-13 12:59:41.650 [INFO][5992] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" Namespace="kube-system" Pod="coredns-6f6b679f8f-xs6bj" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"e7d6f99e-b52c-4124-8898-cdb6a2a240c9", ResourceVersion:"748", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c", Pod:"coredns-6f6b679f8f-xs6bj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali53709d47b8c", MAC:"7e:69:85:71:7a:75", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 12:59:41.663929 containerd[1594]: 2025-05-13 12:59:41.660 [INFO][5992] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" Namespace="kube-system" Pod="coredns-6f6b679f8f-xs6bj" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xs6bj-eth0"
May 13 12:59:41.688573 containerd[1594]: time="2025-05-13T12:59:41.688463737Z" level=info msg="connecting to shim fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c" address="unix:///run/containerd/s/8b8856eecc779b955373952b7d41ed75a7aed006f28a06ecb8fc33177783134d" namespace=k8s.io protocol=ttrpc version=3
May 13 12:59:41.714368 systemd-networkd[1494]: cali38a96af2451: Gained IPv6LL
May 13 12:59:41.719603 systemd[1]: Started cri-containerd-fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c.scope - libcontainer container fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c.
May 13 12:59:41.733346 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 13 12:59:41.765158 containerd[1594]: time="2025-05-13T12:59:41.765064168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xs6bj,Uid:e7d6f99e-b52c-4124-8898-cdb6a2a240c9,Namespace:kube-system,Attempt:0,} returns sandbox id \"fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c\""
May 13 12:59:41.768228 containerd[1594]: time="2025-05-13T12:59:41.768185799Z" level=info msg="CreateContainer within sandbox \"fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
May 13 12:59:41.778702 containerd[1594]: time="2025-05-13T12:59:41.778661908Z" level=info msg="Container 826d5e27435ffc217dca7e94bc6abab09e98d4f4023ce0eab246000074cbc7f8: CDI devices from CRI Config.CDIDevices: []"
May 13 12:59:41.786626 containerd[1594]: time="2025-05-13T12:59:41.786587188Z" level=info msg="CreateContainer within sandbox \"fb9234ac94b0d96b020e5b65edb7d3a8d044572abe2f0dc0f7d3df8c3cc5c48c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"826d5e27435ffc217dca7e94bc6abab09e98d4f4023ce0eab246000074cbc7f8\""
May 13 12:59:41.787123 containerd[1594]: time="2025-05-13T12:59:41.787060763Z" level=info msg="StartContainer for \"826d5e27435ffc217dca7e94bc6abab09e98d4f4023ce0eab246000074cbc7f8\""
May 13 12:59:41.788112 containerd[1594]: time="2025-05-13T12:59:41.788092377Z" level=info msg="connecting to shim 826d5e27435ffc217dca7e94bc6abab09e98d4f4023ce0eab246000074cbc7f8" address="unix:///run/containerd/s/8b8856eecc779b955373952b7d41ed75a7aed006f28a06ecb8fc33177783134d" protocol=ttrpc version=3
May 13 12:59:41.822870 systemd[1]: Started cri-containerd-826d5e27435ffc217dca7e94bc6abab09e98d4f4023ce0eab246000074cbc7f8.scope - libcontainer container 826d5e27435ffc217dca7e94bc6abab09e98d4f4023ce0eab246000074cbc7f8.
May 13 12:59:41.856209 containerd[1594]: time="2025-05-13T12:59:41.856173432Z" level=info msg="StartContainer for \"826d5e27435ffc217dca7e94bc6abab09e98d4f4023ce0eab246000074cbc7f8\" returns successfully"
May 13 12:59:41.976446 containerd[1594]: time="2025-05-13T12:59:41.976330948Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 12:59:41.977162 containerd[1594]: time="2025-05-13T12:59:41.977109898Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437"
May 13 12:59:41.978294 containerd[1594]: time="2025-05-13T12:59:41.978273694Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 12:59:41.980402 containerd[1594]: time="2025-05-13T12:59:41.980374162Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 12:59:41.980845 containerd[1594]: time="2025-05-13T12:59:41.980794717Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 2.189177189s"
May 13 12:59:41.980845 containerd[1594]: time="2025-05-13T12:59:41.980839502Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\""
May 13 12:59:41.985042 containerd[1594]: time="2025-05-13T12:59:41.985014227Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\""
May 13 12:59:41.986071 containerd[1594]: time="2025-05-13T12:59:41.986041824Z" level=info msg="CreateContainer within sandbox \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 13 12:59:41.994333 containerd[1594]: time="2025-05-13T12:59:41.994294909Z" level=info msg="Container b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12: CDI devices from CRI Config.CDIDevices: []"
May 13 12:59:42.000969 containerd[1594]: time="2025-05-13T12:59:42.000926084Z" level=info msg="CreateContainer within sandbox \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\""
May 13 12:59:42.001468 containerd[1594]: time="2025-05-13T12:59:42.001444235Z" level=info msg="StartContainer for \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\""
May 13 12:59:42.002607 containerd[1594]: time="2025-05-13T12:59:42.002552383Z" level=info msg="connecting to shim b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12" address="unix:///run/containerd/s/e85801ffcb58e3a7ed7643e587da5303ccdf6b4fbb1078f754fa18aaef93341f" protocol=ttrpc version=3
May 13 12:59:42.025402 systemd[1]: Started cri-containerd-b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12.scope - libcontainer container b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12.
May 13 12:59:42.115135 containerd[1594]: time="2025-05-13T12:59:42.115099187Z" level=info msg="StartContainer for \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\" returns successfully"
May 13 12:59:42.364808 kubelet[2704]: I0513 12:59:42.364660 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-xs6bj" podStartSLOduration=83.364638893 podStartE2EDuration="1m23.364638893s" podCreationTimestamp="2025-05-13 12:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:59:42.363293822 +0000 UTC m=+88.893921839" watchObservedRunningTime="2025-05-13 12:59:42.364638893 +0000 UTC m=+88.895266890"
May 13 12:59:42.386964 kubelet[2704]: I0513 12:59:42.386885 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5d4cb57867-c2wmg" podStartSLOduration=71.193255407 podStartE2EDuration="1m13.386868834s" podCreationTimestamp="2025-05-13 12:58:29 +0000 UTC" firstStartedPulling="2025-05-13 12:59:39.791243861 +0000 UTC m=+86.321871858" lastFinishedPulling="2025-05-13 12:59:41.984857288 +0000 UTC m=+88.515485285" observedRunningTime="2025-05-13 12:59:42.386535406 +0000 UTC m=+88.917163393" watchObservedRunningTime="2025-05-13 12:59:42.386868834 +0000 UTC m=+88.917496831"
May 13 12:59:42.994455 systemd-networkd[1494]: cali53709d47b8c: Gained IPv6LL
May 13 12:59:43.202047 systemd[1]: Started sshd@20-10.0.0.121:22-10.0.0.1:42914.service - OpenSSH per-connection server daemon (10.0.0.1:42914).
May 13 12:59:43.256634 sshd[6151]: Accepted publickey for core from 10.0.0.1 port 42914 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo
May 13 12:59:43.258210 sshd-session[6151]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 12:59:43.263036 systemd-logind[1578]: New session 21 of user core.
May 13 12:59:43.273380 systemd[1]: Started session-21.scope - Session 21 of User core.
May 13 12:59:43.356114 kubelet[2704]: I0513 12:59:43.356071 2704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 13 12:59:43.400651 sshd[6153]: Connection closed by 10.0.0.1 port 42914
May 13 12:59:43.401004 sshd-session[6151]: pam_unix(sshd:session): session closed for user core
May 13 12:59:43.405839 systemd[1]: sshd@20-10.0.0.121:22-10.0.0.1:42914.service: Deactivated successfully.
May 13 12:59:43.408097 systemd[1]: session-21.scope: Deactivated successfully.
May 13 12:59:43.409056 systemd-logind[1578]: Session 21 logged out. Waiting for processes to exit.
May 13 12:59:43.410698 systemd-logind[1578]: Removed session 21.
May 13 12:59:44.556276 containerd[1594]: time="2025-05-13T12:59:44.556170543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7854cf7bc7-llgzn,Uid:13643b3d-f618-4901-ac95-9d9e3c0fb895,Namespace:calico-apiserver,Attempt:0,}"
May 13 12:59:44.858651 containerd[1594]: time="2025-05-13T12:59:44.858513585Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 12:59:44.868348 systemd-networkd[1494]: calic6ee26edbd9: Link UP
May 13 12:59:44.869013 systemd-networkd[1494]: calic6ee26edbd9: Gained carrier
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.782 [INFO][6179] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0 calico-apiserver-7854cf7bc7- calico-apiserver 13643b3d-f618-4901-ac95-9d9e3c0fb895 745 0 2025-05-13 12:58:30 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7854cf7bc7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7854cf7bc7-llgzn eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic6ee26edbd9 [] []}} ContainerID="744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" Namespace="calico-apiserver" Pod="calico-apiserver-7854cf7bc7-llgzn" WorkloadEndpoint="localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-"
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.783 [INFO][6179] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" Namespace="calico-apiserver" Pod="calico-apiserver-7854cf7bc7-llgzn" WorkloadEndpoint="localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0"
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.811 [INFO][6194] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" HandleID="k8s-pod-network.744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" Workload="localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0"
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.818 [INFO][6194] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" HandleID="k8s-pod-network.744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" Workload="localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002711e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7854cf7bc7-llgzn", "timestamp":"2025-05-13 12:59:44.811045306 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.818 [INFO][6194] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.818 [INFO][6194] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.818 [INFO][6194] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.820 [INFO][6194] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" host="localhost"
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.823 [INFO][6194] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.828 [INFO][6194] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.829 [INFO][6194] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.831 [INFO][6194] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.831 [INFO][6194] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" host="localhost"
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.832 [INFO][6194] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.857 [INFO][6194] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" host="localhost"
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.863 [INFO][6194] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" host="localhost"
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.863 [INFO][6194] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" host="localhost"
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.863 [INFO][6194] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 12:59:44.881882 containerd[1594]: 2025-05-13 12:59:44.863 [INFO][6194] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" HandleID="k8s-pod-network.744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" Workload="localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0"
May 13 12:59:44.882481 containerd[1594]: 2025-05-13 12:59:44.865 [INFO][6179] cni-plugin/k8s.go 386: Populated endpoint ContainerID="744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" Namespace="calico-apiserver" Pod="calico-apiserver-7854cf7bc7-llgzn" WorkloadEndpoint="localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0", GenerateName:"calico-apiserver-7854cf7bc7-", Namespace:"calico-apiserver", SelfLink:"", UID:"13643b3d-f618-4901-ac95-9d9e3c0fb895", ResourceVersion:"745", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7854cf7bc7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7854cf7bc7-llgzn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic6ee26edbd9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 12:59:44.882481 containerd[1594]: 2025-05-13 12:59:44.866 [INFO][6179] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.135/32] ContainerID="744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" Namespace="calico-apiserver" Pod="calico-apiserver-7854cf7bc7-llgzn" WorkloadEndpoint="localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0"
May 13 12:59:44.882481 containerd[1594]: 2025-05-13 12:59:44.866 [INFO][6179] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic6ee26edbd9 ContainerID="744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" Namespace="calico-apiserver" Pod="calico-apiserver-7854cf7bc7-llgzn" WorkloadEndpoint="localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0"
May 13 12:59:44.882481 containerd[1594]: 2025-05-13 12:59:44.868 [INFO][6179] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" Namespace="calico-apiserver" Pod="calico-apiserver-7854cf7bc7-llgzn" WorkloadEndpoint="localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0"
May 13 12:59:44.882481 containerd[1594]: 2025-05-13 12:59:44.868 [INFO][6179] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" Namespace="calico-apiserver" Pod="calico-apiserver-7854cf7bc7-llgzn" WorkloadEndpoint="localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0", GenerateName:"calico-apiserver-7854cf7bc7-", Namespace:"calico-apiserver", SelfLink:"", UID:"13643b3d-f618-4901-ac95-9d9e3c0fb895", ResourceVersion:"745", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 12, 58, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7854cf7bc7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e", Pod:"calico-apiserver-7854cf7bc7-llgzn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic6ee26edbd9", MAC:"0e:99:e3:80:00:3f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
May 13 12:59:44.882481 containerd[1594]: 2025-05-13 12:59:44.877 [INFO][6179] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" Namespace="calico-apiserver" Pod="calico-apiserver-7854cf7bc7-llgzn" WorkloadEndpoint="localhost-k8s-calico--apiserver--7854cf7bc7--llgzn-eth0"
May 13 12:59:45.179204 containerd[1594]: time="2025-05-13T12:59:45.179041410Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138"
May 13 12:59:45.182816 containerd[1594]: time="2025-05-13T12:59:45.182764338Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 12:59:45.185869 containerd[1594]: time="2025-05-13T12:59:45.185809371Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 12:59:45.186174 containerd[1594]: time="2025-05-13T12:59:45.186118362Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 3.201071351s"
May 13 12:59:45.186174 containerd[1594]: time="2025-05-13T12:59:45.186168317Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\""
May 13 12:59:45.191613 containerd[1594]: time="2025-05-13T12:59:45.191575220Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\""
May 13 12:59:45.204337 containerd[1594]: time="2025-05-13T12:59:45.203886795Z" level=info msg="CreateContainer within sandbox \"90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
May 13 12:59:45.211606 containerd[1594]: time="2025-05-13T12:59:45.211558353Z" level=info msg="connecting to shim 744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e" address="unix:///run/containerd/s/30246a7b88bc4da5f42871904e944068b8fa50df85f8ca5f6cb7725fedac3ae6" namespace=k8s.io protocol=ttrpc version=3
May 13 12:59:45.212290 containerd[1594]: time="2025-05-13T12:59:45.212212742Z" level=info msg="Container 63cb458312265182c0fcd04e2846b338593b86c5c19625d74eea4b58b6c5cc2f: CDI devices from CRI Config.CDIDevices: []"
May 13 12:59:45.221978 containerd[1594]: time="2025-05-13T12:59:45.221943100Z" level=info msg="CreateContainer within sandbox \"90d3b65d9eb49c9b96ae64056fbd58ebfbb8430e7a56a52e0a6397cc8771aba1\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"63cb458312265182c0fcd04e2846b338593b86c5c19625d74eea4b58b6c5cc2f\""
May 13 12:59:45.226008 containerd[1594]: time="2025-05-13T12:59:45.225978644Z" level=info msg="StartContainer for \"63cb458312265182c0fcd04e2846b338593b86c5c19625d74eea4b58b6c5cc2f\""
May 13 12:59:45.226986 containerd[1594]: time="2025-05-13T12:59:45.226939329Z" level=info msg="connecting to shim 63cb458312265182c0fcd04e2846b338593b86c5c19625d74eea4b58b6c5cc2f" address="unix:///run/containerd/s/9c68245aeddc83d336a98af2842e00c2ccc7513d49fa40d9bf6b05d873c6fc00" protocol=ttrpc version=3
May 13 12:59:45.273382 systemd[1]: Started cri-containerd-63cb458312265182c0fcd04e2846b338593b86c5c19625d74eea4b58b6c5cc2f.scope - libcontainer container 63cb458312265182c0fcd04e2846b338593b86c5c19625d74eea4b58b6c5cc2f.
May 13 12:59:45.277684 systemd[1]: Started cri-containerd-744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e.scope - libcontainer container 744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e.
May 13 12:59:45.301036 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 13 12:59:45.334452 containerd[1594]: time="2025-05-13T12:59:45.334347286Z" level=info msg="StartContainer for \"63cb458312265182c0fcd04e2846b338593b86c5c19625d74eea4b58b6c5cc2f\" returns successfully"
May 13 12:59:45.346976 containerd[1594]: time="2025-05-13T12:59:45.346926723Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7854cf7bc7-llgzn,Uid:13643b3d-f618-4901-ac95-9d9e3c0fb895,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e\""
May 13 12:59:45.353125 containerd[1594]: time="2025-05-13T12:59:45.352980571Z" level=info msg="CreateContainer within sandbox \"744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 13 12:59:45.365277 containerd[1594]: time="2025-05-13T12:59:45.365206504Z" level=info msg="Container 9451bc17e566786f05ecba1a1d8eea4d25efc8db60a908f08d6f21681fc2b654: CDI devices from CRI Config.CDIDevices: []"
May 13 12:59:45.382023 containerd[1594]: time="2025-05-13T12:59:45.381967655Z" level=info msg="CreateContainer within sandbox \"744e82e6098259a8195460dffc1afd6159f68b9eadc4f42f221fd2f64134b58e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"9451bc17e566786f05ecba1a1d8eea4d25efc8db60a908f08d6f21681fc2b654\""
May 13 12:59:45.388812 kubelet[2704]: I0513 12:59:45.388652 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-647bb7c459-594xm" podStartSLOduration=70.084448818 podStartE2EDuration="1m15.388609045s"
podCreationTimestamp="2025-05-13 12:58:30 +0000 UTC" firstStartedPulling="2025-05-13 12:59:39.88711485 +0000 UTC m=+86.417742847" lastFinishedPulling="2025-05-13 12:59:45.191275077 +0000 UTC m=+91.721903074" observedRunningTime="2025-05-13 12:59:45.387630627 +0000 UTC m=+91.918258644" watchObservedRunningTime="2025-05-13 12:59:45.388609045 +0000 UTC m=+91.919237042" May 13 12:59:45.391802 containerd[1594]: time="2025-05-13T12:59:45.391756634Z" level=info msg="StartContainer for \"9451bc17e566786f05ecba1a1d8eea4d25efc8db60a908f08d6f21681fc2b654\"" May 13 12:59:45.393442 containerd[1594]: time="2025-05-13T12:59:45.393405963Z" level=info msg="connecting to shim 9451bc17e566786f05ecba1a1d8eea4d25efc8db60a908f08d6f21681fc2b654" address="unix:///run/containerd/s/30246a7b88bc4da5f42871904e944068b8fa50df85f8ca5f6cb7725fedac3ae6" protocol=ttrpc version=3 May 13 12:59:45.418713 systemd[1]: Started cri-containerd-9451bc17e566786f05ecba1a1d8eea4d25efc8db60a908f08d6f21681fc2b654.scope - libcontainer container 9451bc17e566786f05ecba1a1d8eea4d25efc8db60a908f08d6f21681fc2b654. 
May 13 12:59:45.430218 containerd[1594]: time="2025-05-13T12:59:45.430118616Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63cb458312265182c0fcd04e2846b338593b86c5c19625d74eea4b58b6c5cc2f\" id:\"40029c45c991ea0de878f697cf7c05e6d69ec9901c69e4b51018b8b0a2740c5c\" pid:6316 exit_status:1 exited_at:{seconds:1747141185 nanos:429619633}" May 13 12:59:45.475693 containerd[1594]: time="2025-05-13T12:59:45.475662571Z" level=info msg="StartContainer for \"9451bc17e566786f05ecba1a1d8eea4d25efc8db60a908f08d6f21681fc2b654\" returns successfully" May 13 12:59:45.639666 containerd[1594]: time="2025-05-13T12:59:45.639602772Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 12:59:45.640460 containerd[1594]: time="2025-05-13T12:59:45.640434870Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 13 12:59:45.642298 containerd[1594]: time="2025-05-13T12:59:45.642231500Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 450.621964ms" May 13 12:59:45.642298 containerd[1594]: time="2025-05-13T12:59:45.642289431Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 13 12:59:45.644461 containerd[1594]: time="2025-05-13T12:59:45.644414489Z" level=info msg="CreateContainer within sandbox \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 12:59:45.656284 containerd[1594]: 
time="2025-05-13T12:59:45.655526173Z" level=info msg="Container 79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4: CDI devices from CRI Config.CDIDevices: []" May 13 12:59:45.663636 containerd[1594]: time="2025-05-13T12:59:45.663589578Z" level=info msg="CreateContainer within sandbox \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\"" May 13 12:59:45.664110 containerd[1594]: time="2025-05-13T12:59:45.664071338Z" level=info msg="StartContainer for \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\"" May 13 12:59:45.665162 containerd[1594]: time="2025-05-13T12:59:45.665103588Z" level=info msg="connecting to shim 79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4" address="unix:///run/containerd/s/bfda057a6975cb06251878d9fef87207c9b38527568bf903d28ee137e747f220" protocol=ttrpc version=3 May 13 12:59:45.690467 systemd[1]: Started cri-containerd-79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4.scope - libcontainer container 79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4. 
May 13 12:59:45.740856 containerd[1594]: time="2025-05-13T12:59:45.740813517Z" level=info msg="StartContainer for \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\" returns successfully" May 13 12:59:46.384433 kubelet[2704]: I0513 12:59:46.384271 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5d4cb57867-2r646" podStartSLOduration=71.724171345 podStartE2EDuration="1m17.384225099s" podCreationTimestamp="2025-05-13 12:58:29 +0000 UTC" firstStartedPulling="2025-05-13 12:59:39.983009935 +0000 UTC m=+86.513637922" lastFinishedPulling="2025-05-13 12:59:45.643063679 +0000 UTC m=+92.173691676" observedRunningTime="2025-05-13 12:59:46.382750585 +0000 UTC m=+92.913378582" watchObservedRunningTime="2025-05-13 12:59:46.384225099 +0000 UTC m=+92.914853096" May 13 12:59:46.418107 kubelet[2704]: I0513 12:59:46.418029 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7854cf7bc7-llgzn" podStartSLOduration=76.417993012 podStartE2EDuration="1m16.417993012s" podCreationTimestamp="2025-05-13 12:58:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 12:59:46.398633486 +0000 UTC m=+92.929261483" watchObservedRunningTime="2025-05-13 12:59:46.417993012 +0000 UTC m=+92.948621009" May 13 12:59:46.451801 containerd[1594]: time="2025-05-13T12:59:46.451755425Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63cb458312265182c0fcd04e2846b338593b86c5c19625d74eea4b58b6c5cc2f\" id:\"52f1cdd37d742e58589039d233492442abee21b5b6ce4a444482dfb277918afe\" pid:6409 exited_at:{seconds:1747141186 nanos:450540977}" May 13 12:59:46.770479 systemd-networkd[1494]: calic6ee26edbd9: Gained IPv6LL May 13 12:59:48.043077 containerd[1594]: time="2025-05-13T12:59:48.043031514Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"63cb458312265182c0fcd04e2846b338593b86c5c19625d74eea4b58b6c5cc2f\" id:\"daeefa7f33371c35edf52eb7cf38d37848678283cdb7fe39167c9299a6b5420b\" pid:6436 exited_at:{seconds:1747141188 nanos:42782559}" May 13 12:59:48.379173 containerd[1594]: time="2025-05-13T12:59:48.379026217Z" level=info msg="StopContainer for \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\" with timeout 30 (s)" May 13 12:59:48.383401 containerd[1594]: time="2025-05-13T12:59:48.383363657Z" level=info msg="Stop container \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\" with signal terminated" May 13 12:59:48.395666 systemd[1]: cri-containerd-79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4.scope: Deactivated successfully. May 13 12:59:48.397407 containerd[1594]: time="2025-05-13T12:59:48.397347542Z" level=info msg="received exit event container_id:\"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\" id:\"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\" pid:6374 exit_status:1 exited_at:{seconds:1747141188 nanos:397086655}" May 13 12:59:48.397479 containerd[1594]: time="2025-05-13T12:59:48.397431963Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\" id:\"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\" pid:6374 exit_status:1 exited_at:{seconds:1747141188 nanos:397086655}" May 13 12:59:48.415826 systemd[1]: Started sshd@21-10.0.0.121:22-10.0.0.1:52816.service - OpenSSH per-connection server daemon (10.0.0.1:52816). May 13 12:59:48.423612 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4-rootfs.mount: Deactivated successfully. 
May 13 12:59:48.434560 containerd[1594]: time="2025-05-13T12:59:48.434451584Z" level=info msg="StopContainer for \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\" returns successfully" May 13 12:59:48.435028 containerd[1594]: time="2025-05-13T12:59:48.435002825Z" level=info msg="StopPodSandbox for \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\"" May 13 12:59:48.435080 containerd[1594]: time="2025-05-13T12:59:48.435050275Z" level=info msg="Container to stop \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 13 12:59:48.443062 systemd[1]: cri-containerd-97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5.scope: Deactivated successfully. May 13 12:59:48.444484 containerd[1594]: time="2025-05-13T12:59:48.444440682Z" level=info msg="TaskExit event in podsandbox handler container_id:\"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\" id:\"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\" pid:5852 exit_status:137 exited_at:{seconds:1747141188 nanos:444063543}" May 13 12:59:48.470811 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5-rootfs.mount: Deactivated successfully. 
May 13 12:59:48.474068 containerd[1594]: time="2025-05-13T12:59:48.474030465Z" level=info msg="shim disconnected" id=97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5 namespace=k8s.io May 13 12:59:48.474068 containerd[1594]: time="2025-05-13T12:59:48.474061755Z" level=warning msg="cleaning up after shim disconnected" id=97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5 namespace=k8s.io May 13 12:59:48.474235 containerd[1594]: time="2025-05-13T12:59:48.474070422Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 13 12:59:48.475680 sshd[6466]: Accepted publickey for core from 10.0.0.1 port 52816 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo May 13 12:59:48.477781 sshd-session[6466]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 12:59:48.483403 systemd-logind[1578]: New session 22 of user core. May 13 12:59:48.487532 containerd[1594]: time="2025-05-13T12:59:48.487497746Z" level=info msg="received exit event sandbox_id:\"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\" exit_status:137 exited_at:{seconds:1747141188 nanos:444063543}" May 13 12:59:48.489385 systemd[1]: Started session-22.scope - Session 22 of User core. May 13 12:59:48.491689 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5-shm.mount: Deactivated successfully. May 13 12:59:48.538802 systemd-networkd[1494]: cali8c8c4ae4fde: Link DOWN May 13 12:59:48.538882 systemd-networkd[1494]: cali8c8c4ae4fde: Lost carrier May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.536 [INFO][6525] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.536 [INFO][6525] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" iface="eth0" netns="/var/run/netns/cni-ad0fc161-9e64-c623-e660-4415b57346ef" May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.537 [INFO][6525] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" iface="eth0" netns="/var/run/netns/cni-ad0fc161-9e64-c623-e660-4415b57346ef" May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.545 [INFO][6525] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" after=8.739444ms iface="eth0" netns="/var/run/netns/cni-ad0fc161-9e64-c623-e660-4415b57346ef" May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.545 [INFO][6525] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.545 [INFO][6525] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.573 [INFO][6539] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" HandleID="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Workload="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.573 [INFO][6539] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.573 [INFO][6539] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.608 [INFO][6539] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" HandleID="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Workload="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.608 [INFO][6539] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" HandleID="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Workload="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0" May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.609 [INFO][6539] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 12:59:48.616972 containerd[1594]: 2025-05-13 12:59:48.613 [INFO][6525] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" May 13 12:59:48.618473 containerd[1594]: time="2025-05-13T12:59:48.618420997Z" level=info msg="TearDown network for sandbox \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\" successfully" May 13 12:59:48.618473 containerd[1594]: time="2025-05-13T12:59:48.618457797Z" level=info msg="StopPodSandbox for \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\" returns successfully" May 13 12:59:48.621716 systemd[1]: run-netns-cni\x2dad0fc161\x2d9e64\x2dc623\x2de660\x2d4415b57346ef.mount: Deactivated successfully. May 13 12:59:48.647629 sshd[6514]: Connection closed by 10.0.0.1 port 52816 May 13 12:59:48.649029 sshd-session[6466]: pam_unix(sshd:session): session closed for user core May 13 12:59:48.653812 systemd-logind[1578]: Session 22 logged out. Waiting for processes to exit. 
May 13 12:59:48.654085 systemd[1]: sshd@21-10.0.0.121:22-10.0.0.1:52816.service: Deactivated successfully. May 13 12:59:48.655947 systemd[1]: session-22.scope: Deactivated successfully. May 13 12:59:48.657865 systemd-logind[1578]: Removed session 22. May 13 12:59:48.732486 kubelet[2704]: I0513 12:59:48.732369 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gl5b\" (UniqueName: \"kubernetes.io/projected/509974ac-8c85-45a5-a90c-6e89c9a54779-kube-api-access-2gl5b\") pod \"509974ac-8c85-45a5-a90c-6e89c9a54779\" (UID: \"509974ac-8c85-45a5-a90c-6e89c9a54779\") " May 13 12:59:48.732486 kubelet[2704]: I0513 12:59:48.732424 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/509974ac-8c85-45a5-a90c-6e89c9a54779-calico-apiserver-certs\") pod \"509974ac-8c85-45a5-a90c-6e89c9a54779\" (UID: \"509974ac-8c85-45a5-a90c-6e89c9a54779\") " May 13 12:59:48.736301 kubelet[2704]: I0513 12:59:48.736246 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/509974ac-8c85-45a5-a90c-6e89c9a54779-kube-api-access-2gl5b" (OuterVolumeSpecName: "kube-api-access-2gl5b") pod "509974ac-8c85-45a5-a90c-6e89c9a54779" (UID: "509974ac-8c85-45a5-a90c-6e89c9a54779"). InnerVolumeSpecName "kube-api-access-2gl5b". PluginName "kubernetes.io/projected", VolumeGidValue "" May 13 12:59:48.737502 kubelet[2704]: I0513 12:59:48.737445 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/509974ac-8c85-45a5-a90c-6e89c9a54779-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "509974ac-8c85-45a5-a90c-6e89c9a54779" (UID: "509974ac-8c85-45a5-a90c-6e89c9a54779"). InnerVolumeSpecName "calico-apiserver-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" May 13 12:59:48.739311 systemd[1]: var-lib-kubelet-pods-509974ac\x2d8c85\x2d45a5\x2da90c\x2d6e89c9a54779-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d2gl5b.mount: Deactivated successfully. May 13 12:59:48.739468 systemd[1]: var-lib-kubelet-pods-509974ac\x2d8c85\x2d45a5\x2da90c\x2d6e89c9a54779-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. May 13 12:59:48.832873 kubelet[2704]: I0513 12:59:48.832802 2704 reconciler_common.go:288] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/509974ac-8c85-45a5-a90c-6e89c9a54779-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" May 13 12:59:48.832873 kubelet[2704]: I0513 12:59:48.832840 2704 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-2gl5b\" (UniqueName: \"kubernetes.io/projected/509974ac-8c85-45a5-a90c-6e89c9a54779-kube-api-access-2gl5b\") on node \"localhost\" DevicePath \"\"" May 13 12:59:49.381767 kubelet[2704]: I0513 12:59:49.381717 2704 scope.go:117] "RemoveContainer" containerID="79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4" May 13 12:59:49.383528 containerd[1594]: time="2025-05-13T12:59:49.383496038Z" level=info msg="RemoveContainer for \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\"" May 13 12:59:49.387767 systemd[1]: Removed slice kubepods-besteffort-pod509974ac_8c85_45a5_a90c_6e89c9a54779.slice - libcontainer container kubepods-besteffort-pod509974ac_8c85_45a5_a90c_6e89c9a54779.slice. 
May 13 12:59:49.511333 containerd[1594]: time="2025-05-13T12:59:49.511289886Z" level=info msg="RemoveContainer for \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\" returns successfully" May 13 12:59:49.511609 kubelet[2704]: I0513 12:59:49.511562 2704 scope.go:117] "RemoveContainer" containerID="79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4" May 13 12:59:49.511945 containerd[1594]: time="2025-05-13T12:59:49.511883808Z" level=error msg="ContainerStatus for \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\": not found" May 13 12:59:49.512121 kubelet[2704]: E0513 12:59:49.512052 2704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\": not found" containerID="79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4" May 13 12:59:49.512183 kubelet[2704]: I0513 12:59:49.512079 2704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4"} err="failed to get container status \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\": rpc error: code = NotFound desc = an error occurred when try to find container \"79fe7b1f07cb8a9d3fbafe9d067c466b7017f50d1966b7e758046aee948cebc4\": not found" May 13 12:59:49.553616 kubelet[2704]: I0513 12:59:49.553567 2704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="509974ac-8c85-45a5-a90c-6e89c9a54779" path="/var/lib/kubelet/pods/509974ac-8c85-45a5-a90c-6e89c9a54779/volumes" May 13 12:59:52.846899 containerd[1594]: time="2025-05-13T12:59:52.846826752Z" level=info msg="StopContainer for 
\"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\" with timeout 30 (s)" May 13 12:59:52.848491 containerd[1594]: time="2025-05-13T12:59:52.848355983Z" level=info msg="Stop container \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\" with signal terminated" May 13 12:59:52.859778 systemd[1]: cri-containerd-b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12.scope: Deactivated successfully. May 13 12:59:52.863651 containerd[1594]: time="2025-05-13T12:59:52.863595499Z" level=info msg="received exit event container_id:\"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\" id:\"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\" pid:6124 exit_status:1 exited_at:{seconds:1747141192 nanos:863053467}" May 13 12:59:52.863784 containerd[1594]: time="2025-05-13T12:59:52.863710098Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\" id:\"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\" pid:6124 exit_status:1 exited_at:{seconds:1747141192 nanos:863053467}" May 13 12:59:52.885382 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12-rootfs.mount: Deactivated successfully. 
May 13 12:59:52.901580 containerd[1594]: time="2025-05-13T12:59:52.901542721Z" level=info msg="StopContainer for \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\" returns successfully" May 13 12:59:52.902078 containerd[1594]: time="2025-05-13T12:59:52.902039307Z" level=info msg="StopPodSandbox for \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\"" May 13 12:59:52.902127 containerd[1594]: time="2025-05-13T12:59:52.902103830Z" level=info msg="Container to stop \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 13 12:59:52.909328 systemd[1]: cri-containerd-8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33.scope: Deactivated successfully. May 13 12:59:52.911421 containerd[1594]: time="2025-05-13T12:59:52.911384290Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\" id:\"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\" pid:5730 exit_status:137 exited_at:{seconds:1747141192 nanos:911131709}" May 13 12:59:52.937662 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33-rootfs.mount: Deactivated successfully. 
May 13 12:59:52.968234 containerd[1594]: time="2025-05-13T12:59:52.968196931Z" level=info msg="shim disconnected" id=8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33 namespace=k8s.io May 13 12:59:52.968234 containerd[1594]: time="2025-05-13T12:59:52.968233450Z" level=warning msg="cleaning up after shim disconnected" id=8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33 namespace=k8s.io May 13 12:59:52.968445 containerd[1594]: time="2025-05-13T12:59:52.968242078Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 13 12:59:52.979887 containerd[1594]: time="2025-05-13T12:59:52.979846102Z" level=info msg="received exit event sandbox_id:\"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\" exit_status:137 exited_at:{seconds:1747141192 nanos:911131709}" May 13 12:59:52.982564 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33-shm.mount: Deactivated successfully. May 13 12:59:53.020604 systemd-networkd[1494]: cali838914800de: Link DOWN May 13 12:59:53.020613 systemd-networkd[1494]: cali838914800de: Lost carrier May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.019 [INFO][6647] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.019 [INFO][6647] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" iface="eth0" netns="/var/run/netns/cni-21664547-30d9-bbba-1fe9-68b55f7408d1" May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.019 [INFO][6647] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" iface="eth0" netns="/var/run/netns/cni-21664547-30d9-bbba-1fe9-68b55f7408d1" May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.026 [INFO][6647] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" after=7.27163ms iface="eth0" netns="/var/run/netns/cni-21664547-30d9-bbba-1fe9-68b55f7408d1" May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.026 [INFO][6647] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.026 [INFO][6647] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.046 [INFO][6658] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" HandleID="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Workload="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.046 [INFO][6658] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.047 [INFO][6658] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.071 [INFO][6658] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" HandleID="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Workload="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.071 [INFO][6658] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" HandleID="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Workload="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0" May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.072 [INFO][6658] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 12:59:53.077220 containerd[1594]: 2025-05-13 12:59:53.074 [INFO][6647] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" May 13 12:59:53.077643 containerd[1594]: time="2025-05-13T12:59:53.077458856Z" level=info msg="TearDown network for sandbox \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\" successfully" May 13 12:59:53.077643 containerd[1594]: time="2025-05-13T12:59:53.077482862Z" level=info msg="StopPodSandbox for \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\" returns successfully" May 13 12:59:53.080005 systemd[1]: run-netns-cni\x2d21664547\x2d30d9\x2dbbba\x2d1fe9\x2d68b55f7408d1.mount: Deactivated successfully. 
May 13 12:59:53.259094 kubelet[2704]: I0513 12:59:53.259049 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c6jv\" (UniqueName: \"kubernetes.io/projected/a9d392a7-21ed-46f8-a204-24615b7b6d08-kube-api-access-4c6jv\") pod \"a9d392a7-21ed-46f8-a204-24615b7b6d08\" (UID: \"a9d392a7-21ed-46f8-a204-24615b7b6d08\") "
May 13 12:59:53.259538 kubelet[2704]: I0513 12:59:53.259105 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a9d392a7-21ed-46f8-a204-24615b7b6d08-calico-apiserver-certs\") pod \"a9d392a7-21ed-46f8-a204-24615b7b6d08\" (UID: \"a9d392a7-21ed-46f8-a204-24615b7b6d08\") "
May 13 12:59:53.262201 kubelet[2704]: I0513 12:59:53.262158 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d392a7-21ed-46f8-a204-24615b7b6d08-kube-api-access-4c6jv" (OuterVolumeSpecName: "kube-api-access-4c6jv") pod "a9d392a7-21ed-46f8-a204-24615b7b6d08" (UID: "a9d392a7-21ed-46f8-a204-24615b7b6d08"). InnerVolumeSpecName "kube-api-access-4c6jv". PluginName "kubernetes.io/projected", VolumeGidValue ""
May 13 12:59:53.262464 kubelet[2704]: I0513 12:59:53.262403 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d392a7-21ed-46f8-a204-24615b7b6d08-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "a9d392a7-21ed-46f8-a204-24615b7b6d08" (UID: "a9d392a7-21ed-46f8-a204-24615b7b6d08"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
May 13 12:59:53.263935 systemd[1]: var-lib-kubelet-pods-a9d392a7\x2d21ed\x2d46f8\x2da204\x2d24615b7b6d08-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d4c6jv.mount: Deactivated successfully.
May 13 12:59:53.264058 systemd[1]: var-lib-kubelet-pods-a9d392a7\x2d21ed\x2d46f8\x2da204\x2d24615b7b6d08-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully.
May 13 12:59:53.359887 kubelet[2704]: I0513 12:59:53.359847 2704 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-4c6jv\" (UniqueName: \"kubernetes.io/projected/a9d392a7-21ed-46f8-a204-24615b7b6d08-kube-api-access-4c6jv\") on node \"localhost\" DevicePath \"\""
May 13 12:59:53.359887 kubelet[2704]: I0513 12:59:53.359866 2704 reconciler_common.go:288] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a9d392a7-21ed-46f8-a204-24615b7b6d08-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\""
May 13 12:59:53.391111 kubelet[2704]: I0513 12:59:53.391086 2704 scope.go:117] "RemoveContainer" containerID="b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12"
May 13 12:59:53.392867 containerd[1594]: time="2025-05-13T12:59:53.392829678Z" level=info msg="RemoveContainer for \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\""
May 13 12:59:53.396835 systemd[1]: Removed slice kubepods-besteffort-poda9d392a7_21ed_46f8_a204_24615b7b6d08.slice - libcontainer container kubepods-besteffort-poda9d392a7_21ed_46f8_a204_24615b7b6d08.slice.
May 13 12:59:53.397965 containerd[1594]: time="2025-05-13T12:59:53.397939272Z" level=info msg="RemoveContainer for \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\" returns successfully"
May 13 12:59:53.398178 kubelet[2704]: I0513 12:59:53.398110 2704 scope.go:117] "RemoveContainer" containerID="b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12"
May 13 12:59:53.398338 containerd[1594]: time="2025-05-13T12:59:53.398312313Z" level=error msg="ContainerStatus for \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\": not found"
May 13 12:59:53.398449 kubelet[2704]: E0513 12:59:53.398429 2704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\": not found" containerID="b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12"
May 13 12:59:53.398490 kubelet[2704]: I0513 12:59:53.398454 2704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12"} err="failed to get container status \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\": rpc error: code = NotFound desc = an error occurred when try to find container \"b70b8c0f82a39731b34fad9480408cd3bf012cb573ed6ae97a34eb690169fa12\": not found"
May 13 12:59:53.549528 kubelet[2704]: I0513 12:59:53.549071 2704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d392a7-21ed-46f8-a204-24615b7b6d08" path="/var/lib/kubelet/pods/a9d392a7-21ed-46f8-a204-24615b7b6d08/volumes"
May 13 12:59:53.669444 systemd[1]: Started sshd@22-10.0.0.121:22-10.0.0.1:52826.service - OpenSSH per-connection server daemon (10.0.0.1:52826).
May 13 12:59:53.707500 sshd[6673]: Accepted publickey for core from 10.0.0.1 port 52826 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo
May 13 12:59:53.709346 sshd-session[6673]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 12:59:53.713902 systemd-logind[1578]: New session 23 of user core.
May 13 12:59:53.719384 systemd[1]: Started session-23.scope - Session 23 of User core.
May 13 12:59:53.833632 sshd[6675]: Connection closed by 10.0.0.1 port 52826
May 13 12:59:53.834113 sshd-session[6673]: pam_unix(sshd:session): session closed for user core
May 13 12:59:53.843876 systemd[1]: sshd@22-10.0.0.121:22-10.0.0.1:52826.service: Deactivated successfully.
May 13 12:59:53.845849 systemd[1]: session-23.scope: Deactivated successfully.
May 13 12:59:53.846616 systemd-logind[1578]: Session 23 logged out. Waiting for processes to exit.
May 13 12:59:53.849568 systemd[1]: Started sshd@23-10.0.0.121:22-10.0.0.1:52840.service - OpenSSH per-connection server daemon (10.0.0.1:52840).
May 13 12:59:53.850431 systemd-logind[1578]: Removed session 23.
May 13 12:59:53.898108 sshd[6689]: Accepted publickey for core from 10.0.0.1 port 52840 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo
May 13 12:59:53.899816 sshd-session[6689]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 12:59:53.903911 systemd-logind[1578]: New session 24 of user core.
May 13 12:59:53.915409 systemd[1]: Started session-24.scope - Session 24 of User core.
May 13 12:59:54.219978 sshd[6693]: Connection closed by 10.0.0.1 port 52840
May 13 12:59:54.220499 sshd-session[6689]: pam_unix(sshd:session): session closed for user core
May 13 12:59:54.230112 systemd[1]: sshd@23-10.0.0.121:22-10.0.0.1:52840.service: Deactivated successfully.
May 13 12:59:54.232932 systemd[1]: session-24.scope: Deactivated successfully.
May 13 12:59:54.233977 systemd-logind[1578]: Session 24 logged out. Waiting for processes to exit.
May 13 12:59:54.237704 systemd[1]: Started sshd@24-10.0.0.121:22-10.0.0.1:52856.service - OpenSSH per-connection server daemon (10.0.0.1:52856).
May 13 12:59:54.238735 systemd-logind[1578]: Removed session 24.
May 13 12:59:54.302801 sshd[6704]: Accepted publickey for core from 10.0.0.1 port 52856 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo
May 13 12:59:54.304406 sshd-session[6704]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 12:59:54.308962 systemd-logind[1578]: New session 25 of user core.
May 13 12:59:54.323400 systemd[1]: Started session-25.scope - Session 25 of User core.
May 13 12:59:55.773414 sshd[6706]: Connection closed by 10.0.0.1 port 52856
May 13 12:59:55.774794 sshd-session[6704]: pam_unix(sshd:session): session closed for user core
May 13 12:59:55.785023 systemd[1]: sshd@24-10.0.0.121:22-10.0.0.1:52856.service: Deactivated successfully.
May 13 12:59:55.787009 systemd[1]: session-25.scope: Deactivated successfully.
May 13 12:59:55.788090 systemd[1]: session-25.scope: Consumed 588ms CPU time, 68.7M memory peak.
May 13 12:59:55.791729 systemd-logind[1578]: Session 25 logged out. Waiting for processes to exit.
May 13 12:59:55.796391 systemd[1]: Started sshd@25-10.0.0.121:22-10.0.0.1:52866.service - OpenSSH per-connection server daemon (10.0.0.1:52866).
May 13 12:59:55.797196 systemd-logind[1578]: Removed session 25.
May 13 12:59:55.848598 sshd[6734]: Accepted publickey for core from 10.0.0.1 port 52866 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo
May 13 12:59:55.850964 sshd-session[6734]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 12:59:55.864366 systemd-logind[1578]: New session 26 of user core.
May 13 12:59:55.868432 systemd[1]: Started session-26.scope - Session 26 of User core.
May 13 12:59:56.082824 sshd[6736]: Connection closed by 10.0.0.1 port 52866
May 13 12:59:56.081798 sshd-session[6734]: pam_unix(sshd:session): session closed for user core
May 13 12:59:56.094061 systemd[1]: sshd@25-10.0.0.121:22-10.0.0.1:52866.service: Deactivated successfully.
May 13 12:59:56.096035 systemd[1]: session-26.scope: Deactivated successfully.
May 13 12:59:56.096858 systemd-logind[1578]: Session 26 logged out. Waiting for processes to exit.
May 13 12:59:56.099854 systemd[1]: Started sshd@26-10.0.0.121:22-10.0.0.1:52870.service - OpenSSH per-connection server daemon (10.0.0.1:52870).
May 13 12:59:56.100696 systemd-logind[1578]: Removed session 26.
May 13 12:59:56.146857 sshd[6747]: Accepted publickey for core from 10.0.0.1 port 52870 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo
May 13 12:59:56.148461 sshd-session[6747]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 12:59:56.152924 systemd-logind[1578]: New session 27 of user core.
May 13 12:59:56.170380 systemd[1]: Started session-27.scope - Session 27 of User core.
May 13 12:59:56.279805 sshd[6749]: Connection closed by 10.0.0.1 port 52870
May 13 12:59:56.280116 sshd-session[6747]: pam_unix(sshd:session): session closed for user core
May 13 12:59:56.283813 systemd[1]: sshd@26-10.0.0.121:22-10.0.0.1:52870.service: Deactivated successfully.
May 13 12:59:56.285551 systemd[1]: session-27.scope: Deactivated successfully.
May 13 12:59:56.286321 systemd-logind[1578]: Session 27 logged out. Waiting for processes to exit.
May 13 12:59:56.287509 systemd-logind[1578]: Removed session 27.
May 13 13:00:00.958223 containerd[1594]: time="2025-05-13T13:00:00.958161036Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc083029df11f7ad84b28fc4c5f2751db22e7691f20c8534b4c21033ab5fff8a\" id:\"548ac8faf9acceba1bd8ae03e39fc960afd9c8eb814df6cf9c03a75cdc9bb822\" pid:6776 exited_at:{seconds:1747141200 nanos:957909027}"
May 13 13:00:01.296699 systemd[1]: Started sshd@27-10.0.0.121:22-10.0.0.1:41502.service - OpenSSH per-connection server daemon (10.0.0.1:41502).
May 13 13:00:01.341643 sshd[6789]: Accepted publickey for core from 10.0.0.1 port 41502 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo
May 13 13:00:01.343187 sshd-session[6789]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 13:00:01.347611 systemd-logind[1578]: New session 28 of user core.
May 13 13:00:01.358384 systemd[1]: Started session-28.scope - Session 28 of User core.
May 13 13:00:01.472470 sshd[6792]: Connection closed by 10.0.0.1 port 41502
May 13 13:00:01.472781 sshd-session[6789]: pam_unix(sshd:session): session closed for user core
May 13 13:00:01.477116 systemd[1]: sshd@27-10.0.0.121:22-10.0.0.1:41502.service: Deactivated successfully.
May 13 13:00:01.479229 systemd[1]: session-28.scope: Deactivated successfully.
May 13 13:00:01.480150 systemd-logind[1578]: Session 28 logged out. Waiting for processes to exit.
May 13 13:00:01.481389 systemd-logind[1578]: Removed session 28.
May 13 13:00:06.488184 systemd[1]: Started sshd@28-10.0.0.121:22-10.0.0.1:41508.service - OpenSSH per-connection server daemon (10.0.0.1:41508).
May 13 13:00:06.540820 sshd[6812]: Accepted publickey for core from 10.0.0.1 port 41508 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo
May 13 13:00:06.542483 sshd-session[6812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 13:00:06.547224 systemd-logind[1578]: New session 29 of user core.
May 13 13:00:06.552462 systemd[1]: Started session-29.scope - Session 29 of User core.
May 13 13:00:06.657479 sshd[6816]: Connection closed by 10.0.0.1 port 41508
May 13 13:00:06.657807 sshd-session[6812]: pam_unix(sshd:session): session closed for user core
May 13 13:00:06.662175 systemd[1]: sshd@28-10.0.0.121:22-10.0.0.1:41508.service: Deactivated successfully.
May 13 13:00:06.664241 systemd[1]: session-29.scope: Deactivated successfully.
May 13 13:00:06.665041 systemd-logind[1578]: Session 29 logged out. Waiting for processes to exit.
May 13 13:00:06.666484 systemd-logind[1578]: Removed session 29.
May 13 13:00:07.739881 containerd[1594]: time="2025-05-13T13:00:07.739833690Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63cb458312265182c0fcd04e2846b338593b86c5c19625d74eea4b58b6c5cc2f\" id:\"8f703dee176ce36588508cd8b393692fd61a36593e1c4aee2506440383466e3a\" pid:6841 exited_at:{seconds:1747141207 nanos:739658799}"
May 13 13:00:11.676045 systemd[1]: Started sshd@29-10.0.0.121:22-10.0.0.1:57776.service - OpenSSH per-connection server daemon (10.0.0.1:57776).
May 13 13:00:11.729872 sshd[6853]: Accepted publickey for core from 10.0.0.1 port 57776 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo
May 13 13:00:11.731354 sshd-session[6853]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 13:00:11.735476 systemd-logind[1578]: New session 30 of user core.
May 13 13:00:11.743421 systemd[1]: Started session-30.scope - Session 30 of User core.
May 13 13:00:11.856049 sshd[6855]: Connection closed by 10.0.0.1 port 57776
May 13 13:00:11.856746 sshd-session[6853]: pam_unix(sshd:session): session closed for user core
May 13 13:00:11.861800 systemd[1]: sshd@29-10.0.0.121:22-10.0.0.1:57776.service: Deactivated successfully.
May 13 13:00:11.863731 systemd[1]: session-30.scope: Deactivated successfully.
May 13 13:00:11.864563 systemd-logind[1578]: Session 30 logged out. Waiting for processes to exit.
May 13 13:00:11.866147 systemd-logind[1578]: Removed session 30.
May 13 13:00:13.543330 containerd[1594]: time="2025-05-13T13:00:13.543274567Z" level=info msg="StopPodSandbox for \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\""
May 13 13:00:13.612509 containerd[1594]: 2025-05-13 13:00:13.577 [WARNING][6886] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0"
May 13 13:00:13.612509 containerd[1594]: 2025-05-13 13:00:13.577 [INFO][6886] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33"
May 13 13:00:13.612509 containerd[1594]: 2025-05-13 13:00:13.577 [INFO][6886] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" iface="eth0" netns=""
May 13 13:00:13.612509 containerd[1594]: 2025-05-13 13:00:13.577 [INFO][6886] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33"
May 13 13:00:13.612509 containerd[1594]: 2025-05-13 13:00:13.577 [INFO][6886] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33"
May 13 13:00:13.612509 containerd[1594]: 2025-05-13 13:00:13.600 [INFO][6896] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" HandleID="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Workload="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0"
May 13 13:00:13.612509 containerd[1594]: 2025-05-13 13:00:13.600 [INFO][6896] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 13:00:13.612509 containerd[1594]: 2025-05-13 13:00:13.600 [INFO][6896] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 13:00:13.612509 containerd[1594]: 2025-05-13 13:00:13.606 [WARNING][6896] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" HandleID="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Workload="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0"
May 13 13:00:13.612509 containerd[1594]: 2025-05-13 13:00:13.606 [INFO][6896] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" HandleID="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Workload="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0"
May 13 13:00:13.612509 containerd[1594]: 2025-05-13 13:00:13.607 [INFO][6896] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 13:00:13.612509 containerd[1594]: 2025-05-13 13:00:13.609 [INFO][6886] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33"
May 13 13:00:13.612979 containerd[1594]: time="2025-05-13T13:00:13.612549046Z" level=info msg="TearDown network for sandbox \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\" successfully"
May 13 13:00:13.612979 containerd[1594]: time="2025-05-13T13:00:13.612573914Z" level=info msg="StopPodSandbox for \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\" returns successfully"
May 13 13:00:13.613442 containerd[1594]: time="2025-05-13T13:00:13.613404657Z" level=info msg="RemovePodSandbox for \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\""
May 13 13:00:13.617106 containerd[1594]: time="2025-05-13T13:00:13.617080899Z" level=info msg="Forcibly stopping sandbox \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\""
May 13 13:00:13.713572 containerd[1594]: 2025-05-13 13:00:13.657 [WARNING][6920] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0"
May 13 13:00:13.713572 containerd[1594]: 2025-05-13 13:00:13.658 [INFO][6920] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33"
May 13 13:00:13.713572 containerd[1594]: 2025-05-13 13:00:13.658 [INFO][6920] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" iface="eth0" netns=""
May 13 13:00:13.713572 containerd[1594]: 2025-05-13 13:00:13.660 [INFO][6920] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33"
May 13 13:00:13.713572 containerd[1594]: 2025-05-13 13:00:13.660 [INFO][6920] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33"
May 13 13:00:13.713572 containerd[1594]: 2025-05-13 13:00:13.703 [INFO][6929] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" HandleID="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Workload="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0"
May 13 13:00:13.713572 containerd[1594]: 2025-05-13 13:00:13.704 [INFO][6929] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 13:00:13.713572 containerd[1594]: 2025-05-13 13:00:13.704 [INFO][6929] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 13:00:13.713572 containerd[1594]: 2025-05-13 13:00:13.708 [WARNING][6929] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" HandleID="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Workload="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0"
May 13 13:00:13.713572 containerd[1594]: 2025-05-13 13:00:13.708 [INFO][6929] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" HandleID="k8s-pod-network.8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33" Workload="localhost-k8s-calico--apiserver--5d4cb57867--c2wmg-eth0"
May 13 13:00:13.713572 containerd[1594]: 2025-05-13 13:00:13.709 [INFO][6929] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 13:00:13.713572 containerd[1594]: 2025-05-13 13:00:13.711 [INFO][6920] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33"
May 13 13:00:13.713938 containerd[1594]: time="2025-05-13T13:00:13.713609252Z" level=info msg="TearDown network for sandbox \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\" successfully"
May 13 13:00:13.718637 containerd[1594]: time="2025-05-13T13:00:13.718583993Z" level=info msg="Ensure that sandbox 8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33 in task-service has been cleanup successfully"
May 13 13:00:13.987586 containerd[1594]: time="2025-05-13T13:00:13.987428503Z" level=info msg="RemovePodSandbox \"8892c54f99351d72c0416a9f14ffb9dd087fac4f0d4dc6cd6c9135032c4eba33\" returns successfully"
May 13 13:00:13.988134 containerd[1594]: time="2025-05-13T13:00:13.988047866Z" level=info msg="StopPodSandbox for \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\""
May 13 13:00:13.988329 containerd[1594]: time="2025-05-13T13:00:13.988307327Z" level=info msg="TearDown network for sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" successfully"
May 13 13:00:13.988329 containerd[1594]: time="2025-05-13T13:00:13.988325221Z" level=info msg="StopPodSandbox for \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" returns successfully"
May 13 13:00:13.988841 containerd[1594]: time="2025-05-13T13:00:13.988807094Z" level=info msg="RemovePodSandbox for \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\""
May 13 13:00:13.988890 containerd[1594]: time="2025-05-13T13:00:13.988851427Z" level=info msg="Forcibly stopping sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\""
May 13 13:00:13.988990 containerd[1594]: time="2025-05-13T13:00:13.988971334Z" level=info msg="TearDown network for sandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" successfully"
May 13 13:00:13.991054 containerd[1594]: time="2025-05-13T13:00:13.991003503Z" level=info msg="Ensure that sandbox b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe in task-service has been cleanup successfully"
May 13 13:00:14.158019 containerd[1594]: time="2025-05-13T13:00:14.157967472Z" level=info msg="RemovePodSandbox \"b8c41c920a33c3f0dcaef0a6bf0827419b2f07627354a3234001aa2f51a766fe\" returns successfully"
May 13 13:00:14.158567 containerd[1594]: time="2025-05-13T13:00:14.158534005Z" level=info msg="StopPodSandbox for \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\""
May 13 13:00:14.228278 containerd[1594]: 2025-05-13 13:00:14.193 [WARNING][6954] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0"
May 13 13:00:14.228278 containerd[1594]: 2025-05-13 13:00:14.193 [INFO][6954] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5"
May 13 13:00:14.228278 containerd[1594]: 2025-05-13 13:00:14.193 [INFO][6954] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" iface="eth0" netns=""
May 13 13:00:14.228278 containerd[1594]: 2025-05-13 13:00:14.193 [INFO][6954] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5"
May 13 13:00:14.228278 containerd[1594]: 2025-05-13 13:00:14.193 [INFO][6954] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5"
May 13 13:00:14.228278 containerd[1594]: 2025-05-13 13:00:14.213 [INFO][6963] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" HandleID="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Workload="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0"
May 13 13:00:14.228278 containerd[1594]: 2025-05-13 13:00:14.214 [INFO][6963] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 13:00:14.228278 containerd[1594]: 2025-05-13 13:00:14.214 [INFO][6963] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 13:00:14.228278 containerd[1594]: 2025-05-13 13:00:14.222 [WARNING][6963] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" HandleID="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Workload="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0"
May 13 13:00:14.228278 containerd[1594]: 2025-05-13 13:00:14.222 [INFO][6963] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" HandleID="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Workload="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0"
May 13 13:00:14.228278 containerd[1594]: 2025-05-13 13:00:14.224 [INFO][6963] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 13:00:14.228278 containerd[1594]: 2025-05-13 13:00:14.226 [INFO][6954] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5"
May 13 13:00:14.228768 containerd[1594]: time="2025-05-13T13:00:14.228315561Z" level=info msg="TearDown network for sandbox \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\" successfully"
May 13 13:00:14.228768 containerd[1594]: time="2025-05-13T13:00:14.228342312Z" level=info msg="StopPodSandbox for \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\" returns successfully"
May 13 13:00:14.228817 containerd[1594]: time="2025-05-13T13:00:14.228801702Z" level=info msg="RemovePodSandbox for \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\""
May 13 13:00:14.228841 containerd[1594]: time="2025-05-13T13:00:14.228827010Z" level=info msg="Forcibly stopping sandbox \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\""
May 13 13:00:14.295235 containerd[1594]: 2025-05-13 13:00:14.262 [WARNING][6985] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0"
May 13 13:00:14.295235 containerd[1594]: 2025-05-13 13:00:14.263 [INFO][6985] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5"
May 13 13:00:14.295235 containerd[1594]: 2025-05-13 13:00:14.263 [INFO][6985] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" iface="eth0" netns=""
May 13 13:00:14.295235 containerd[1594]: 2025-05-13 13:00:14.263 [INFO][6985] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5"
May 13 13:00:14.295235 containerd[1594]: 2025-05-13 13:00:14.263 [INFO][6985] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5"
May 13 13:00:14.295235 containerd[1594]: 2025-05-13 13:00:14.284 [INFO][6993] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" HandleID="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Workload="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0"
May 13 13:00:14.295235 containerd[1594]: 2025-05-13 13:00:14.284 [INFO][6993] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 13 13:00:14.295235 containerd[1594]: 2025-05-13 13:00:14.284 [INFO][6993] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 13 13:00:14.295235 containerd[1594]: 2025-05-13 13:00:14.289 [WARNING][6993] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" HandleID="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Workload="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0"
May 13 13:00:14.295235 containerd[1594]: 2025-05-13 13:00:14.289 [INFO][6993] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" HandleID="k8s-pod-network.97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5" Workload="localhost-k8s-calico--apiserver--5d4cb57867--2r646-eth0"
May 13 13:00:14.295235 containerd[1594]: 2025-05-13 13:00:14.290 [INFO][6993] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 13 13:00:14.295235 containerd[1594]: 2025-05-13 13:00:14.293 [INFO][6985] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5"
May 13 13:00:14.295644 containerd[1594]: time="2025-05-13T13:00:14.295268041Z" level=info msg="TearDown network for sandbox \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\" successfully"
May 13 13:00:14.297176 containerd[1594]: time="2025-05-13T13:00:14.297144764Z" level=info msg="Ensure that sandbox 97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5 in task-service has been cleanup successfully"
May 13 13:00:14.447609 containerd[1594]: time="2025-05-13T13:00:14.447555949Z" level=info msg="RemovePodSandbox \"97be5a05ab38fb895740effd30e25f57224b7ff0b7c01e50585600bce18328d5\" returns successfully"
May 13 13:00:16.875133 systemd[1]: Started sshd@30-10.0.0.121:22-10.0.0.1:57786.service - OpenSSH per-connection server daemon (10.0.0.1:57786).
May 13 13:00:16.923532 sshd[7007]: Accepted publickey for core from 10.0.0.1 port 57786 ssh2: RSA SHA256:KkL3F8epEKDzqF4GUDsi0vRmecGudNCTOWUWlTFD3Yo
May 13 13:00:16.924922 sshd-session[7007]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 13:00:16.928895 systemd-logind[1578]: New session 31 of user core.
May 13 13:00:16.937377 systemd[1]: Started session-31.scope - Session 31 of User core.
May 13 13:00:17.217774 sshd[7009]: Connection closed by 10.0.0.1 port 57786
May 13 13:00:17.218115 sshd-session[7007]: pam_unix(sshd:session): session closed for user core
May 13 13:00:17.222320 systemd[1]: sshd@30-10.0.0.121:22-10.0.0.1:57786.service: Deactivated successfully.
May 13 13:00:17.224078 systemd[1]: session-31.scope: Deactivated successfully.
May 13 13:00:17.224835 systemd-logind[1578]: Session 31 logged out. Waiting for processes to exit.
May 13 13:00:17.225978 systemd-logind[1578]: Removed session 31.
May 13 13:00:18.050618 containerd[1594]: time="2025-05-13T13:00:18.050566578Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63cb458312265182c0fcd04e2846b338593b86c5c19625d74eea4b58b6c5cc2f\" id:\"28d046561bda9dbe98da2a95801b46d39110f4e20ea0bda25488234bd6510664\" pid:7033 exited_at:{seconds:1747141218 nanos:50368223}"