May 17 10:32:47.816462 kernel: Linux version 6.12.20-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Sat May 17 08:43:17 -00 2025
May 17 10:32:47.816482 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=7c133b701f4651573a60c1236067561af59c5f220e6e069d5bcb75ac157263bd
May 17 10:32:47.816493 kernel: BIOS-provided physical RAM map:
May 17 10:32:47.816500 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
May 17 10:32:47.816506 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
May 17 10:32:47.816513 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
May 17 10:32:47.816520 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
May 17 10:32:47.816527 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
May 17 10:32:47.816535 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
May 17 10:32:47.816542 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
May 17 10:32:47.816548 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009bd3efff] usable
May 17 10:32:47.816555 kernel: BIOS-e820: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
May 17 10:32:47.816561 kernel: BIOS-e820: [mem 0x000000009be00000-0x000000009c8ecfff] usable
May 17 10:32:47.816568 kernel: BIOS-e820: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
May 17 10:32:47.816578 kernel: BIOS-e820: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
May 17 10:32:47.816585 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
May 17 10:32:47.816592 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009ce90fff] usable
May 17 10:32:47.816598 kernel: BIOS-e820: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
May 17 10:32:47.816605 kernel: BIOS-e820: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
May 17 10:32:47.816612 kernel: BIOS-e820: [mem 0x000000009ce97000-0x000000009cedbfff] usable
May 17 10:32:47.816619 kernel: BIOS-e820: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
May 17 10:32:47.816626 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
May 17 10:32:47.816646 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
May 17 10:32:47.816653 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 17 10:32:47.816659 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
May 17 10:32:47.816669 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 17 10:32:47.816676 kernel: NX (Execute Disable) protection: active
May 17 10:32:47.816682 kernel: APIC: Static calls initialized
May 17 10:32:47.816689 kernel: e820: update [mem 0x9b320018-0x9b329c57] usable ==> usable
May 17 10:32:47.816696 kernel: e820: update [mem 0x9b2e3018-0x9b31fe57] usable ==> usable
May 17 10:32:47.816703 kernel: extended physical RAM map:
May 17 10:32:47.816710 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
May 17 10:32:47.816717 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
May 17 10:32:47.816724 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
May 17 10:32:47.816731 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
May 17 10:32:47.816738 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
May 17 10:32:47.816747 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
May 17 10:32:47.816754 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
May 17 10:32:47.816761 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b2e3017] usable
May 17 10:32:47.816768 kernel: reserve setup_data: [mem 0x000000009b2e3018-0x000000009b31fe57] usable
May 17 10:32:47.816778 kernel: reserve setup_data: [mem 0x000000009b31fe58-0x000000009b320017] usable
May 17 10:32:47.816785 kernel: reserve setup_data: [mem 0x000000009b320018-0x000000009b329c57] usable
May 17 10:32:47.816794 kernel: reserve setup_data: [mem 0x000000009b329c58-0x000000009bd3efff] usable
May 17 10:32:47.816802 kernel: reserve setup_data: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
May 17 10:32:47.816809 kernel: reserve setup_data: [mem 0x000000009be00000-0x000000009c8ecfff] usable
May 17 10:32:47.816816 kernel: reserve setup_data: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
May 17 10:32:47.816823 kernel: reserve setup_data: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
May 17 10:32:47.816830 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
May 17 10:32:47.816837 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009ce90fff] usable
May 17 10:32:47.816844 kernel: reserve setup_data: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
May 17 10:32:47.816852 kernel: reserve setup_data: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
May 17 10:32:47.816861 kernel: reserve setup_data: [mem 0x000000009ce97000-0x000000009cedbfff] usable
May 17 10:32:47.816868 kernel: reserve setup_data: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
May 17 10:32:47.816875 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
May 17 10:32:47.816882 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
May 17 10:32:47.816889 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 17 10:32:47.816896 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
May 17 10:32:47.816904 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 17 10:32:47.816911 kernel: efi: EFI v2.7 by EDK II
May 17 10:32:47.816918 kernel: efi: SMBIOS=0x9c988000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9b9e4198 RNG=0x9cb73018
May 17 10:32:47.816925 kernel: random: crng init done
May 17 10:32:47.816932 kernel: efi: Remove mem149: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
May 17 10:32:47.816940 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
May 17 10:32:47.816949 kernel: secureboot: Secure boot disabled
May 17 10:32:47.816956 kernel: SMBIOS 2.8 present.
May 17 10:32:47.816963 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
May 17 10:32:47.816970 kernel: DMI: Memory slots populated: 1/1
May 17 10:32:47.816977 kernel: Hypervisor detected: KVM
May 17 10:32:47.816984 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
May 17 10:32:47.816991 kernel: kvm-clock: using sched offset of 3527098729 cycles
May 17 10:32:47.816999 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
May 17 10:32:47.817006 kernel: tsc: Detected 2794.746 MHz processor
May 17 10:32:47.817014 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 17 10:32:47.817021 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 17 10:32:47.817031 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x400000000
May 17 10:32:47.817038 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
May 17 10:32:47.817046 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 17 10:32:47.817053 kernel: Using GB pages for direct mapping
May 17 10:32:47.817060 kernel: ACPI: Early table checksum verification disabled
May 17 10:32:47.817068 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS )
May 17 10:32:47.817075 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
May 17 10:32:47.817083 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 17 10:32:47.817090 kernel: ACPI: DSDT 0x000000009CB7A000 0021A8 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 17 10:32:47.817100 kernel: ACPI: FACS 0x000000009CBDD000 000040
May 17 10:32:47.817107 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 17 10:32:47.817115 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 17 10:32:47.817122 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 17 10:32:47.817129 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 17 10:32:47.817137 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013)
May 17 10:32:47.817144 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3]
May 17 10:32:47.817151 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cb7a000-0x9cb7c1a7]
May 17 10:32:47.817161 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f]
May 17 10:32:47.817168 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f]
May 17 10:32:47.817175 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037]
May 17 10:32:47.817190 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b]
May 17 10:32:47.817197 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027]
May 17 10:32:47.817205 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037]
May 17 10:32:47.817212 kernel: No NUMA configuration found
May 17 10:32:47.817219 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cedbfff]
May 17 10:32:47.817226 kernel: NODE_DATA(0) allocated [mem 0x9ce36dc0-0x9ce3dfff]
May 17 10:32:47.817234 kernel: Zone ranges:
May 17 10:32:47.817244 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 17 10:32:47.817251 kernel: DMA32 [mem 0x0000000001000000-0x000000009cedbfff]
May 17 10:32:47.817258 kernel: Normal empty
May 17 10:32:47.817265 kernel: Device empty
May 17 10:32:47.817273 kernel: Movable zone start for each node
May 17 10:32:47.817280 kernel: Early memory node ranges
May 17 10:32:47.817287 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
May 17 10:32:47.817294 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
May 17 10:32:47.817301 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
May 17 10:32:47.817311 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
May 17 10:32:47.817318 kernel: node 0: [mem 0x0000000000900000-0x000000009bd3efff]
May 17 10:32:47.817326 kernel: node 0: [mem 0x000000009be00000-0x000000009c8ecfff]
May 17 10:32:47.817333 kernel: node 0: [mem 0x000000009cbff000-0x000000009ce90fff]
May 17 10:32:47.817340 kernel: node 0: [mem 0x000000009ce97000-0x000000009cedbfff]
May 17 10:32:47.817347 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cedbfff]
May 17 10:32:47.817355 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 17 10:32:47.817372 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
May 17 10:32:47.817389 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
May 17 10:32:47.817412 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 17 10:32:47.817420 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
May 17 10:32:47.817427 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
May 17 10:32:47.817435 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
May 17 10:32:47.817445 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
May 17 10:32:47.817452 kernel: On node 0, zone DMA32: 12580 pages in unavailable ranges
May 17 10:32:47.817467 kernel: ACPI: PM-Timer IO Port: 0x608
May 17 10:32:47.817483 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
May 17 10:32:47.817498 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
May 17 10:32:47.817516 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
May 17 10:32:47.817531 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
May 17 10:32:47.817547 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
May 17 10:32:47.817563 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
May 17 10:32:47.817578 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
May 17 10:32:47.817586 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 17 10:32:47.817593 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
May 17 10:32:47.817601 kernel: TSC deadline timer available
May 17 10:32:47.817609 kernel: CPU topo: Max. logical packages: 1
May 17 10:32:47.817619 kernel: CPU topo: Max. logical dies: 1
May 17 10:32:47.817626 kernel: CPU topo: Max. dies per package: 1
May 17 10:32:47.817647 kernel: CPU topo: Max. threads per core: 1
May 17 10:32:47.817655 kernel: CPU topo: Num. cores per package: 4
May 17 10:32:47.817663 kernel: CPU topo: Num. threads per package: 4
May 17 10:32:47.817670 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
May 17 10:32:47.817678 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
May 17 10:32:47.817685 kernel: kvm-guest: KVM setup pv remote TLB flush
May 17 10:32:47.817693 kernel: kvm-guest: setup PV sched yield
May 17 10:32:47.817703 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices
May 17 10:32:47.817711 kernel: Booting paravirtualized kernel on KVM
May 17 10:32:47.817719 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 17 10:32:47.817727 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
May 17 10:32:47.817734 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288
May 17 10:32:47.817742 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152
May 17 10:32:47.817750 kernel: pcpu-alloc: [0] 0 1 2 3
May 17 10:32:47.817757 kernel: kvm-guest: PV spinlocks enabled
May 17 10:32:47.817765 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
May 17 10:32:47.817775 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=7c133b701f4651573a60c1236067561af59c5f220e6e069d5bcb75ac157263bd
May 17 10:32:47.817784 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 17 10:32:47.817791 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 17 10:32:47.817799 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 17 10:32:47.817807 kernel: Fallback order for Node 0: 0
May 17 10:32:47.817814 kernel: Built 1 zonelists, mobility grouping on. Total pages: 641450
May 17 10:32:47.817822 kernel: Policy zone: DMA32
May 17 10:32:47.817830 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 17 10:32:47.817839 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
May 17 10:32:47.817847 kernel: ftrace: allocating 40065 entries in 157 pages
May 17 10:32:47.817855 kernel: ftrace: allocated 157 pages with 5 groups
May 17 10:32:47.817862 kernel: Dynamic Preempt: voluntary
May 17 10:32:47.817870 kernel: rcu: Preemptible hierarchical RCU implementation.
May 17 10:32:47.817878 kernel: rcu: RCU event tracing is enabled.
May 17 10:32:47.817886 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
May 17 10:32:47.817894 kernel: Trampoline variant of Tasks RCU enabled.
May 17 10:32:47.817901 kernel: Rude variant of Tasks RCU enabled.
May 17 10:32:47.817911 kernel: Tracing variant of Tasks RCU enabled.
May 17 10:32:47.817919 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 17 10:32:47.817927 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
May 17 10:32:47.817934 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 17 10:32:47.817942 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 17 10:32:47.817950 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 17 10:32:47.817958 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
May 17 10:32:47.817965 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 17 10:32:47.817973 kernel: Console: colour dummy device 80x25
May 17 10:32:47.817983 kernel: printk: legacy console [ttyS0] enabled
May 17 10:32:47.817990 kernel: ACPI: Core revision 20240827
May 17 10:32:47.817998 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
May 17 10:32:47.818006 kernel: APIC: Switch to symmetric I/O mode setup
May 17 10:32:47.818013 kernel: x2apic enabled
May 17 10:32:47.818021 kernel: APIC: Switched APIC routing to: physical x2apic
May 17 10:32:47.818029 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
May 17 10:32:47.818036 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
May 17 10:32:47.818044 kernel: kvm-guest: setup PV IPIs
May 17 10:32:47.818054 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
May 17 10:32:47.818062 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848ddd4e75, max_idle_ns: 440795346320 ns
May 17 10:32:47.818070 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794746)
May 17 10:32:47.818078 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
May 17 10:32:47.818085 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
May 17 10:32:47.818093 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
May 17 10:32:47.818101 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 17 10:32:47.818108 kernel: Spectre V2 : Mitigation: Retpolines
May 17 10:32:47.818116 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
May 17 10:32:47.818126 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
May 17 10:32:47.818133 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
May 17 10:32:47.818141 kernel: RETBleed: Mitigation: untrained return thunk
May 17 10:32:47.818149 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
May 17 10:32:47.818156 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
May 17 10:32:47.818164 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
May 17 10:32:47.818172 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
May 17 10:32:47.818187 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
May 17 10:32:47.818197 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
May 17 10:32:47.818206 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
May 17 10:32:47.818213 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
May 17 10:32:47.818221 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
May 17 10:32:47.818229 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
May 17 10:32:47.818236 kernel: Freeing SMP alternatives memory: 32K
May 17 10:32:47.818244 kernel: pid_max: default: 32768 minimum: 301
May 17 10:32:47.818251 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
May 17 10:32:47.818259 kernel: landlock: Up and running.
May 17 10:32:47.818269 kernel: SELinux: Initializing.
May 17 10:32:47.818276 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 17 10:32:47.818284 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 17 10:32:47.818292 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
May 17 10:32:47.818300 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
May 17 10:32:47.818307 kernel: ... version: 0
May 17 10:32:47.818315 kernel: ... bit width: 48
May 17 10:32:47.818322 kernel: ... generic registers: 6
May 17 10:32:47.818330 kernel: ... value mask: 0000ffffffffffff
May 17 10:32:47.818339 kernel: ... max period: 00007fffffffffff
May 17 10:32:47.818347 kernel: ... fixed-purpose events: 0
May 17 10:32:47.818354 kernel: ... event mask: 000000000000003f
May 17 10:32:47.818362 kernel: signal: max sigframe size: 1776
May 17 10:32:47.818369 kernel: rcu: Hierarchical SRCU implementation.
May 17 10:32:47.818377 kernel: rcu: Max phase no-delay instances is 400.
May 17 10:32:47.818385 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
May 17 10:32:47.818393 kernel: smp: Bringing up secondary CPUs ...
May 17 10:32:47.818400 kernel: smpboot: x86: Booting SMP configuration:
May 17 10:32:47.818408 kernel: .... node #0, CPUs: #1 #2 #3
May 17 10:32:47.818417 kernel: smp: Brought up 1 node, 4 CPUs
May 17 10:32:47.818425 kernel: smpboot: Total of 4 processors activated (22357.96 BogoMIPS)
May 17 10:32:47.818433 kernel: Memory: 2422668K/2565800K available (14336K kernel code, 2438K rwdata, 9944K rodata, 54428K init, 2532K bss, 137196K reserved, 0K cma-reserved)
May 17 10:32:47.818441 kernel: devtmpfs: initialized
May 17 10:32:47.818448 kernel: x86/mm: Memory block size: 128MB
May 17 10:32:47.818456 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
May 17 10:32:47.818464 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
May 17 10:32:47.818472 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
May 17 10:32:47.818479 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes)
May 17 10:32:47.818490 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9ce95000-0x9ce96fff] (8192 bytes)
May 17 10:32:47.818497 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes)
May 17 10:32:47.818505 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 17 10:32:47.818513 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
May 17 10:32:47.818521 kernel: pinctrl core: initialized pinctrl subsystem
May 17 10:32:47.818528 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 17 10:32:47.818536 kernel: audit: initializing netlink subsys (disabled)
May 17 10:32:47.818544 kernel: audit: type=2000 audit(1747477966.335:1): state=initialized audit_enabled=0 res=1
May 17 10:32:47.818561 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 17 10:32:47.818577 kernel: thermal_sys: Registered thermal governor 'user_space'
May 17 10:32:47.818593 kernel: cpuidle: using governor menu
May 17 10:32:47.818609 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 17 10:32:47.818621 kernel: dca service started, version 1.12.1
May 17 10:32:47.818640 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
May 17 10:32:47.818648 kernel: PCI: Using configuration type 1 for base access
May 17 10:32:47.818655 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 17 10:32:47.818663 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 17 10:32:47.818673 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
May 17 10:32:47.818681 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 17 10:32:47.818689 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 17 10:32:47.818696 kernel: ACPI: Added _OSI(Module Device)
May 17 10:32:47.818704 kernel: ACPI: Added _OSI(Processor Device)
May 17 10:32:47.818711 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 17 10:32:47.818719 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 17 10:32:47.818726 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 17 10:32:47.818734 kernel: ACPI: Interpreter enabled
May 17 10:32:47.818743 kernel: ACPI: PM: (supports S0 S3 S5)
May 17 10:32:47.818751 kernel: ACPI: Using IOAPIC for interrupt routing
May 17 10:32:47.818759 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 17 10:32:47.818766 kernel: PCI: Using E820 reservations for host bridge windows
May 17 10:32:47.818774 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
May 17 10:32:47.818782 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
May 17 10:32:47.818982 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 17 10:32:47.819103 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
May 17 10:32:47.819232 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
May 17 10:32:47.819243 kernel: PCI host bridge to bus 0000:00
May 17 10:32:47.819361 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
May 17 10:32:47.819466 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
May 17 10:32:47.819574 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
May 17 10:32:47.819696 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window]
May 17 10:32:47.819813 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
May 17 10:32:47.819924 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window]
May 17 10:32:47.820028 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 17 10:32:47.820160 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
May 17 10:32:47.820294 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
May 17 10:32:47.820410 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref]
May 17 10:32:47.820524 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff]
May 17 10:32:47.820656 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
May 17 10:32:47.820775 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
May 17 10:32:47.820926 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
May 17 10:32:47.821053 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f]
May 17 10:32:47.821168 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff]
May 17 10:32:47.821294 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref]
May 17 10:32:47.821425 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
May 17 10:32:47.821546 kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f]
May 17 10:32:47.821721 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff]
May 17 10:32:47.821861 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref]
May 17 10:32:47.821993 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
May 17 10:32:47.822110 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff]
May 17 10:32:47.822236 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff]
May 17 10:32:47.822353 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref]
May 17 10:32:47.822472 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]
May 17 10:32:47.822595 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
May 17 10:32:47.822727 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
May 17 10:32:47.822852 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
May 17 10:32:47.822968 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df]
May 17 10:32:47.823082 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff]
May 17 10:32:47.823214 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
May 17 10:32:47.823334 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf]
May 17 10:32:47.823345 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
May 17 10:32:47.823353 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
May 17 10:32:47.823361 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
May 17 10:32:47.823369 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
May 17 10:32:47.823376 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
May 17 10:32:47.823384 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
May 17 10:32:47.823391 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
May 17 10:32:47.823401 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
May 17 10:32:47.823409 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
May 17 10:32:47.823416 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
May 17 10:32:47.823424 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
May 17 10:32:47.823432 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
May 17 10:32:47.823439 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
May 17 10:32:47.823447 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
May 17 10:32:47.823454 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
May 17 10:32:47.823462 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
May 17 10:32:47.823471 kernel: iommu: Default domain type: Translated
May 17 10:32:47.823479 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 17 10:32:47.823487 kernel: efivars: Registered efivars operations
May 17 10:32:47.823494 kernel: PCI: Using ACPI for IRQ routing
May 17 10:32:47.823502 kernel: PCI: pci_cache_line_size set to 64 bytes
May 17 10:32:47.823510 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff]
May 17 10:32:47.823517 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff]
May 17 10:32:47.823525 kernel: e820: reserve RAM buffer [mem 0x9b2e3018-0x9bffffff]
May 17 10:32:47.823532 kernel: e820: reserve RAM buffer [mem 0x9b320018-0x9bffffff]
May 17 10:32:47.823542 kernel: e820: reserve RAM buffer [mem 0x9bd3f000-0x9bffffff]
May 17 10:32:47.823549 kernel: e820: reserve RAM buffer [mem 0x9c8ed000-0x9fffffff]
May 17 10:32:47.823557 kernel: e820: reserve RAM buffer [mem 0x9ce91000-0x9fffffff]
May 17 10:32:47.823565 kernel: e820: reserve RAM buffer [mem 0x9cedc000-0x9fffffff]
May 17 10:32:47.823720 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
May 17 10:32:47.823857 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
May 17 10:32:47.823973 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
May 17 10:32:47.823988 kernel: vgaarb: loaded
May 17 10:32:47.823996 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
May 17 10:32:47.824004 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
May 17 10:32:47.824011 kernel: clocksource: Switched to clocksource kvm-clock
May 17 10:32:47.824019 kernel: VFS: Disk quotas dquot_6.6.0
May 17 10:32:47.824027 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 17 10:32:47.824034 kernel: pnp: PnP ACPI init
May 17 10:32:47.824172 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved
May 17 10:32:47.824197 kernel: pnp: PnP ACPI: found 6 devices
May 17 10:32:47.824207 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 17 10:32:47.824215 kernel: NET: Registered PF_INET protocol family
May 17 10:32:47.824225 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
May 17 10:32:47.824236 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
May 17 10:32:47.824245 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 17 10:32:47.824255 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 17 10:32:47.824264 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
May 17 10:32:47.824273 kernel: TCP: Hash tables configured (established 32768 bind 32768)
May 17 10:32:47.824285 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 17 10:32:47.824294 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 17 10:32:47.824303 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 17 10:32:47.824313 kernel: NET: Registered PF_XDP protocol family
May 17 10:32:47.824439 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window
May 17 10:32:47.824561 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned
May 17 10:32:47.824707 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
May 17 10:32:47.824820 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
May 17 10:32:47.824934 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
May 17 10:32:47.825044 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window]
May 17 10:32:47.825154 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window]
May 17 10:32:47.825276 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window]
May 17 10:32:47.825288 kernel: PCI: CLS 0 bytes, default 64
May 17 10:32:47.825298 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848ddd4e75, max_idle_ns: 440795346320 ns
May 17 10:32:47.825308 kernel: Initialise system trusted keyrings
May 17 10:32:47.825321 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
May 17 10:32:47.825330 kernel: Key type asymmetric registered
May 17 10:32:47.825339 kernel: Asymmetric key parser 'x509' registered
May 17 10:32:47.825349 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
May 17 10:32:47.825358 kernel: io scheduler mq-deadline registered
May 17 10:32:47.825367 kernel: io scheduler kyber registered
May 17 10:32:47.825376 kernel: io scheduler bfq registered
May 17 10:32:47.825386 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
May 17 10:32:47.825398 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
May 17 10:32:47.825407 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
May 17 10:32:47.825417 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
May 17 10:32:47.825426 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 17 10:32:47.825436 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
May 17 10:32:47.825445 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
May 17 10:32:47.825455 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
May 17 10:32:47.825467 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
May 17 10:32:47.825590 kernel: rtc_cmos 00:04: RTC can wake from S4
May 17 10:32:47.825606 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
May 17 10:32:47.825740 kernel: rtc_cmos 00:04: registered as rtc0
May 17 10:32:47.825851 kernel: rtc_cmos 00:04: setting system clock to 2025-05-17T10:32:47 UTC (1747477967)
May 17 10:32:47.825961 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
May 17 10:32:47.825972 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
May 17 10:32:47.825979 kernel: efifb: probing for efifb
May 17 10:32:47.825988 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k
May 17 10:32:47.825999 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1
May 17 10:32:47.826007 kernel: efifb: scrolling: redraw
May 17 10:32:47.826015 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
May 17 10:32:47.826023 kernel: Console: switching to colour frame buffer device 160x50
May 17 10:32:47.826031 kernel: fb0: EFI VGA frame buffer device
May 17 10:32:47.826039 kernel: pstore: Using crash dump compression: deflate
May 17 10:32:47.826047 kernel: pstore: Registered efi_pstore as persistent store backend
May 17 10:32:47.826055 kernel: NET: Registered PF_INET6 protocol family
May 17 10:32:47.826063 kernel: Segment Routing with IPv6
May 17 10:32:47.826070 kernel: In-situ OAM (IOAM) with IPv6
May 17 10:32:47.826080 kernel: NET: Registered PF_PACKET protocol family
May 17 10:32:47.826088 kernel: Key type dns_resolver registered
May 17 10:32:47.826096 kernel: IPI shorthand broadcast: enabled
May 17 10:32:47.826104 kernel: sched_clock: Marking stable (2745002141, 158331626)->(2918210012, -14876245)
May 17 10:32:47.826112 kernel: registered taskstats version 1
May 17 10:32:47.826120 kernel: Loading compiled-in X.509 certificates
May 17 10:32:47.826128 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.20-flatcar: 2cc2dc7fee657ed10992c84454f152cb3c880646'
May 17 10:32:47.826136 kernel: Demotion targets for Node 0: null
May 17 10:32:47.826144 kernel: Key
type .fscrypt registered May 17 10:32:47.826153 kernel: Key type fscrypt-provisioning registered May 17 10:32:47.826162 kernel: ima: No TPM chip found, activating TPM-bypass! May 17 10:32:47.826170 kernel: ima: Allocated hash algorithm: sha1 May 17 10:32:47.826177 kernel: ima: No architecture policies found May 17 10:32:47.826195 kernel: clk: Disabling unused clocks May 17 10:32:47.826203 kernel: Warning: unable to open an initial console. May 17 10:32:47.826211 kernel: Freeing unused kernel image (initmem) memory: 54428K May 17 10:32:47.826220 kernel: Write protecting the kernel read-only data: 24576k May 17 10:32:47.826229 kernel: Freeing unused kernel image (rodata/data gap) memory: 296K May 17 10:32:47.826237 kernel: Run /init as init process May 17 10:32:47.826246 kernel: with arguments: May 17 10:32:47.826254 kernel: /init May 17 10:32:47.826261 kernel: with environment: May 17 10:32:47.826269 kernel: HOME=/ May 17 10:32:47.826277 kernel: TERM=linux May 17 10:32:47.826285 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 17 10:32:47.826294 systemd[1]: Successfully made /usr/ read-only. May 17 10:32:47.826307 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 17 10:32:47.826317 systemd[1]: Detected virtualization kvm. May 17 10:32:47.826325 systemd[1]: Detected architecture x86-64. May 17 10:32:47.826333 systemd[1]: Running in initrd. May 17 10:32:47.826342 systemd[1]: No hostname configured, using default hostname. May 17 10:32:47.826351 systemd[1]: Hostname set to . May 17 10:32:47.826359 systemd[1]: Initializing machine ID from VM UUID. May 17 10:32:47.826370 systemd[1]: Queued start job for default target initrd.target. 
May 17 10:32:47.826378 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 17 10:32:47.826387 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 17 10:32:47.826396 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 17 10:32:47.826405 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 17 10:32:47.826413 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 17 10:32:47.826423 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 17 10:32:47.826435 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 17 10:32:47.826443 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 17 10:32:47.826452 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 17 10:32:47.826461 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 17 10:32:47.826469 systemd[1]: Reached target paths.target - Path Units. May 17 10:32:47.826478 systemd[1]: Reached target slices.target - Slice Units. May 17 10:32:47.826487 systemd[1]: Reached target swap.target - Swaps. May 17 10:32:47.826495 systemd[1]: Reached target timers.target - Timer Units. May 17 10:32:47.826503 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 17 10:32:47.826514 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 17 10:32:47.826523 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 17 10:32:47.826531 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. 
May 17 10:32:47.826540 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 17 10:32:47.826549 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 17 10:32:47.826557 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 17 10:32:47.826565 systemd[1]: Reached target sockets.target - Socket Units. May 17 10:32:47.826574 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 17 10:32:47.826584 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 17 10:32:47.826594 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 17 10:32:47.826604 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 17 10:32:47.826615 systemd[1]: Starting systemd-fsck-usr.service... May 17 10:32:47.826624 systemd[1]: Starting systemd-journald.service - Journal Service... May 17 10:32:47.826648 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 17 10:32:47.826657 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 17 10:32:47.826666 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 17 10:32:47.826677 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 17 10:32:47.826686 systemd[1]: Finished systemd-fsck-usr.service. May 17 10:32:47.826713 systemd-journald[220]: Collecting audit messages is disabled. May 17 10:32:47.826737 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 17 10:32:47.826746 systemd-journald[220]: Journal started May 17 10:32:47.826765 systemd-journald[220]: Runtime Journal (/run/log/journal/6f40cbf8286c45bb86c499b7c5da6c5d) is 6M, max 48.5M, 42.4M free. 
May 17 10:32:47.826803 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 17 10:32:47.819760 systemd-modules-load[222]: Inserted module 'overlay' May 17 10:32:47.831723 systemd[1]: Started systemd-journald.service - Journal Service. May 17 10:32:47.835729 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 17 10:32:47.838078 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 17 10:32:47.846763 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 17 10:32:47.848678 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 17 10:32:47.849143 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 17 10:32:47.853236 systemd-modules-load[222]: Inserted module 'br_netfilter' May 17 10:32:47.854233 kernel: Bridge firewalling registered May 17 10:32:47.855362 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 17 10:32:47.858735 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 17 10:32:47.860432 systemd-tmpfiles[238]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 17 10:32:47.860442 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 17 10:32:47.863697 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 17 10:32:47.867834 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 17 10:32:47.870162 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 17 10:32:47.876850 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
May 17 10:32:47.879675 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 17 10:32:47.892322 dracut-cmdline[258]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=7c133b701f4651573a60c1236067561af59c5f220e6e069d5bcb75ac157263bd May 17 10:32:47.930092 systemd-resolved[264]: Positive Trust Anchors: May 17 10:32:47.930110 systemd-resolved[264]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 17 10:32:47.930141 systemd-resolved[264]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 17 10:32:47.932650 systemd-resolved[264]: Defaulting to hostname 'linux'. May 17 10:32:47.933680 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 17 10:32:47.939795 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 17 10:32:47.993665 kernel: SCSI subsystem initialized May 17 10:32:48.002657 kernel: Loading iSCSI transport class v2.0-870. 
May 17 10:32:48.013656 kernel: iscsi: registered transport (tcp) May 17 10:32:48.033974 kernel: iscsi: registered transport (qla4xxx) May 17 10:32:48.034005 kernel: QLogic iSCSI HBA Driver May 17 10:32:48.053529 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 17 10:32:48.075299 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 17 10:32:48.076570 systemd[1]: Reached target network-pre.target - Preparation for Network. May 17 10:32:48.125943 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 17 10:32:48.127587 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 17 10:32:48.190667 kernel: raid6: avx2x4 gen() 30394 MB/s May 17 10:32:48.207655 kernel: raid6: avx2x2 gen() 31066 MB/s May 17 10:32:48.224742 kernel: raid6: avx2x1 gen() 25910 MB/s May 17 10:32:48.224760 kernel: raid6: using algorithm avx2x2 gen() 31066 MB/s May 17 10:32:48.242795 kernel: raid6: .... xor() 19761 MB/s, rmw enabled May 17 10:32:48.242828 kernel: raid6: using avx2x2 recovery algorithm May 17 10:32:48.262660 kernel: xor: automatically using best checksumming function avx May 17 10:32:48.423685 kernel: Btrfs loaded, zoned=no, fsverity=no May 17 10:32:48.431785 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 17 10:32:48.435535 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 17 10:32:48.483459 systemd-udevd[472]: Using default interface naming scheme 'v255'. May 17 10:32:48.492369 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 17 10:32:48.494787 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 17 10:32:48.517759 dracut-pre-trigger[474]: rd.md=0: removing MD RAID activation May 17 10:32:48.547697 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
May 17 10:32:48.551549 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 17 10:32:48.634303 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 17 10:32:48.637359 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 17 10:32:48.671655 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues May 17 10:32:48.680882 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) May 17 10:32:48.681032 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 17 10:32:48.681044 kernel: GPT:9289727 != 19775487 May 17 10:32:48.681053 kernel: GPT:Alternate GPT header not at the end of the disk. May 17 10:32:48.681063 kernel: GPT:9289727 != 19775487 May 17 10:32:48.681072 kernel: GPT: Use GNU Parted to correct GPT errors. May 17 10:32:48.681082 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 17 10:32:48.698653 kernel: libata version 3.00 loaded. May 17 10:32:48.698724 kernel: cryptd: max_cpu_qlen set to 1000 May 17 10:32:48.708137 kernel: ahci 0000:00:1f.2: version 3.0 May 17 10:32:48.733920 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 May 17 10:32:48.733938 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode May 17 10:32:48.734097 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) May 17 10:32:48.734243 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only May 17 10:32:48.734379 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 May 17 10:32:48.734390 kernel: scsi host0: ahci May 17 10:32:48.734581 kernel: scsi host1: ahci May 17 10:32:48.734742 kernel: scsi host2: ahci May 17 10:32:48.734890 kernel: scsi host3: ahci May 17 10:32:48.735030 kernel: scsi host4: ahci May 17 10:32:48.735170 kernel: scsi host5: ahci May 17 10:32:48.735308 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 0 May 17 10:32:48.735323 kernel: 
ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 0 May 17 10:32:48.735333 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 0 May 17 10:32:48.735343 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 0 May 17 10:32:48.735354 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 0 May 17 10:32:48.735364 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 0 May 17 10:32:48.710901 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 17 10:32:48.711069 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 17 10:32:48.715025 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 17 10:32:48.720819 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 17 10:32:48.741621 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. May 17 10:32:48.744740 kernel: AES CTR mode by8 optimization enabled May 17 10:32:48.751934 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. May 17 10:32:48.778449 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. May 17 10:32:48.778571 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. May 17 10:32:48.791210 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 17 10:32:48.793721 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 17 10:32:48.794871 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 17 10:32:48.794922 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
May 17 10:32:48.799488 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 17 10:32:48.823368 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 17 10:32:48.823927 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 17 10:32:48.834107 disk-uuid[629]: Primary Header is updated. May 17 10:32:48.834107 disk-uuid[629]: Secondary Entries is updated. May 17 10:32:48.834107 disk-uuid[629]: Secondary Header is updated. May 17 10:32:48.837658 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 17 10:32:48.841660 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 17 10:32:48.846914 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 17 10:32:49.042667 kernel: ata2: SATA link down (SStatus 0 SControl 300) May 17 10:32:49.042755 kernel: ata4: SATA link down (SStatus 0 SControl 300) May 17 10:32:49.043658 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) May 17 10:32:49.044675 kernel: ata1: SATA link down (SStatus 0 SControl 300) May 17 10:32:49.044759 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 May 17 10:32:49.045675 kernel: ata3.00: applying bridge limits May 17 10:32:49.046671 kernel: ata3.00: configured for UDMA/100 May 17 10:32:49.046694 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 May 17 10:32:49.051674 kernel: ata6: SATA link down (SStatus 0 SControl 300) May 17 10:32:49.051719 kernel: ata5: SATA link down (SStatus 0 SControl 300) May 17 10:32:49.109675 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray May 17 10:32:49.123382 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 17 10:32:49.123399 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 May 17 10:32:49.493849 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 17 10:32:49.496553 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
May 17 10:32:49.499240 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 17 10:32:49.499327 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 17 10:32:49.502603 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 17 10:32:49.531345 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 17 10:32:49.844435 disk-uuid[633]: The operation has completed successfully. May 17 10:32:49.846004 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 17 10:32:49.876639 systemd[1]: disk-uuid.service: Deactivated successfully. May 17 10:32:49.876759 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 17 10:32:49.912056 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 17 10:32:49.925581 sh[675]: Success May 17 10:32:49.942965 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 17 10:32:49.943002 kernel: device-mapper: uevent: version 1.0.3 May 17 10:32:49.944109 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 17 10:32:49.952652 kernel: device-mapper: verity: sha256 using shash "sha256-ni" May 17 10:32:49.981047 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 17 10:32:49.982747 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 17 10:32:49.997363 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
May 17 10:32:50.005955 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 17 10:32:50.005987 kernel: BTRFS: device fsid 68d67fdc-db1a-4cd3-9490-455e627e302b devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (687) May 17 10:32:50.007289 kernel: BTRFS info (device dm-0): first mount of filesystem 68d67fdc-db1a-4cd3-9490-455e627e302b May 17 10:32:50.008186 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 17 10:32:50.008213 kernel: BTRFS info (device dm-0): using free-space-tree May 17 10:32:50.012844 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 17 10:32:50.013363 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 17 10:32:50.013860 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 17 10:32:50.014652 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 17 10:32:50.015467 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 17 10:32:50.042665 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (721) May 17 10:32:50.042717 kernel: BTRFS info (device vda6): first mount of filesystem dfcb18e1-4b20-4f52-aac0-10c7829dc173 May 17 10:32:50.045236 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 17 10:32:50.045263 kernel: BTRFS info (device vda6): using free-space-tree May 17 10:32:50.051665 kernel: BTRFS info (device vda6): last unmount of filesystem dfcb18e1-4b20-4f52-aac0-10c7829dc173 May 17 10:32:50.052936 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 17 10:32:50.056309 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
May 17 10:32:50.134882 ignition[765]: Ignition 2.21.0 May 17 10:32:50.134896 ignition[765]: Stage: fetch-offline May 17 10:32:50.136325 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 17 10:32:50.134926 ignition[765]: no configs at "/usr/lib/ignition/base.d" May 17 10:32:50.134935 ignition[765]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 17 10:32:50.141776 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 17 10:32:50.135015 ignition[765]: parsed url from cmdline: "" May 17 10:32:50.135019 ignition[765]: no config URL provided May 17 10:32:50.135024 ignition[765]: reading system config file "/usr/lib/ignition/user.ign" May 17 10:32:50.135031 ignition[765]: no config at "/usr/lib/ignition/user.ign" May 17 10:32:50.135050 ignition[765]: op(1): [started] loading QEMU firmware config module May 17 10:32:50.135055 ignition[765]: op(1): executing: "modprobe" "qemu_fw_cfg" May 17 10:32:50.143194 ignition[765]: op(1): [finished] loading QEMU firmware config module May 17 10:32:50.182588 systemd-networkd[865]: lo: Link UP May 17 10:32:50.182599 systemd-networkd[865]: lo: Gained carrier May 17 10:32:50.184120 systemd-networkd[865]: Enumeration completed May 17 10:32:50.184470 systemd-networkd[865]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 17 10:32:50.184475 systemd-networkd[865]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 17 10:32:50.184694 systemd[1]: Started systemd-networkd.service - Network Configuration. 
May 17 10:32:50.186066 systemd-networkd[865]: eth0: Link UP May 17 10:32:50.186070 systemd-networkd[865]: eth0: Gained carrier May 17 10:32:50.193076 ignition[765]: parsing config with SHA512: ad4538d8f72cf5f6a76bddfb82aa7c683394ffd4403276149f0c3538792985196c60bcf2a0ccd40c49e87eef9e0f0e7190f3b6509b3dc7a6739fc2331a845d83 May 17 10:32:50.186078 systemd-networkd[865]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 17 10:32:50.187144 systemd[1]: Reached target network.target - Network. May 17 10:32:50.196494 unknown[765]: fetched base config from "system" May 17 10:32:50.196876 ignition[765]: fetch-offline: fetch-offline passed May 17 10:32:50.196501 unknown[765]: fetched user config from "qemu" May 17 10:32:50.196943 ignition[765]: Ignition finished successfully May 17 10:32:50.199880 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 17 10:32:50.201997 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). May 17 10:32:50.202948 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 17 10:32:50.216673 systemd-networkd[865]: eth0: DHCPv4 address 10.0.0.118/16, gateway 10.0.0.1 acquired from 10.0.0.1 May 17 10:32:50.240042 ignition[869]: Ignition 2.21.0 May 17 10:32:50.240055 ignition[869]: Stage: kargs May 17 10:32:50.240207 ignition[869]: no configs at "/usr/lib/ignition/base.d" May 17 10:32:50.240218 ignition[869]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 17 10:32:50.241554 ignition[869]: kargs: kargs passed May 17 10:32:50.241618 ignition[869]: Ignition finished successfully May 17 10:32:50.245908 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 17 10:32:50.248953 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
May 17 10:32:50.283795 ignition[878]: Ignition 2.21.0 May 17 10:32:50.283810 ignition[878]: Stage: disks May 17 10:32:50.283963 ignition[878]: no configs at "/usr/lib/ignition/base.d" May 17 10:32:50.283974 ignition[878]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 17 10:32:50.285898 ignition[878]: disks: disks passed May 17 10:32:50.285969 ignition[878]: Ignition finished successfully May 17 10:32:50.288738 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 17 10:32:50.290047 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 17 10:32:50.291982 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 17 10:32:50.292192 systemd[1]: Reached target local-fs.target - Local File Systems. May 17 10:32:50.292521 systemd[1]: Reached target sysinit.target - System Initialization. May 17 10:32:50.297902 systemd[1]: Reached target basic.target - Basic System. May 17 10:32:50.300586 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 17 10:32:50.334186 systemd-fsck[888]: ROOT: clean, 15/553520 files, 52789/553472 blocks May 17 10:32:50.342531 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 17 10:32:50.346763 systemd[1]: Mounting sysroot.mount - /sysroot... May 17 10:32:50.457656 kernel: EXT4-fs (vda9): mounted filesystem 44b0ba68-13ba-4c53-8432-268eaab48ec0 r/w with ordered data mode. Quota mode: none. May 17 10:32:50.457848 systemd[1]: Mounted sysroot.mount - /sysroot. May 17 10:32:50.458543 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 17 10:32:50.461876 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 17 10:32:50.463690 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 17 10:32:50.463972 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. 
May 17 10:32:50.464009 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 17 10:32:50.464029 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 17 10:32:50.483913 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 17 10:32:50.485251 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 17 10:32:50.492686 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (896) May 17 10:32:50.492728 kernel: BTRFS info (device vda6): first mount of filesystem dfcb18e1-4b20-4f52-aac0-10c7829dc173 May 17 10:32:50.492739 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 17 10:32:50.493680 kernel: BTRFS info (device vda6): using free-space-tree May 17 10:32:50.498157 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 17 10:32:50.520728 initrd-setup-root[920]: cut: /sysroot/etc/passwd: No such file or directory May 17 10:32:50.525657 initrd-setup-root[927]: cut: /sysroot/etc/group: No such file or directory May 17 10:32:50.529351 initrd-setup-root[934]: cut: /sysroot/etc/shadow: No such file or directory May 17 10:32:50.533696 initrd-setup-root[941]: cut: /sysroot/etc/gshadow: No such file or directory May 17 10:32:50.616994 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 17 10:32:50.619167 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 17 10:32:50.620789 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 17 10:32:50.642658 kernel: BTRFS info (device vda6): last unmount of filesystem dfcb18e1-4b20-4f52-aac0-10c7829dc173 May 17 10:32:50.654756 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
May 17 10:32:50.669261 ignition[1011]: INFO : Ignition 2.21.0
May 17 10:32:50.669261 ignition[1011]: INFO : Stage: mount
May 17 10:32:50.670920 ignition[1011]: INFO : no configs at "/usr/lib/ignition/base.d"
May 17 10:32:50.670920 ignition[1011]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 17 10:32:50.670920 ignition[1011]: INFO : mount: mount passed
May 17 10:32:50.670920 ignition[1011]: INFO : Ignition finished successfully
May 17 10:32:50.672659 systemd[1]: Finished ignition-mount.service - Ignition (mount).
May 17 10:32:50.675164 systemd[1]: Starting ignition-files.service - Ignition (files)...
May 17 10:32:51.005630 systemd[1]: sysroot-oem.mount: Deactivated successfully.
May 17 10:32:51.007756 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 17 10:32:51.043350 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (1023)
May 17 10:32:51.043404 kernel: BTRFS info (device vda6): first mount of filesystem dfcb18e1-4b20-4f52-aac0-10c7829dc173
May 17 10:32:51.043428 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 17 10:32:51.044246 kernel: BTRFS info (device vda6): using free-space-tree
May 17 10:32:51.048334 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 17 10:32:51.087180 ignition[1040]: INFO : Ignition 2.21.0
May 17 10:32:51.087180 ignition[1040]: INFO : Stage: files
May 17 10:32:51.089180 ignition[1040]: INFO : no configs at "/usr/lib/ignition/base.d"
May 17 10:32:51.089180 ignition[1040]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 17 10:32:51.091992 ignition[1040]: DEBUG : files: compiled without relabeling support, skipping
May 17 10:32:51.091992 ignition[1040]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
May 17 10:32:51.091992 ignition[1040]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
May 17 10:32:51.096317 ignition[1040]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
May 17 10:32:51.096317 ignition[1040]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
May 17 10:32:51.096317 ignition[1040]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
May 17 10:32:51.095028 unknown[1040]: wrote ssh authorized keys file for user: core
May 17 10:32:51.101742 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
May 17 10:32:51.101742 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
May 17 10:32:51.208166 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
May 17 10:32:51.336750 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
May 17 10:32:51.336750 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
May 17 10:32:51.340700 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
May 17 10:32:51.340700 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
May 17 10:32:51.340700 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
May 17 10:32:51.340700 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
May 17 10:32:51.340700 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
May 17 10:32:51.340700 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 17 10:32:51.340700 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 17 10:32:51.352984 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
May 17 10:32:51.352984 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
May 17 10:32:51.352984 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
May 17 10:32:51.352984 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
May 17 10:32:51.361356 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
May 17 10:32:51.361356 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
May 17 10:32:51.619785 systemd-networkd[865]: eth0: Gained IPv6LL
May 17 10:32:52.366336 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
May 17 10:32:52.692607 ignition[1040]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
May 17 10:32:52.692607 ignition[1040]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
May 17 10:32:52.696293 ignition[1040]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 17 10:32:52.702193 ignition[1040]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 17 10:32:52.702193 ignition[1040]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
May 17 10:32:52.702193 ignition[1040]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
May 17 10:32:52.706522 ignition[1040]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
May 17 10:32:52.706522 ignition[1040]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
May 17 10:32:52.706522 ignition[1040]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
May 17 10:32:52.706522 ignition[1040]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
May 17 10:32:52.729843 ignition[1040]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
May 17 10:32:52.733743 ignition[1040]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
May 17 10:32:52.735427 ignition[1040]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
May 17 10:32:52.735427 ignition[1040]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
May 17 10:32:52.738247 ignition[1040]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
May 17 10:32:52.738247 ignition[1040]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
May 17 10:32:52.738247 ignition[1040]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
May 17 10:32:52.738247 ignition[1040]: INFO : files: files passed
May 17 10:32:52.738247 ignition[1040]: INFO : Ignition finished successfully
May 17 10:32:52.742765 systemd[1]: Finished ignition-files.service - Ignition (files).
May 17 10:32:52.744483 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
May 17 10:32:52.748057 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
May 17 10:32:52.765646 systemd[1]: ignition-quench.service: Deactivated successfully.
May 17 10:32:52.765876 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
May 17 10:32:52.767886 initrd-setup-root-after-ignition[1069]: grep: /sysroot/oem/oem-release: No such file or directory
May 17 10:32:52.772125 initrd-setup-root-after-ignition[1071]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 17 10:32:52.772125 initrd-setup-root-after-ignition[1071]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 17 10:32:52.776417 initrd-setup-root-after-ignition[1075]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 17 10:32:52.779367 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 17 10:32:52.780920 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 17 10:32:52.784331 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 17 10:32:52.859174 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 17 10:32:52.859310 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 17 10:32:52.860581 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 17 10:32:52.863950 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 17 10:32:52.865980 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 17 10:32:52.867052 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 17 10:32:52.906759 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 17 10:32:52.910938 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 17 10:32:52.944740 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 17 10:32:52.946150 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 17 10:32:52.948585 systemd[1]: Stopped target timers.target - Timer Units.
May 17 10:32:52.950825 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 17 10:32:52.950989 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 17 10:32:52.953260 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 17 10:32:52.954830 systemd[1]: Stopped target basic.target - Basic System.
May 17 10:32:52.956928 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 17 10:32:52.959137 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 17 10:32:52.961237 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 17 10:32:52.963481 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
May 17 10:32:52.965768 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 17 10:32:52.967853 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 17 10:32:52.970129 systemd[1]: Stopped target sysinit.target - System Initialization.
May 17 10:32:52.972098 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 17 10:32:52.974381 systemd[1]: Stopped target swap.target - Swaps.
May 17 10:32:52.976292 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 17 10:32:52.976424 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 17 10:32:52.978726 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 17 10:32:52.980455 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 17 10:32:52.982650 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 17 10:32:52.982759 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 17 10:32:52.984979 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 17 10:32:52.985092 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 17 10:32:52.987508 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 17 10:32:52.987611 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 17 10:32:52.989801 systemd[1]: Stopped target paths.target - Path Units.
May 17 10:32:52.991531 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 17 10:32:52.995720 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 17 10:32:52.997664 systemd[1]: Stopped target slices.target - Slice Units.
May 17 10:32:52.999706 systemd[1]: Stopped target sockets.target - Socket Units.
May 17 10:32:53.001414 systemd[1]: iscsid.socket: Deactivated successfully.
May 17 10:32:53.001541 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 17 10:32:53.003409 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 17 10:32:53.003527 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 17 10:32:53.005844 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 17 10:32:53.006039 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 17 10:32:53.007852 systemd[1]: ignition-files.service: Deactivated successfully.
May 17 10:32:53.007998 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 17 10:32:53.011380 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 17 10:32:53.013052 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 17 10:32:53.013208 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 17 10:32:53.015927 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 17 10:32:53.017797 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 17 10:32:53.017949 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 17 10:32:53.019623 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 17 10:32:53.019771 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 17 10:32:53.029232 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 17 10:32:53.036766 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 17 10:32:53.049404 ignition[1095]: INFO : Ignition 2.21.0
May 17 10:32:53.049404 ignition[1095]: INFO : Stage: umount
May 17 10:32:53.051530 ignition[1095]: INFO : no configs at "/usr/lib/ignition/base.d"
May 17 10:32:53.051530 ignition[1095]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 17 10:32:53.051530 ignition[1095]: INFO : umount: umount passed
May 17 10:32:53.051530 ignition[1095]: INFO : Ignition finished successfully
May 17 10:32:53.056362 systemd[1]: ignition-mount.service: Deactivated successfully.
May 17 10:32:53.056837 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 17 10:32:53.060763 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 17 10:32:53.061192 systemd[1]: Stopped target network.target - Network.
May 17 10:32:53.063676 systemd[1]: ignition-disks.service: Deactivated successfully.
May 17 10:32:53.063729 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 17 10:32:53.065788 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 17 10:32:53.065834 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 17 10:32:53.066893 systemd[1]: ignition-setup.service: Deactivated successfully.
May 17 10:32:53.066945 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 17 10:32:53.067252 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 17 10:32:53.067293 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 17 10:32:53.067705 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 17 10:32:53.073081 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 17 10:32:53.079618 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 17 10:32:53.079809 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 17 10:32:53.083951 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 17 10:32:53.084226 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 17 10:32:53.084270 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 17 10:32:53.088170 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 17 10:32:53.090590 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 17 10:32:53.090730 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 17 10:32:53.094921 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 17 10:32:53.095087 systemd[1]: Stopped target network-pre.target - Preparation for Network.
May 17 10:32:53.097425 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 17 10:32:53.097465 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 17 10:32:53.103054 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 17 10:32:53.104955 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 17 10:32:53.105017 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 17 10:32:53.106091 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 17 10:32:53.106135 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 17 10:32:53.110645 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 17 10:32:53.110691 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 17 10:32:53.111621 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 17 10:32:53.112894 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 17 10:32:53.127245 systemd[1]: network-cleanup.service: Deactivated successfully.
May 17 10:32:53.127360 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 17 10:32:53.146268 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 17 10:32:53.146440 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 17 10:32:53.147480 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 17 10:32:53.147522 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 17 10:32:53.149860 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 17 10:32:53.149893 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 17 10:32:53.151851 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 17 10:32:53.151897 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 17 10:32:53.156689 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 17 10:32:53.156738 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 17 10:32:53.159622 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 17 10:32:53.159689 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 17 10:32:53.165477 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 17 10:32:53.166567 systemd[1]: systemd-network-generator.service: Deactivated successfully.
May 17 10:32:53.166622 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
May 17 10:32:53.170092 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 17 10:32:53.170137 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 17 10:32:53.173623 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 17 10:32:53.173691 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 17 10:32:53.188258 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 17 10:32:53.188381 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 17 10:32:53.189547 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 17 10:32:53.189592 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 17 10:32:53.193387 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 17 10:32:53.193490 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 17 10:32:53.195487 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 17 10:32:53.198031 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 17 10:32:53.219815 systemd[1]: Switching root.
May 17 10:32:53.264895 systemd-journald[220]: Journal stopped
May 17 10:32:54.322712 systemd-journald[220]: Received SIGTERM from PID 1 (systemd).
May 17 10:32:54.322785 kernel: SELinux: policy capability network_peer_controls=1
May 17 10:32:54.322802 kernel: SELinux: policy capability open_perms=1
May 17 10:32:54.322813 kernel: SELinux: policy capability extended_socket_class=1
May 17 10:32:54.322830 kernel: SELinux: policy capability always_check_network=0
May 17 10:32:54.322841 kernel: SELinux: policy capability cgroup_seclabel=1
May 17 10:32:54.322857 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 17 10:32:54.322868 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 17 10:32:54.322882 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 17 10:32:54.322893 kernel: SELinux: policy capability userspace_initial_context=0
May 17 10:32:54.322904 kernel: audit: type=1403 audit(1747477973.577:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 17 10:32:54.322916 systemd[1]: Successfully loaded SELinux policy in 52.420ms.
May 17 10:32:54.322942 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 12.004ms.
May 17 10:32:54.322955 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 17 10:32:54.322968 systemd[1]: Detected virtualization kvm.
May 17 10:32:54.322979 systemd[1]: Detected architecture x86-64.
May 17 10:32:54.322992 systemd[1]: Detected first boot.
May 17 10:32:54.323006 systemd[1]: Initializing machine ID from VM UUID.
May 17 10:32:54.323017 zram_generator::config[1141]: No configuration found.
May 17 10:32:54.323030 kernel: Guest personality initialized and is inactive
May 17 10:32:54.323048 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
May 17 10:32:54.323059 kernel: Initialized host personality
May 17 10:32:54.323070 kernel: NET: Registered PF_VSOCK protocol family
May 17 10:32:54.323082 systemd[1]: Populated /etc with preset unit settings.
May 17 10:32:54.323094 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 17 10:32:54.323106 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 17 10:32:54.323120 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 17 10:32:54.323132 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 17 10:32:54.323144 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 17 10:32:54.323156 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 17 10:32:54.323175 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 17 10:32:54.323186 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 17 10:32:54.323198 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 17 10:32:54.323210 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 17 10:32:54.323225 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 17 10:32:54.323237 systemd[1]: Created slice user.slice - User and Session Slice.
May 17 10:32:54.323248 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 17 10:32:54.323262 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 17 10:32:54.323274 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 17 10:32:54.323286 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 17 10:32:54.323299 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 17 10:32:54.323313 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 17 10:32:54.323325 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 17 10:32:54.323337 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 17 10:32:54.323349 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 17 10:32:54.323361 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 17 10:32:54.323373 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 17 10:32:54.323385 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 17 10:32:54.323396 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 17 10:32:54.323408 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 17 10:32:54.323420 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 17 10:32:54.323434 systemd[1]: Reached target slices.target - Slice Units.
May 17 10:32:54.323446 systemd[1]: Reached target swap.target - Swaps.
May 17 10:32:54.323458 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 17 10:32:54.323470 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 17 10:32:54.323482 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 17 10:32:54.323494 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 17 10:32:54.323506 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 17 10:32:54.323519 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 17 10:32:54.323530 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 17 10:32:54.323544 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 17 10:32:54.323556 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 17 10:32:54.323568 systemd[1]: Mounting media.mount - External Media Directory...
May 17 10:32:54.323580 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 17 10:32:54.323592 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 17 10:32:54.323610 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 17 10:32:54.323622 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 17 10:32:54.323726 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 17 10:32:54.323742 systemd[1]: Reached target machines.target - Containers.
May 17 10:32:54.323754 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 17 10:32:54.323766 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 17 10:32:54.323779 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 17 10:32:54.323790 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 17 10:32:54.323802 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 17 10:32:54.323814 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 17 10:32:54.323826 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 17 10:32:54.323838 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 17 10:32:54.323852 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 17 10:32:54.323865 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 17 10:32:54.323877 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 17 10:32:54.323890 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 17 10:32:54.323902 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 17 10:32:54.323914 systemd[1]: Stopped systemd-fsck-usr.service.
May 17 10:32:54.323926 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 17 10:32:54.323939 systemd[1]: Starting systemd-journald.service - Journal Service...
May 17 10:32:54.323952 kernel: loop: module loaded
May 17 10:32:54.323964 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 17 10:32:54.323975 kernel: fuse: init (API version 7.41)
May 17 10:32:54.323987 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 17 10:32:54.323999 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 17 10:32:54.324011 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 17 10:32:54.324023 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 17 10:32:54.324043 systemd[1]: verity-setup.service: Deactivated successfully.
May 17 10:32:54.324056 systemd[1]: Stopped verity-setup.service.
May 17 10:32:54.324068 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 17 10:32:54.324079 kernel: ACPI: bus type drm_connector registered
May 17 10:32:54.324091 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 17 10:32:54.324103 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 17 10:32:54.324118 systemd[1]: Mounted media.mount - External Media Directory.
May 17 10:32:54.324150 systemd-journald[1216]: Collecting audit messages is disabled.
May 17 10:32:54.324171 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 17 10:32:54.324184 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 17 10:32:54.324197 systemd-journald[1216]: Journal started
May 17 10:32:54.324222 systemd-journald[1216]: Runtime Journal (/run/log/journal/6f40cbf8286c45bb86c499b7c5da6c5d) is 6M, max 48.5M, 42.4M free.
May 17 10:32:54.091069 systemd[1]: Queued start job for default target multi-user.target.
May 17 10:32:54.111614 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
May 17 10:32:54.112086 systemd[1]: systemd-journald.service: Deactivated successfully.
May 17 10:32:54.327565 systemd[1]: Started systemd-journald.service - Journal Service.
May 17 10:32:54.328293 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 17 10:32:54.329579 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 17 10:32:54.331059 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 17 10:32:54.332586 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 17 10:32:54.332818 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 17 10:32:54.334365 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 17 10:32:54.334575 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 17 10:32:54.335995 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 17 10:32:54.336213 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 17 10:32:54.337584 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 17 10:32:54.337801 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 17 10:32:54.339372 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 17 10:32:54.339585 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 17 10:32:54.340950 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 17 10:32:54.341164 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 17 10:32:54.342762 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 17 10:32:54.344344 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 17 10:32:54.345969 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 17 10:32:54.347543 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 17 10:32:54.363244 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 17 10:32:54.365873 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 17 10:32:54.368231 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 17 10:32:54.369492 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 17 10:32:54.369582 systemd[1]: Reached target local-fs.target - Local File Systems.
May 17 10:32:54.371729 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 17 10:32:54.375731 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 17 10:32:54.376883 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 17 10:32:54.378206 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 17 10:32:54.380733 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 17 10:32:54.382859 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 17 10:32:54.385534 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 17 10:32:54.387003 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 17 10:32:54.388443 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 17 10:32:54.390593 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 17 10:32:54.394786 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 17 10:32:54.395800 systemd-journald[1216]: Time spent on flushing to /var/log/journal/6f40cbf8286c45bb86c499b7c5da6c5d is 15.370ms for 1064 entries.
May 17 10:32:54.395800 systemd-journald[1216]: System Journal (/var/log/journal/6f40cbf8286c45bb86c499b7c5da6c5d) is 8M, max 195.6M, 187.6M free.
May 17 10:32:54.431083 systemd-journald[1216]: Received client request to flush runtime journal.
May 17 10:32:54.397542 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 17 10:32:54.398918 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 17 10:32:54.403346 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 17 10:32:54.404529 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 17 10:32:54.411829 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 17 10:32:54.413388 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 17 10:32:54.433126 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 17 10:32:54.435488 kernel: loop0: detected capacity change from 0 to 146240
May 17 10:32:54.438985 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 17 10:32:54.444718 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 17 10:32:54.456797 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 17 10:32:54.460147 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 17 10:32:54.463365 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 17 10:32:54.473688 kernel: loop1: detected capacity change from 0 to 221472
May 17 10:32:54.492742 systemd-tmpfiles[1278]: ACLs are not supported, ignoring.
May 17 10:32:54.492759 systemd-tmpfiles[1278]: ACLs are not supported, ignoring.
May 17 10:32:54.498689 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 17 10:32:54.502342 kernel: loop2: detected capacity change from 0 to 113872
May 17 10:32:54.537655 kernel: loop3: detected capacity change from 0 to 146240
May 17 10:32:54.550721 kernel: loop4: detected capacity change from 0 to 221472
May 17 10:32:54.559749 kernel: loop5: detected capacity change from 0 to 113872
May 17 10:32:54.567259 (sd-merge)[1286]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
May 17 10:32:54.567855 (sd-merge)[1286]: Merged extensions into '/usr'.
May 17 10:32:54.572058 systemd[1]: Reload requested from client PID 1260 ('systemd-sysext') (unit systemd-sysext.service)...
May 17 10:32:54.572074 systemd[1]: Reloading...
May 17 10:32:54.629659 zram_generator::config[1311]: No configuration found.
May 17 10:32:54.707852 ldconfig[1255]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 17 10:32:54.730664 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 17 10:32:54.811207 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 17 10:32:54.811757 systemd[1]: Reloading finished in 239 ms.
May 17 10:32:54.849474 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 17 10:32:54.851104 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 17 10:32:54.868110 systemd[1]: Starting ensure-sysext.service...
May 17 10:32:54.870045 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 17 10:32:54.881343 systemd[1]: Reload requested from client PID 1349 ('systemctl') (unit ensure-sysext.service)...
May 17 10:32:54.881363 systemd[1]: Reloading...
May 17 10:32:54.890996 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
May 17 10:32:54.891047 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
May 17 10:32:54.891328 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 17 10:32:54.891582 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 17 10:32:54.892445 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 17 10:32:54.892745 systemd-tmpfiles[1350]: ACLs are not supported, ignoring.
May 17 10:32:54.892821 systemd-tmpfiles[1350]: ACLs are not supported, ignoring.
May 17 10:32:54.898605 systemd-tmpfiles[1350]: Detected autofs mount point /boot during canonicalization of boot.
May 17 10:32:54.898617 systemd-tmpfiles[1350]: Skipping /boot
May 17 10:32:54.910511 systemd-tmpfiles[1350]: Detected autofs mount point /boot during canonicalization of boot.
May 17 10:32:54.910526 systemd-tmpfiles[1350]: Skipping /boot
May 17 10:32:54.932674 zram_generator::config[1377]: No configuration found.
May 17 10:32:55.027410 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 17 10:32:55.106650 systemd[1]: Reloading finished in 224 ms.
May 17 10:32:55.131058 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 17 10:32:55.148868 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 17 10:32:55.157169 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 17 10:32:55.159456 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 17 10:32:55.168690 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 17 10:32:55.172737 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 17 10:32:55.175717 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 17 10:32:55.178890 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 17 10:32:55.182397 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 17 10:32:55.182554 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 17 10:32:55.184538 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 17 10:32:55.186937 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 17 10:32:55.190548 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 17 10:32:55.191770 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 17 10:32:55.191866 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 17 10:32:55.191964 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 17 10:32:55.199175 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 17 10:32:55.200985 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 17 10:32:55.204411 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 17 10:32:55.204612 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 17 10:32:55.206314 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 17 10:32:55.206503 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 17 10:32:55.209224 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 17 10:32:55.209432 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 17 10:32:55.216570 systemd-udevd[1426]: Using default interface naming scheme 'v255'.
May 17 10:32:55.219115 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 17 10:32:55.219422 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 17 10:32:55.221708 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 17 10:32:55.224465 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 17 10:32:55.227529 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 17 10:32:55.228221 augenrules[1451]: No rules
May 17 10:32:55.228803 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 17 10:32:55.228917 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 17 10:32:55.234911 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 17 10:32:55.236062 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 17 10:32:55.238331 systemd[1]: audit-rules.service: Deactivated successfully.
May 17 10:32:55.238572 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 17 10:32:55.240237 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 17 10:32:55.242251 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 17 10:32:55.242455 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 17 10:32:55.244363 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 17 10:32:55.244660 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 17 10:32:55.246386 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 17 10:32:55.246581 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 17 10:32:55.249978 systemd[1]: Finished systemd-update-done.service - Update is Completed.
May 17 10:32:55.256359 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 17 10:32:55.259238 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
May 17 10:32:55.262034 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 17 10:32:55.270853 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 17 10:32:55.273876 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 17 10:32:55.274981 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 17 10:32:55.280922 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 17 10:32:55.283850 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 17 10:32:55.285797 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 17 10:32:55.288313 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 17 10:32:55.289469 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 17 10:32:55.289584 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 17 10:32:55.293114 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 17 10:32:55.294302 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 17 10:32:55.294404 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 17 10:32:55.306244 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 17 10:32:55.307735 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 17 10:32:55.309443 augenrules[1495]: /sbin/augenrules: No change
May 17 10:32:55.309821 systemd[1]: Finished ensure-sysext.service.
May 17 10:32:55.313937 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
May 17 10:32:55.315414 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 17 10:32:55.315647 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 17 10:32:55.317073 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 17 10:32:55.317332 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 17 10:32:55.320014 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 17 10:32:55.328662 augenrules[1527]: No rules
May 17 10:32:55.337364 systemd[1]: audit-rules.service: Deactivated successfully.
May 17 10:32:55.338092 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 17 10:32:55.340107 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 17 10:32:55.340322 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 17 10:32:55.345829 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 17 10:32:55.377124 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
May 17 10:32:55.414648 kernel: mousedev: PS/2 mouse device common for all mice
May 17 10:32:55.415512 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
May 17 10:32:55.418781 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
May 17 10:32:55.439675 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
May 17 10:32:55.448448 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
May 17 10:32:55.450500 kernel: ACPI: button: Power Button [PWRF]
May 17 10:32:55.450534 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device
May 17 10:32:55.453241 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
May 17 10:32:55.453407 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
May 17 10:32:55.486986 systemd-resolved[1420]: Positive Trust Anchors:
May 17 10:32:55.487006 systemd-resolved[1420]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 17 10:32:55.487045 systemd-resolved[1420]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 17 10:32:55.494624 systemd-resolved[1420]: Defaulting to hostname 'linux'.
May 17 10:32:55.495789 systemd-networkd[1506]: lo: Link UP
May 17 10:32:55.495801 systemd-networkd[1506]: lo: Gained carrier
May 17 10:32:55.497371 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 17 10:32:55.498379 systemd-networkd[1506]: Enumeration completed
May 17 10:32:55.498751 systemd-networkd[1506]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 17 10:32:55.498756 systemd-networkd[1506]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 17 10:32:55.498793 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 17 10:32:55.499257 systemd-networkd[1506]: eth0: Link UP
May 17 10:32:55.499503 systemd-networkd[1506]: eth0: Gained carrier
May 17 10:32:55.499515 systemd-networkd[1506]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 17 10:32:55.500240 systemd[1]: Reached target network.target - Network.
May 17 10:32:55.501193 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 17 10:32:55.503787 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
May 17 10:32:55.507855 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
May 17 10:32:55.513953 systemd-networkd[1506]: eth0: DHCPv4 address 10.0.0.118/16, gateway 10.0.0.1 acquired from 10.0.0.1
May 17 10:32:55.524165 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
May 17 10:32:55.526787 systemd[1]: Reached target sysinit.target - System Initialization.
May 17 10:32:57.145079 systemd-resolved[1420]: Clock change detected. Flushing caches.
May 17 10:32:57.145158 systemd-timesyncd[1517]: Contacted time server 10.0.0.1:123 (10.0.0.1).
May 17 10:32:57.145196 systemd-timesyncd[1517]: Initial clock synchronization to Sat 2025-05-17 10:32:57.145047 UTC.
May 17 10:32:57.145974 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
May 17 10:32:57.147256 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
May 17 10:32:57.148522 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
May 17 10:32:57.149664 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
May 17 10:32:57.150924 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
May 17 10:32:57.150948 systemd[1]: Reached target paths.target - Path Units.
May 17 10:32:57.151873 systemd[1]: Reached target time-set.target - System Time Set.
May 17 10:32:57.153621 systemd[1]: Started logrotate.timer - Daily rotation of log files.
May 17 10:32:57.155450 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
May 17 10:32:57.156714 systemd[1]: Reached target timers.target - Timer Units.
May 17 10:32:57.158314 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
May 17 10:32:57.161191 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 17 10:32:57.165632 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
May 17 10:32:57.167602 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
May 17 10:32:57.169459 systemd[1]: Reached target ssh-access.target - SSH Access Available.
May 17 10:32:57.202290 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
May 17 10:32:57.204461 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
May 17 10:32:57.208478 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
May 17 10:32:57.210197 systemd[1]: Listening on docker.socket - Docker Socket for the API.
May 17 10:32:57.223977 systemd[1]: Reached target sockets.target - Socket Units.
May 17 10:32:57.225578 systemd[1]: Reached target basic.target - Basic System.
May 17 10:32:57.227891 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
May 17 10:32:57.228227 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
May 17 10:32:57.231096 systemd[1]: Starting containerd.service - containerd container runtime...
May 17 10:32:57.233986 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
May 17 10:32:57.236761 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 17 10:32:57.243087 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
May 17 10:32:57.245721 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
May 17 10:32:57.247459 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
May 17 10:32:57.248694 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
May 17 10:32:57.251985 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
May 17 10:32:57.255847 kernel: kvm_amd: TSC scaling supported
May 17 10:32:57.255876 kernel: kvm_amd: Nested Virtualization enabled
May 17 10:32:57.255888 kernel: kvm_amd: Nested Paging enabled
May 17 10:32:57.255900 kernel: kvm_amd: LBR virtualization supported
May 17 10:32:57.257421 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
May 17 10:32:57.257484 kernel: kvm_amd: Virtual GIF supported
May 17 10:32:57.260636 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
May 17 10:32:57.265693 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
May 17 10:32:57.265937 oslogin_cache_refresh[1574]: Refreshing passwd entry cache
May 17 10:32:57.267682 google_oslogin_nss_cache[1574]: oslogin_cache_refresh[1574]: Refreshing passwd entry cache
May 17 10:32:57.267823 jq[1572]: false
May 17 10:32:57.273505 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
May 17 10:32:57.277071 google_oslogin_nss_cache[1574]: oslogin_cache_refresh[1574]: Failure getting users, quitting
May 17 10:32:57.277071 google_oslogin_nss_cache[1574]: oslogin_cache_refresh[1574]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 17 10:32:57.277071 google_oslogin_nss_cache[1574]: oslogin_cache_refresh[1574]: Refreshing group entry cache
May 17 10:32:57.276655 oslogin_cache_refresh[1574]: Failure getting users, quitting
May 17 10:32:57.276671 oslogin_cache_refresh[1574]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 17 10:32:57.276716 oslogin_cache_refresh[1574]: Refreshing group entry cache
May 17 10:32:57.277527 extend-filesystems[1573]: Found loop3
May 17 10:32:57.277807 extend-filesystems[1573]: Found loop4
May 17 10:32:57.277807 extend-filesystems[1573]: Found loop5
May 17 10:32:57.277807 extend-filesystems[1573]: Found sr0
May 17 10:32:57.277807 extend-filesystems[1573]: Found vda
May 17 10:32:57.277807 extend-filesystems[1573]: Found vda1
May 17 10:32:57.277807 extend-filesystems[1573]: Found vda2
May 17 10:32:57.277807 extend-filesystems[1573]: Found vda3
May 17 10:32:57.277807 extend-filesystems[1573]: Found usr
May 17 10:32:57.277807 extend-filesystems[1573]: Found vda4
May 17 10:32:57.289973 extend-filesystems[1573]: Found vda6
May 17 10:32:57.289973 extend-filesystems[1573]: Found vda7
May 17 10:32:57.289973 extend-filesystems[1573]: Found vda9
May 17 10:32:57.289973 extend-filesystems[1573]: Checking size of /dev/vda9
May 17 10:32:57.281322 systemd[1]: Starting systemd-logind.service - User Login Management...
May 17 10:32:57.287302 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
May 17 10:32:57.288169 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
May 17 10:32:57.289376 systemd[1]: Starting update-engine.service - Update Engine...
May 17 10:32:57.291649 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
May 17 10:32:57.299806 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
May 17 10:32:57.301689 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
May 17 10:32:57.301939 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
May 17 10:32:57.302910 jq[1589]: true
May 17 10:32:57.302607 systemd[1]: motdgen.service: Deactivated successfully.
May 17 10:32:57.302852 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
May 17 10:32:57.307084 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
May 17 10:32:57.307313 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
May 17 10:32:57.318307 google_oslogin_nss_cache[1574]: oslogin_cache_refresh[1574]: Failure getting groups, quitting
May 17 10:32:57.318307 google_oslogin_nss_cache[1574]: oslogin_cache_refresh[1574]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 17 10:32:57.318190 oslogin_cache_refresh[1574]: Failure getting groups, quitting
May 17 10:32:57.318203 oslogin_cache_refresh[1574]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 17 10:32:57.320042 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
May 17 10:32:57.321386 update_engine[1588]: I20250517 10:32:57.321329 1588 main.cc:92] Flatcar Update Engine starting
May 17 10:32:57.322418 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
May 17 10:32:57.323597 (ntainerd)[1593]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
May 17 10:32:57.328553 jq[1592]: true
May 17 10:32:57.338789 extend-filesystems[1573]: Resized partition /dev/vda9
May 17 10:32:57.339726 tar[1591]: linux-amd64/helm
May 17 10:32:57.341107 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 17 10:32:57.347467 extend-filesystems[1612]: resize2fs 1.47.2 (1-Jan-2025)
May 17 10:32:57.355443 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
May 17 10:32:57.365664 dbus-daemon[1569]: [system] SELinux support is enabled
May 17 10:32:57.365842 systemd[1]: Started dbus.service - D-Bus System Message Bus.
May 17 10:32:57.369257 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
May 17 10:32:57.371434 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
May 17 10:32:57.372757 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
May 17 10:32:57.372779 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
May 17 10:32:57.378869 systemd[1]: Started update-engine.service - Update Engine.
May 17 10:32:57.379443 update_engine[1588]: I20250517 10:32:57.379317 1588 update_check_scheduler.cc:74] Next update check in 11m38s
May 17 10:32:57.382618 systemd[1]: Started locksmithd.service - Cluster reboot manager.
May 17 10:32:57.390464 kernel: EXT4-fs (vda9): resized filesystem to 1864699
May 17 10:32:57.410832 systemd-logind[1585]: Watching system buttons on /dev/input/event2 (Power Button)
May 17 10:32:57.410851 systemd-logind[1585]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
May 17 10:32:57.412508 systemd-logind[1585]: New seat seat0.
May 17 10:32:57.413302 extend-filesystems[1612]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
May 17 10:32:57.413302 extend-filesystems[1612]: old_desc_blocks = 1, new_desc_blocks = 1
May 17 10:32:57.413302 extend-filesystems[1612]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
May 17 10:32:57.442704 kernel: EDAC MC: Ver: 3.0.0
May 17 10:32:57.442727 bash[1631]: Updated "/home/core/.ssh/authorized_keys"
May 17 10:32:57.442840 extend-filesystems[1573]: Resized filesystem in /dev/vda9
May 17 10:32:57.417174 systemd[1]: extend-filesystems.service: Deactivated successfully.
May 17 10:32:57.429480 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
May 17 10:32:57.456295 locksmithd[1620]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
May 17 10:32:57.479164 systemd[1]: Started systemd-logind.service - User Login Management.
May 17 10:32:57.480667 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
May 17 10:32:57.486004 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 17 10:32:57.508898 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
May 17 10:32:57.530172 containerd[1593]: time="2025-05-17T10:32:57Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
May 17 10:32:57.532556 containerd[1593]: time="2025-05-17T10:32:57.530945893Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
May 17 10:32:57.539386 containerd[1593]: time="2025-05-17T10:32:57.539334964Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.348µs"
May 17 10:32:57.539386 containerd[1593]: time="2025-05-17T10:32:57.539375110Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
May 17 10:32:57.539479 containerd[1593]: time="2025-05-17T10:32:57.539456292Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
May 17 10:32:57.539695 containerd[1593]: time="2025-05-17T10:32:57.539665745Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
May 17 10:32:57.539695 containerd[1593]: time="2025-05-17T10:32:57.539688077Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
May 17 10:32:57.539750 containerd[1593]: time="2025-05-17T10:32:57.539717973Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 17 10:32:57.539840 containerd[1593]: time="2025-05-17T10:32:57.539802632Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 17 10:32:57.539840 containerd[1593]: time="2025-05-17T10:32:57.539822730Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 17 10:32:57.540146 containerd[1593]: time="2025-05-17T10:32:57.540108005Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 17 10:32:57.540146 containerd[1593]: time="2025-05-17T10:32:57.540128814Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 17 10:32:57.540197 containerd[1593]: time="2025-05-17T10:32:57.540146337Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 17 10:32:57.540197 containerd[1593]: time="2025-05-17T10:32:57.540159832Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
May 17 10:32:57.540286 containerd[1593]: time="2025-05-17T10:32:57.540262325Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
May 17 10:32:57.541404 containerd[1593]: time="2025-05-17T10:32:57.541362719Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 17 10:32:57.541456 containerd[1593]: time="2025-05-17T10:32:57.541418224Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 17 10:32:57.541456 containerd[1593]: time="2025-05-17T10:32:57.541429555Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
May 17 10:32:57.541500 containerd[1593]: time="2025-05-17T10:32:57.541467827Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
May 17 10:32:57.541799 containerd[1593]: time="2025-05-17T10:32:57.541761077Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
May 17 10:32:57.541880 containerd[1593]: time="2025-05-17T10:32:57.541852108Z" level=info msg="metadata content store policy set" policy=shared
May 17 10:32:57.546643 containerd[1593]: time="2025-05-17T10:32:57.546607030Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
May 17 10:32:57.546679 containerd[1593]: time="2025-05-17T10:32:57.546645021Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
May 17 10:32:57.546679 containerd[1593]: time="2025-05-17T10:32:57.546670529Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
May 17 10:32:57.546731 containerd[1593]: time="2025-05-17T10:32:57.546681740Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
May 17 10:32:57.546731 containerd[1593]: time="2025-05-17T10:32:57.546694694Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
May 17 10:32:57.546731 containerd[1593]: time="2025-05-17T10:32:57.546705084Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
May 17 10:32:57.546731 containerd[1593]: time="2025-05-17T10:32:57.546718249Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
May 17 10:32:57.546731 containerd[1593]: time="2025-05-17T10:32:57.546729680Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
May 17 10:32:57.546830 containerd[1593]: time="2025-05-17T10:32:57.546746361Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
May 17 10:32:57.546830 containerd[1593]: time="2025-05-17T10:32:57.546757703Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
May 17 10:32:57.546830 containerd[1593]: time="2025-05-17T10:32:57.546774174Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
May 17 10:32:57.546830 containerd[1593]: time="2025-05-17T10:32:57.546787078Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
May 17 10:32:57.546903 containerd[1593]: time="2025-05-17T10:32:57.546893758Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
May 17 10:32:57.546923 containerd[1593]: time="2025-05-17T10:32:57.546911862Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
May 17 10:32:57.546943 containerd[1593]: time="2025-05-17T10:32:57.546929285Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
May 17 10:32:57.546962 containerd[1593]: time="2025-05-17T10:32:57.546943892Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
May 17 10:32:57.546962 containerd[1593]: time="2025-05-17T10:32:57.546955183Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
May 17 10:32:57.547002 containerd[1593]: time="2025-05-17T10:32:57.546965603Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
May 17 10:32:57.547002 containerd[1593]: time="2025-05-17T10:32:57.546977275Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
May 17 10:32:57.547002 containerd[1593]: time="2025-05-17T10:32:57.546987414Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
May 17 10:32:57.547002 containerd[1593]: time="2025-05-17T10:32:57.546998004Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
May 17 10:32:57.547079 containerd[1593]: time="2025-05-17T10:32:57.547009315Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
May 17 10:32:57.547079 containerd[1593]: time="2025-05-17T10:32:57.547020616Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
May 17 10:32:57.547116 containerd[1593]: time="2025-05-17T10:32:57.547085969Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
May 17 10:32:57.547116 containerd[1593]: time="2025-05-17T10:32:57.547101688Z" level=info msg="Start snapshots syncer"
May 17 10:32:57.547152 containerd[1593]: time="2025-05-17T10:32:57.547123689Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
May 17 10:32:57.547447 containerd[1593]: time="2025-05-17T10:32:57.547403975Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
May 17 10:32:57.547447 containerd[1593]: time="2025-05-17T10:32:57.547446415Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
May 17 10:32:57.547575 containerd[1593]: time="2025-05-17T10:32:57.547511257Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
May 17 10:32:57.547628 containerd[1593]: time="2025-05-17T10:32:57.547600755Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
May 17 10:32:57.547727 containerd[1593]: time="2025-05-17T10:32:57.547696645Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
May 17 10:32:57.547727 containerd[1593]: time="2025-05-17T10:32:57.547713967Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
May 17 10:32:57.547795 containerd[1593]: time="2025-05-17T10:32:57.547730498Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
May 17 10:32:57.547795 containerd[1593]: time="2025-05-17T10:32:57.547742300Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
May 17 10:32:57.547795 containerd[1593]: time="2025-05-17T10:32:57.547753471Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
May 17 10:32:57.547795 containerd[1593]: time="2025-05-17T10:32:57.547772106Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
May 17 10:32:57.547795 containerd[1593]: time="2025-05-17T10:32:57.547793346Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
May 17 10:32:57.547894 containerd[1593]: time="2025-05-17T10:32:57.547803635Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
May 17 10:32:57.547894 containerd[1593]: time="2025-05-17T10:32:57.547813844Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
May 17 10:32:57.547894 containerd[1593]: time="2025-05-17T10:32:57.547854561Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 17 10:32:57.547894 containerd[1593]: time="2025-05-17T10:32:57.547867054Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 17 10:32:57.547894 containerd[1593]: time="2025-05-17T10:32:57.547875109Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 17 10:32:57.547894 containerd[1593]: time="2025-05-17T10:32:57.547883605Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 17 10:32:57.548010 containerd[1593]: time="2025-05-17T10:32:57.547891310Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
May 17 10:32:57.548010 containerd[1593]: time="2025-05-17T10:32:57.547960179Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
May 17 10:32:57.548010 containerd[1593]: time="2025-05-17T10:32:57.547970609Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
May 17 10:32:57.548010 containerd[1593]: time="2025-05-17T10:32:57.547992049Z" level=info msg="runtime interface created"
May 17 10:32:57.548010 containerd[1593]: time="2025-05-17T10:32:57.547997168Z" level=info msg="created NRI interface"
May 17 10:32:57.548010 containerd[1593]: time="2025-05-17T10:32:57.548004883Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
May 17 10:32:57.548117 containerd[1593]: time="2025-05-17T10:32:57.548015833Z" level=info msg="Connect containerd service"
May 17 10:32:57.548117 containerd[1593]: time="2025-05-17T10:32:57.548039488Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
May 17 10:32:57.548844 containerd[1593]: time="2025-05-17T10:32:57.548813109Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
May 17 10:32:57.636302 containerd[1593]: time="2025-05-17T10:32:57.636252553Z" level=info msg="Start subscribing containerd event"
May 17 10:32:57.636496 containerd[1593]: time="2025-05-17T10:32:57.636468158Z" level=info msg="Start recovering state"
May 17 10:32:57.636635 containerd[1593]: time="2025-05-17T10:32:57.636621726Z" level=info msg="Start event monitor"
May 17 10:32:57.636875 containerd[1593]: time="2025-05-17T10:32:57.636861486Z" level=info msg="Start cni network conf syncer for default"
May 17 10:32:57.636961 containerd[1593]: time="2025-05-17T10:32:57.636948830Z" level=info msg="Start streaming server"
May 17 10:32:57.637014 containerd[1593]: time="2025-05-17T10:32:57.637003963Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
May 17 10:32:57.637134 containerd[1593]: time="2025-05-17T10:32:57.637045631Z" level=info msg="runtime interface starting up..."
May 17 10:32:57.637134 containerd[1593]: time="2025-05-17T10:32:57.637054398Z" level=info msg="starting plugins..."
May 17 10:32:57.637134 containerd[1593]: time="2025-05-17T10:32:57.637071480Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
May 17 10:32:57.637471 containerd[1593]: time="2025-05-17T10:32:57.637445522Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
May 17 10:32:57.637570 containerd[1593]: time="2025-05-17T10:32:57.637557422Z" level=info msg=serving... address=/run/containerd/containerd.sock
May 17 10:32:57.637785 systemd[1]: Started containerd.service - containerd container runtime.
May 17 10:32:57.638051 containerd[1593]: time="2025-05-17T10:32:57.638033365Z" level=info msg="containerd successfully booted in 0.108325s"
May 17 10:32:57.783177 tar[1591]: linux-amd64/LICENSE
May 17 10:32:57.783177 tar[1591]: linux-amd64/README.md
May 17 10:32:57.809620 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
May 17 10:32:57.872546 sshd_keygen[1613]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 17 10:32:57.896266 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
May 17 10:32:57.899156 systemd[1]: Starting issuegen.service - Generate /run/issue...
May 17 10:32:57.923226 systemd[1]: issuegen.service: Deactivated successfully.
May 17 10:32:57.923490 systemd[1]: Finished issuegen.service - Generate /run/issue.
May 17 10:32:57.926010 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
May 17 10:32:57.957060 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
May 17 10:32:57.960061 systemd[1]: Started getty@tty1.service - Getty on tty1.
May 17 10:32:57.962100 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
May 17 10:32:57.963349 systemd[1]: Reached target getty.target - Login Prompts.
May 17 10:32:58.805585 systemd-networkd[1506]: eth0: Gained IPv6LL
May 17 10:32:58.808529 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
May 17 10:32:58.810358 systemd[1]: Reached target network-online.target - Network is Online.
May 17 10:32:58.812967 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
May 17 10:32:58.815256 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 17 10:32:58.817380 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
May 17 10:32:58.853650 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
May 17 10:32:58.855486 systemd[1]: coreos-metadata.service: Deactivated successfully.
May 17 10:32:58.855771 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
May 17 10:32:58.858663 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
May 17 10:32:59.520743 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 17 10:32:59.522303 systemd[1]: Reached target multi-user.target - Multi-User System.
May 17 10:32:59.523533 systemd[1]: Startup finished in 2.819s (kernel) + 5.931s (initrd) + 4.379s (userspace) = 13.131s.
May 17 10:32:59.526668 (kubelet)[1709]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 17 10:32:59.928153 kubelet[1709]: E0517 10:32:59.928042 1709 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 17 10:32:59.932315 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 17 10:32:59.932521 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 17 10:32:59.932869 systemd[1]: kubelet.service: Consumed 963ms CPU time, 266.2M memory peak.
May 17 10:33:01.913842 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
May 17 10:33:01.915174 systemd[1]: Started sshd@0-10.0.0.118:22-10.0.0.1:34226.service - OpenSSH per-connection server daemon (10.0.0.1:34226).
May 17 10:33:01.966234 sshd[1723]: Accepted publickey for core from 10.0.0.1 port 34226 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE
May 17 10:33:01.968036 sshd-session[1723]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 17 10:33:01.974472 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
May 17 10:33:01.975564 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
May 17 10:33:01.982249 systemd-logind[1585]: New session 1 of user core.
May 17 10:33:01.998107 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
May 17 10:33:02.001842 systemd[1]: Starting user@500.service - User Manager for UID 500...
May 17 10:33:02.021651 (systemd)[1727]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
May 17 10:33:02.023828 systemd-logind[1585]: New session c1 of user core.
May 17 10:33:02.180378 systemd[1727]: Queued start job for default target default.target.
May 17 10:33:02.198674 systemd[1727]: Created slice app.slice - User Application Slice.
May 17 10:33:02.198701 systemd[1727]: Reached target paths.target - Paths.
May 17 10:33:02.198741 systemd[1727]: Reached target timers.target - Timers.
May 17 10:33:02.200284 systemd[1727]: Starting dbus.socket - D-Bus User Message Bus Socket...
May 17 10:33:02.210983 systemd[1727]: Listening on dbus.socket - D-Bus User Message Bus Socket.
May 17 10:33:02.211101 systemd[1727]: Reached target sockets.target - Sockets.
May 17 10:33:02.211142 systemd[1727]: Reached target basic.target - Basic System.
May 17 10:33:02.211180 systemd[1727]: Reached target default.target - Main User Target.
May 17 10:33:02.211210 systemd[1727]: Startup finished in 180ms.
May 17 10:33:02.211614 systemd[1]: Started user@500.service - User Manager for UID 500.
May 17 10:33:02.213167 systemd[1]: Started session-1.scope - Session 1 of User core.
May 17 10:33:02.276080 systemd[1]: Started sshd@1-10.0.0.118:22-10.0.0.1:34228.service - OpenSSH per-connection server daemon (10.0.0.1:34228).
May 17 10:33:02.319236 sshd[1738]: Accepted publickey for core from 10.0.0.1 port 34228 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE
May 17 10:33:02.320544 sshd-session[1738]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 17 10:33:02.324477 systemd-logind[1585]: New session 2 of user core.
May 17 10:33:02.339529 systemd[1]: Started session-2.scope - Session 2 of User core.
May 17 10:33:02.391435 sshd[1740]: Connection closed by 10.0.0.1 port 34228
May 17 10:33:02.391780 sshd-session[1738]: pam_unix(sshd:session): session closed for user core
May 17 10:33:02.412917 systemd[1]: sshd@1-10.0.0.118:22-10.0.0.1:34228.service: Deactivated successfully.
May 17 10:33:02.414693 systemd[1]: session-2.scope: Deactivated successfully.
May 17 10:33:02.415345 systemd-logind[1585]: Session 2 logged out. Waiting for processes to exit.
May 17 10:33:02.418063 systemd[1]: Started sshd@2-10.0.0.118:22-10.0.0.1:34230.service - OpenSSH per-connection server daemon (10.0.0.1:34230).
May 17 10:33:02.418965 systemd-logind[1585]: Removed session 2.
May 17 10:33:02.468916 sshd[1746]: Accepted publickey for core from 10.0.0.1 port 34230 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE
May 17 10:33:02.470186 sshd-session[1746]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 17 10:33:02.474184 systemd-logind[1585]: New session 3 of user core.
May 17 10:33:02.482521 systemd[1]: Started session-3.scope - Session 3 of User core.
May 17 10:33:02.530791 sshd[1748]: Connection closed by 10.0.0.1 port 34230
May 17 10:33:02.531083 sshd-session[1746]: pam_unix(sshd:session): session closed for user core
May 17 10:33:02.538876 systemd[1]: sshd@2-10.0.0.118:22-10.0.0.1:34230.service: Deactivated successfully.
May 17 10:33:02.540386 systemd[1]: session-3.scope: Deactivated successfully.
May 17 10:33:02.541161 systemd-logind[1585]: Session 3 logged out. Waiting for processes to exit.
May 17 10:33:02.543707 systemd[1]: Started sshd@3-10.0.0.118:22-10.0.0.1:34232.service - OpenSSH per-connection server daemon (10.0.0.1:34232).
May 17 10:33:02.544205 systemd-logind[1585]: Removed session 3.
May 17 10:33:02.590528 sshd[1754]: Accepted publickey for core from 10.0.0.1 port 34232 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE
May 17 10:33:02.591823 sshd-session[1754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 17 10:33:02.595845 systemd-logind[1585]: New session 4 of user core.
May 17 10:33:02.606506 systemd[1]: Started session-4.scope - Session 4 of User core.
May 17 10:33:02.657972 sshd[1756]: Connection closed by 10.0.0.1 port 34232
May 17 10:33:02.658293 sshd-session[1754]: pam_unix(sshd:session): session closed for user core
May 17 10:33:02.669885 systemd[1]: sshd@3-10.0.0.118:22-10.0.0.1:34232.service: Deactivated successfully.
May 17 10:33:02.671546 systemd[1]: session-4.scope: Deactivated successfully.
May 17 10:33:02.672282 systemd-logind[1585]: Session 4 logged out. Waiting for processes to exit.
May 17 10:33:02.674824 systemd[1]: Started sshd@4-10.0.0.118:22-10.0.0.1:34236.service - OpenSSH per-connection server daemon (10.0.0.1:34236).
May 17 10:33:02.675561 systemd-logind[1585]: Removed session 4.
May 17 10:33:02.723581 sshd[1762]: Accepted publickey for core from 10.0.0.1 port 34236 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE
May 17 10:33:02.724709 sshd-session[1762]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 17 10:33:02.728595 systemd-logind[1585]: New session 5 of user core.
May 17 10:33:02.738520 systemd[1]: Started session-5.scope - Session 5 of User core.
May 17 10:33:02.794621 sudo[1765]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
May 17 10:33:02.794945 sudo[1765]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 17 10:33:02.809126 sudo[1765]: pam_unix(sudo:session): session closed for user root
May 17 10:33:02.810525 sshd[1764]: Connection closed by 10.0.0.1 port 34236
May 17 10:33:02.810942 sshd-session[1762]: pam_unix(sshd:session): session closed for user core
May 17 10:33:02.830929 systemd[1]: sshd@4-10.0.0.118:22-10.0.0.1:34236.service: Deactivated successfully.
May 17 10:33:02.832662 systemd[1]: session-5.scope: Deactivated successfully.
May 17 10:33:02.833321 systemd-logind[1585]: Session 5 logged out. Waiting for processes to exit.
May 17 10:33:02.836044 systemd[1]: Started sshd@5-10.0.0.118:22-10.0.0.1:34244.service - OpenSSH per-connection server daemon (10.0.0.1:34244).
May 17 10:33:02.836743 systemd-logind[1585]: Removed session 5.
May 17 10:33:02.878785 sshd[1771]: Accepted publickey for core from 10.0.0.1 port 34244 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE
May 17 10:33:02.880103 sshd-session[1771]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 17 10:33:02.884176 systemd-logind[1585]: New session 6 of user core.
May 17 10:33:02.897518 systemd[1]: Started session-6.scope - Session 6 of User core.
May 17 10:33:02.949157 sudo[1776]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
May 17 10:33:02.949455 sudo[1776]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 17 10:33:03.051817 sudo[1776]: pam_unix(sudo:session): session closed for user root
May 17 10:33:03.057992 sudo[1775]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
May 17 10:33:03.058282 sudo[1775]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 17 10:33:03.067198 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 17 10:33:03.121011 augenrules[1798]: No rules
May 17 10:33:03.122712 systemd[1]: audit-rules.service: Deactivated successfully.
May 17 10:33:03.122985 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 17 10:33:03.124109 sudo[1775]: pam_unix(sudo:session): session closed for user root
May 17 10:33:03.125553 sshd[1774]: Connection closed by 10.0.0.1 port 34244
May 17 10:33:03.125893 sshd-session[1771]: pam_unix(sshd:session): session closed for user core
May 17 10:33:03.137820 systemd[1]: sshd@5-10.0.0.118:22-10.0.0.1:34244.service: Deactivated successfully.
May 17 10:33:03.139300 systemd[1]: session-6.scope: Deactivated successfully.
May 17 10:33:03.140076 systemd-logind[1585]: Session 6 logged out. Waiting for processes to exit.
May 17 10:33:03.142870 systemd[1]: Started sshd@6-10.0.0.118:22-10.0.0.1:34248.service - OpenSSH per-connection server daemon (10.0.0.1:34248).
May 17 10:33:03.143352 systemd-logind[1585]: Removed session 6.
May 17 10:33:03.186405 sshd[1807]: Accepted publickey for core from 10.0.0.1 port 34248 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE
May 17 10:33:03.187996 sshd-session[1807]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 17 10:33:03.192170 systemd-logind[1585]: New session 7 of user core.
May 17 10:33:03.201519 systemd[1]: Started session-7.scope - Session 7 of User core.
May 17 10:33:03.255006 sudo[1810]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
May 17 10:33:03.255307 sudo[1810]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 17 10:33:03.544839 systemd[1]: Starting docker.service - Docker Application Container Engine...
May 17 10:33:03.567704 (dockerd)[1830]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
May 17 10:33:03.774112 dockerd[1830]: time="2025-05-17T10:33:03.774046904Z" level=info msg="Starting up"
May 17 10:33:03.775619 dockerd[1830]: time="2025-05-17T10:33:03.775588347Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
May 17 10:33:04.124670 dockerd[1830]: time="2025-05-17T10:33:04.124619268Z" level=info msg="Loading containers: start."
May 17 10:33:04.136418 kernel: Initializing XFRM netlink socket
May 17 10:33:04.369583 systemd-networkd[1506]: docker0: Link UP
May 17 10:33:04.375146 dockerd[1830]: time="2025-05-17T10:33:04.375061575Z" level=info msg="Loading containers: done."
May 17 10:33:04.388287 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck12275513-merged.mount: Deactivated successfully.
May 17 10:33:04.391429 dockerd[1830]: time="2025-05-17T10:33:04.391366270Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
May 17 10:33:04.391497 dockerd[1830]: time="2025-05-17T10:33:04.391472449Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
May 17 10:33:04.391634 dockerd[1830]: time="2025-05-17T10:33:04.391596121Z" level=info msg="Initializing buildkit"
May 17 10:33:04.420949 dockerd[1830]: time="2025-05-17T10:33:04.420922951Z" level=info msg="Completed buildkit initialization"
May 17 10:33:04.428039 dockerd[1830]: time="2025-05-17T10:33:04.427999960Z" level=info msg="Daemon has completed initialization"
May 17 10:33:04.428119 dockerd[1830]: time="2025-05-17T10:33:04.428075482Z" level=info msg="API listen on /run/docker.sock"
May 17 10:33:04.428246 systemd[1]: Started docker.service - Docker Application Container Engine.
May 17 10:33:05.155894 containerd[1593]: time="2025-05-17T10:33:05.155852253Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.9\""
May 17 10:33:05.744095 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1129240579.mount: Deactivated successfully.
May 17 10:33:06.576727 containerd[1593]: time="2025-05-17T10:33:06.576682509Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:06.577521 containerd[1593]: time="2025-05-17T10:33:06.577486598Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.9: active requests=0, bytes read=28078845" May 17 10:33:06.578687 containerd[1593]: time="2025-05-17T10:33:06.578659969Z" level=info msg="ImageCreate event name:\"sha256:0c19e0eafbdfffa1317cf99a16478265a4cd746ef677de27b0be6a8b515f36b1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:06.581194 containerd[1593]: time="2025-05-17T10:33:06.581147237Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5b68f0df22013422dc8fb9ddfcff513eb6fc92f9dbf8aae41555c895efef5a20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:06.582013 containerd[1593]: time="2025-05-17T10:33:06.581978767Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.9\" with image id \"sha256:0c19e0eafbdfffa1317cf99a16478265a4cd746ef677de27b0be6a8b515f36b1\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5b68f0df22013422dc8fb9ddfcff513eb6fc92f9dbf8aae41555c895efef5a20\", size \"28075645\" in 1.426089204s" May 17 10:33:06.582013 containerd[1593]: time="2025-05-17T10:33:06.582010386Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.9\" returns image reference \"sha256:0c19e0eafbdfffa1317cf99a16478265a4cd746ef677de27b0be6a8b515f36b1\"" May 17 10:33:06.582633 containerd[1593]: time="2025-05-17T10:33:06.582592729Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.9\"" May 17 10:33:07.725885 containerd[1593]: time="2025-05-17T10:33:07.725817118Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.9\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:07.728248 containerd[1593]: time="2025-05-17T10:33:07.728225336Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.9: active requests=0, bytes read=24713522" May 17 10:33:07.729631 containerd[1593]: time="2025-05-17T10:33:07.729577263Z" level=info msg="ImageCreate event name:\"sha256:6aa3d581404ae6ae5dc355cb750aaedec843d2c99263d28fce50277e8e2a6ec2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:07.732221 containerd[1593]: time="2025-05-17T10:33:07.732187511Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:be9e7987d323b38a12e28436cff6d6ec6fc31ffdd3ea11eaa9d74852e9d31248\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:07.733017 containerd[1593]: time="2025-05-17T10:33:07.732986500Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.9\" with image id \"sha256:6aa3d581404ae6ae5dc355cb750aaedec843d2c99263d28fce50277e8e2a6ec2\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:be9e7987d323b38a12e28436cff6d6ec6fc31ffdd3ea11eaa9d74852e9d31248\", size \"26315362\" in 1.150357713s" May 17 10:33:07.733056 containerd[1593]: time="2025-05-17T10:33:07.733018090Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.9\" returns image reference \"sha256:6aa3d581404ae6ae5dc355cb750aaedec843d2c99263d28fce50277e8e2a6ec2\"" May 17 10:33:07.733476 containerd[1593]: time="2025-05-17T10:33:07.733452775Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.9\"" May 17 10:33:08.952156 containerd[1593]: time="2025-05-17T10:33:08.952083579Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:08.953030 containerd[1593]: time="2025-05-17T10:33:08.952974631Z" 
level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.9: active requests=0, bytes read=18784311" May 17 10:33:08.954144 containerd[1593]: time="2025-05-17T10:33:08.954104000Z" level=info msg="ImageCreate event name:\"sha256:737ed3eafaf27a28ea9e13b736011bfed5bd349785ac6bc220b34eaf4adc51e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:08.956735 containerd[1593]: time="2025-05-17T10:33:08.956711042Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:eb358c7346bb17ab2c639c3ff8ab76a147dec7ae609f5c0c2800233e42253ed1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:08.957662 containerd[1593]: time="2025-05-17T10:33:08.957612774Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.9\" with image id \"sha256:737ed3eafaf27a28ea9e13b736011bfed5bd349785ac6bc220b34eaf4adc51e3\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:eb358c7346bb17ab2c639c3ff8ab76a147dec7ae609f5c0c2800233e42253ed1\", size \"20386169\" in 1.224130152s" May 17 10:33:08.957662 containerd[1593]: time="2025-05-17T10:33:08.957659071Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.9\" returns image reference \"sha256:737ed3eafaf27a28ea9e13b736011bfed5bd349785ac6bc220b34eaf4adc51e3\"" May 17 10:33:08.958455 containerd[1593]: time="2025-05-17T10:33:08.958421361Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.9\"" May 17 10:33:09.825904 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1493932256.mount: Deactivated successfully. May 17 10:33:10.182966 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 17 10:33:10.184570 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 17 10:33:10.707022 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 17 10:33:10.711318 (kubelet)[2122]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 17 10:33:10.738633 containerd[1593]: time="2025-05-17T10:33:10.738586664Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:10.739529 containerd[1593]: time="2025-05-17T10:33:10.739506820Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.9: active requests=0, bytes read=30355623" May 17 10:33:10.740774 containerd[1593]: time="2025-05-17T10:33:10.740753479Z" level=info msg="ImageCreate event name:\"sha256:11a47a71ed3ecf643e15a11990daed3b656279449ba9344db0b54652c4723578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:10.742730 containerd[1593]: time="2025-05-17T10:33:10.742700032Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:fdf026cf2434537e499e9c739d189ca8fc57101d929ac5ccd8e24f979a9738c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:10.743197 containerd[1593]: time="2025-05-17T10:33:10.743168942Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.9\" with image id \"sha256:11a47a71ed3ecf643e15a11990daed3b656279449ba9344db0b54652c4723578\", repo tag \"registry.k8s.io/kube-proxy:v1.31.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:fdf026cf2434537e499e9c739d189ca8fc57101d929ac5ccd8e24f979a9738c1\", size \"30354642\" in 1.784718366s" May 17 10:33:10.743251 containerd[1593]: time="2025-05-17T10:33:10.743200802Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.9\" returns image reference \"sha256:11a47a71ed3ecf643e15a11990daed3b656279449ba9344db0b54652c4723578\"" May 17 10:33:10.743818 containerd[1593]: time="2025-05-17T10:33:10.743633153Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" May 17 10:33:10.753189 kubelet[2122]: E0517 
10:33:10.753149 2122 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 17 10:33:10.759050 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 17 10:33:10.759243 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 17 10:33:10.759607 systemd[1]: kubelet.service: Consumed 215ms CPU time, 109.5M memory peak. May 17 10:33:11.271779 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1589419616.mount: Deactivated successfully. May 17 10:33:11.910590 containerd[1593]: time="2025-05-17T10:33:11.910522577Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:11.911412 containerd[1593]: time="2025-05-17T10:33:11.911349158Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" May 17 10:33:11.912540 containerd[1593]: time="2025-05-17T10:33:11.912493125Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:11.914935 containerd[1593]: time="2025-05-17T10:33:11.914886937Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:11.915844 containerd[1593]: time="2025-05-17T10:33:11.915815088Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag 
\"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.172153272s" May 17 10:33:11.915844 containerd[1593]: time="2025-05-17T10:33:11.915842349Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" May 17 10:33:11.916317 containerd[1593]: time="2025-05-17T10:33:11.916281744Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 17 10:33:12.376799 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount877158086.mount: Deactivated successfully. May 17 10:33:12.382305 containerd[1593]: time="2025-05-17T10:33:12.382264437Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 17 10:33:12.382964 containerd[1593]: time="2025-05-17T10:33:12.382934965Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" May 17 10:33:12.384052 containerd[1593]: time="2025-05-17T10:33:12.384022346Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 17 10:33:12.385881 containerd[1593]: time="2025-05-17T10:33:12.385830278Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 17 10:33:12.386378 containerd[1593]: time="2025-05-17T10:33:12.386339904Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id 
\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 470.029276ms" May 17 10:33:12.386378 containerd[1593]: time="2025-05-17T10:33:12.386371504Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 17 10:33:12.386844 containerd[1593]: time="2025-05-17T10:33:12.386808313Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" May 17 10:33:12.998751 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3609830166.mount: Deactivated successfully. May 17 10:33:14.622562 containerd[1593]: time="2025-05-17T10:33:14.622507342Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:14.623330 containerd[1593]: time="2025-05-17T10:33:14.623279651Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780013" May 17 10:33:14.624551 containerd[1593]: time="2025-05-17T10:33:14.624517384Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:14.627143 containerd[1593]: time="2025-05-17T10:33:14.627111632Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:14.627985 containerd[1593]: time="2025-05-17T10:33:14.627925018Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest 
\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.241084144s" May 17 10:33:14.627985 containerd[1593]: time="2025-05-17T10:33:14.627965003Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" May 17 10:33:16.913418 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 17 10:33:16.913584 systemd[1]: kubelet.service: Consumed 215ms CPU time, 109.5M memory peak. May 17 10:33:16.915629 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 17 10:33:16.938499 systemd[1]: Reload requested from client PID 2271 ('systemctl') (unit session-7.scope)... May 17 10:33:16.938514 systemd[1]: Reloading... May 17 10:33:17.025421 zram_generator::config[2320]: No configuration found. May 17 10:33:17.104662 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 17 10:33:17.217181 systemd[1]: Reloading finished in 278 ms. May 17 10:33:17.278013 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 17 10:33:17.278106 systemd[1]: kubelet.service: Failed with result 'signal'. May 17 10:33:17.278444 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 17 10:33:17.278488 systemd[1]: kubelet.service: Consumed 141ms CPU time, 98.3M memory peak. May 17 10:33:17.280054 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 17 10:33:17.436893 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 17 10:33:17.440640 (kubelet)[2362]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 17 10:33:17.474684 kubelet[2362]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 17 10:33:17.474684 kubelet[2362]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 17 10:33:17.474684 kubelet[2362]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 17 10:33:17.474684 kubelet[2362]: I0517 10:33:17.474660 2362 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 17 10:33:17.627122 kubelet[2362]: I0517 10:33:17.627080 2362 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" May 17 10:33:17.627122 kubelet[2362]: I0517 10:33:17.627108 2362 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 17 10:33:17.627366 kubelet[2362]: I0517 10:33:17.627344 2362 server.go:934] "Client rotation is on, will bootstrap in background" May 17 10:33:17.652715 kubelet[2362]: E0517 10:33:17.652152 2362 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.118:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" May 17 10:33:17.652842 kubelet[2362]: I0517 
10:33:17.652814 2362 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 17 10:33:17.658604 kubelet[2362]: I0517 10:33:17.658584 2362 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 17 10:33:17.664506 kubelet[2362]: I0517 10:33:17.664485 2362 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 17 10:33:17.665023 kubelet[2362]: I0517 10:33:17.665004 2362 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 17 10:33:17.665163 kubelet[2362]: I0517 10:33:17.665132 2362 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 17 10:33:17.665343 kubelet[2362]: I0517 10:33:17.665159 2362 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":
{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 17 10:33:17.665448 kubelet[2362]: I0517 10:33:17.665356 2362 topology_manager.go:138] "Creating topology manager with none policy" May 17 10:33:17.665448 kubelet[2362]: I0517 10:33:17.665363 2362 container_manager_linux.go:300] "Creating device plugin manager" May 17 10:33:17.665496 kubelet[2362]: I0517 10:33:17.665478 2362 state_mem.go:36] "Initialized new in-memory state store" May 17 10:33:17.667318 kubelet[2362]: I0517 10:33:17.667295 2362 kubelet.go:408] "Attempting to sync node with API server" May 17 10:33:17.667318 kubelet[2362]: I0517 10:33:17.667318 2362 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 17 10:33:17.667408 kubelet[2362]: I0517 10:33:17.667361 2362 kubelet.go:314] "Adding apiserver pod source" May 17 10:33:17.667408 kubelet[2362]: I0517 10:33:17.667379 2362 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 17 10:33:17.670245 kubelet[2362]: I0517 10:33:17.670220 2362 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 17 10:33:17.671039 kubelet[2362]: I0517 10:33:17.670639 2362 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 17 10:33:17.671689 kubelet[2362]: W0517 10:33:17.671662 2362 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
May 17 10:33:17.672430 kubelet[2362]: W0517 10:33:17.672369 2362 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.118:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused May 17 10:33:17.672498 kubelet[2362]: E0517 10:33:17.672440 2362 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.118:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" May 17 10:33:17.672533 kubelet[2362]: W0517 10:33:17.672502 2362 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.118:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused May 17 10:33:17.672533 kubelet[2362]: E0517 10:33:17.672525 2362 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.118:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" May 17 10:33:17.673685 kubelet[2362]: I0517 10:33:17.673662 2362 server.go:1274] "Started kubelet" May 17 10:33:17.676210 kubelet[2362]: I0517 10:33:17.674575 2362 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 17 10:33:17.676210 kubelet[2362]: I0517 10:33:17.675843 2362 server.go:449] "Adding debug handlers to kubelet server" May 17 10:33:17.676210 kubelet[2362]: I0517 10:33:17.676049 2362 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 17 10:33:17.677568 kubelet[2362]: I0517 10:33:17.677540 2362 ratelimit.go:55] "Setting 
rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 17 10:33:17.677703 kubelet[2362]: I0517 10:33:17.677682 2362 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 17 10:33:17.677857 kubelet[2362]: I0517 10:33:17.677845 2362 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 17 10:33:17.678245 kubelet[2362]: I0517 10:33:17.678230 2362 factory.go:221] Registration of the systemd container factory successfully May 17 10:33:17.678389 kubelet[2362]: I0517 10:33:17.678373 2362 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 17 10:33:17.679308 kubelet[2362]: I0517 10:33:17.679295 2362 factory.go:221] Registration of the containerd container factory successfully May 17 10:33:17.680527 kubelet[2362]: I0517 10:33:17.680497 2362 volume_manager.go:289] "Starting Kubelet Volume Manager" May 17 10:33:17.680596 kubelet[2362]: I0517 10:33:17.680585 2362 desired_state_of_world_populator.go:147] "Desired state populator starts to run" May 17 10:33:17.680633 kubelet[2362]: I0517 10:33:17.680623 2362 reconciler.go:26] "Reconciler: start to sync state" May 17 10:33:17.680754 kubelet[2362]: E0517 10:33:17.680736 2362 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 17 10:33:17.685480 kubelet[2362]: W0517 10:33:17.685370 2362 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.118:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused May 17 10:33:17.685480 kubelet[2362]: E0517 10:33:17.685455 2362 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.118:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" May 17 10:33:17.685761 kubelet[2362]: E0517 10:33:17.685742 2362 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.118:6443: connect: connection refused" interval="200ms" May 17 10:33:17.686484 kubelet[2362]: E0517 10:33:17.685380 2362 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.118:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.118:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.184049faf825e9d9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-17 10:33:17.673638361 +0000 UTC m=+0.229689668,LastTimestamp:2025-05-17 10:33:17.673638361 +0000 UTC m=+0.229689668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" May 17 10:33:17.690564 kubelet[2362]: I0517 10:33:17.690540 2362 cpu_manager.go:214] "Starting CPU manager" policy="none" May 17 10:33:17.690564 kubelet[2362]: I0517 10:33:17.690557 2362 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 17 10:33:17.690635 kubelet[2362]: I0517 10:33:17.690571 2362 state_mem.go:36] "Initialized new in-memory state store" May 17 10:33:17.781691 kubelet[2362]: E0517 10:33:17.781603 2362 kubelet_node_status.go:453] "Error getting the current node from lister" err="node 
\"localhost\" not found" May 17 10:33:17.881967 kubelet[2362]: E0517 10:33:17.881951 2362 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 17 10:33:17.886668 kubelet[2362]: E0517 10:33:17.886615 2362 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.118:6443: connect: connection refused" interval="400ms" May 17 10:33:17.982545 kubelet[2362]: E0517 10:33:17.982491 2362 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 17 10:33:18.051043 kubelet[2362]: I0517 10:33:18.050901 2362 policy_none.go:49] "None policy: Start" May 17 10:33:18.051635 kubelet[2362]: I0517 10:33:18.051602 2362 memory_manager.go:170] "Starting memorymanager" policy="None" May 17 10:33:18.051635 kubelet[2362]: I0517 10:33:18.051622 2362 state_mem.go:35] "Initializing new in-memory state store" May 17 10:33:18.055501 kubelet[2362]: I0517 10:33:18.055452 2362 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 17 10:33:18.057036 kubelet[2362]: I0517 10:33:18.056666 2362 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 17 10:33:18.057036 kubelet[2362]: I0517 10:33:18.056795 2362 status_manager.go:217] "Starting to sync pod status with apiserver" May 17 10:33:18.057036 kubelet[2362]: I0517 10:33:18.056814 2362 kubelet.go:2321] "Starting kubelet main sync loop" May 17 10:33:18.057637 kubelet[2362]: E0517 10:33:18.057609 2362 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 17 10:33:18.058824 kubelet[2362]: W0517 10:33:18.058794 2362 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.118:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused May 17 10:33:18.058870 kubelet[2362]: E0517 10:33:18.058830 2362 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.118:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" May 17 10:33:18.059195 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 17 10:33:18.076053 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 17 10:33:18.079528 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
May 17 10:33:18.083541 kubelet[2362]: E0517 10:33:18.083511 2362 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 17 10:33:18.103697 kubelet[2362]: I0517 10:33:18.103675 2362 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 17 10:33:18.103939 kubelet[2362]: I0517 10:33:18.103902 2362 eviction_manager.go:189] "Eviction manager: starting control loop" May 17 10:33:18.103981 kubelet[2362]: I0517 10:33:18.103918 2362 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 17 10:33:18.104414 kubelet[2362]: I0517 10:33:18.104133 2362 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 17 10:33:18.105040 kubelet[2362]: E0517 10:33:18.105023 2362 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" May 17 10:33:18.166571 systemd[1]: Created slice kubepods-burstable-podaa613cf2a3aa1631033356dcc03a04b5.slice - libcontainer container kubepods-burstable-podaa613cf2a3aa1631033356dcc03a04b5.slice. 
May 17 10:33:18.184147 kubelet[2362]: I0517 10:33:18.184118 2362 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ea5884ad3481d5218ff4c8f11f2934d5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"ea5884ad3481d5218ff4c8f11f2934d5\") " pod="kube-system/kube-scheduler-localhost" May 17 10:33:18.184147 kubelet[2362]: I0517 10:33:18.184144 2362 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aa613cf2a3aa1631033356dcc03a04b5-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"aa613cf2a3aa1631033356dcc03a04b5\") " pod="kube-system/kube-apiserver-localhost" May 17 10:33:18.184301 kubelet[2362]: I0517 10:33:18.184161 2362 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aa613cf2a3aa1631033356dcc03a04b5-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"aa613cf2a3aa1631033356dcc03a04b5\") " pod="kube-system/kube-apiserver-localhost" May 17 10:33:18.184301 kubelet[2362]: I0517 10:33:18.184175 2362 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 17 10:33:18.184301 kubelet[2362]: I0517 10:33:18.184187 2362 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 17 10:33:18.184301 
kubelet[2362]: I0517 10:33:18.184200 2362 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aa613cf2a3aa1631033356dcc03a04b5-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"aa613cf2a3aa1631033356dcc03a04b5\") " pod="kube-system/kube-apiserver-localhost" May 17 10:33:18.184301 kubelet[2362]: I0517 10:33:18.184213 2362 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 17 10:33:18.184446 kubelet[2362]: I0517 10:33:18.184226 2362 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 17 10:33:18.184446 kubelet[2362]: I0517 10:33:18.184238 2362 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 17 10:33:18.195197 systemd[1]: Created slice kubepods-burstable-poda3416600bab1918b24583836301c9096.slice - libcontainer container kubepods-burstable-poda3416600bab1918b24583836301c9096.slice. May 17 10:33:18.199179 systemd[1]: Created slice kubepods-burstable-podea5884ad3481d5218ff4c8f11f2934d5.slice - libcontainer container kubepods-burstable-podea5884ad3481d5218ff4c8f11f2934d5.slice. 
May 17 10:33:18.205287 kubelet[2362]: I0517 10:33:18.205248 2362 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 17 10:33:18.205619 kubelet[2362]: E0517 10:33:18.205580 2362 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.118:6443/api/v1/nodes\": dial tcp 10.0.0.118:6443: connect: connection refused" node="localhost" May 17 10:33:18.287592 kubelet[2362]: E0517 10:33:18.287542 2362 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.118:6443: connect: connection refused" interval="800ms" May 17 10:33:18.406902 kubelet[2362]: I0517 10:33:18.406830 2362 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 17 10:33:18.407024 kubelet[2362]: E0517 10:33:18.407004 2362 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.118:6443/api/v1/nodes\": dial tcp 10.0.0.118:6443: connect: connection refused" node="localhost" May 17 10:33:18.492356 containerd[1593]: time="2025-05-17T10:33:18.492299449Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:aa613cf2a3aa1631033356dcc03a04b5,Namespace:kube-system,Attempt:0,}" May 17 10:33:18.498123 containerd[1593]: time="2025-05-17T10:33:18.498049339Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a3416600bab1918b24583836301c9096,Namespace:kube-system,Attempt:0,}" May 17 10:33:18.501615 containerd[1593]: time="2025-05-17T10:33:18.501583610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:ea5884ad3481d5218ff4c8f11f2934d5,Namespace:kube-system,Attempt:0,}" May 17 10:33:18.522916 containerd[1593]: time="2025-05-17T10:33:18.522866677Z" level=info msg="connecting to shim 
7a014ebc56b9b7a1baac8ee18ae7ad3fa9ccac381a065419f1ac04acac52d9f9" address="unix:///run/containerd/s/7a0772854e819567dd174ad2b884f7606c2387d62096bdf92f244dff4db2ae7e" namespace=k8s.io protocol=ttrpc version=3 May 17 10:33:18.547562 systemd[1]: Started cri-containerd-7a014ebc56b9b7a1baac8ee18ae7ad3fa9ccac381a065419f1ac04acac52d9f9.scope - libcontainer container 7a014ebc56b9b7a1baac8ee18ae7ad3fa9ccac381a065419f1ac04acac52d9f9. May 17 10:33:18.727288 kubelet[2362]: W0517 10:33:18.727211 2362 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.118:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused May 17 10:33:18.727288 kubelet[2362]: E0517 10:33:18.727266 2362 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.118:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" May 17 10:33:18.795260 kubelet[2362]: W0517 10:33:18.795183 2362 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.118:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused May 17 10:33:18.795260 kubelet[2362]: E0517 10:33:18.795253 2362 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.118:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" May 17 10:33:18.810408 kubelet[2362]: I0517 10:33:18.809840 2362 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 17 10:33:18.810408 
kubelet[2362]: E0517 10:33:18.810151 2362 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.118:6443/api/v1/nodes\": dial tcp 10.0.0.118:6443: connect: connection refused" node="localhost" May 17 10:33:18.889356 kubelet[2362]: W0517 10:33:18.889269 2362 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.118:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused May 17 10:33:18.889356 kubelet[2362]: E0517 10:33:18.889358 2362 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.118:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" May 17 10:33:19.088578 kubelet[2362]: E0517 10:33:19.088445 2362 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.118:6443: connect: connection refused" interval="1.6s" May 17 10:33:19.408036 kubelet[2362]: E0517 10:33:19.407843 2362 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.118:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.118:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.184049faf825e9d9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-17 10:33:17.673638361 +0000 UTC m=+0.229689668,LastTimestamp:2025-05-17 10:33:17.673638361 +0000 UTC 
m=+0.229689668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" May 17 10:33:19.612007 kubelet[2362]: I0517 10:33:19.611962 2362 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 17 10:33:19.612375 kubelet[2362]: E0517 10:33:19.612325 2362 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.118:6443/api/v1/nodes\": dial tcp 10.0.0.118:6443: connect: connection refused" node="localhost" May 17 10:33:19.652977 kubelet[2362]: W0517 10:33:19.652921 2362 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.118:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused May 17 10:33:19.653034 kubelet[2362]: E0517 10:33:19.652982 2362 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.118:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" May 17 10:33:19.663979 kubelet[2362]: E0517 10:33:19.663901 2362 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.118:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.118:6443: connect: connection refused" logger="UnhandledError" May 17 10:33:19.766197 containerd[1593]: time="2025-05-17T10:33:19.766147594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:aa613cf2a3aa1631033356dcc03a04b5,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"7a014ebc56b9b7a1baac8ee18ae7ad3fa9ccac381a065419f1ac04acac52d9f9\"" May 17 10:33:19.768874 containerd[1593]: time="2025-05-17T10:33:19.768840166Z" level=info msg="CreateContainer within sandbox \"7a014ebc56b9b7a1baac8ee18ae7ad3fa9ccac381a065419f1ac04acac52d9f9\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 17 10:33:19.886182 containerd[1593]: time="2025-05-17T10:33:19.886139470Z" level=info msg="connecting to shim 488718e7a8145a8256ff1849d15f1f5a79ad4e9a7bd4e094864ed06ac0159803" address="unix:///run/containerd/s/a714d67763143d49d3eaa5ec2aea13439cb1109d5192840192f8f223455cdd3c" namespace=k8s.io protocol=ttrpc version=3 May 17 10:33:19.912521 systemd[1]: Started cri-containerd-488718e7a8145a8256ff1849d15f1f5a79ad4e9a7bd4e094864ed06ac0159803.scope - libcontainer container 488718e7a8145a8256ff1849d15f1f5a79ad4e9a7bd4e094864ed06ac0159803. May 17 10:33:19.956598 containerd[1593]: time="2025-05-17T10:33:19.956556803Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a3416600bab1918b24583836301c9096,Namespace:kube-system,Attempt:0,} returns sandbox id \"488718e7a8145a8256ff1849d15f1f5a79ad4e9a7bd4e094864ed06ac0159803\"" May 17 10:33:19.958900 containerd[1593]: time="2025-05-17T10:33:19.958859624Z" level=info msg="CreateContainer within sandbox \"488718e7a8145a8256ff1849d15f1f5a79ad4e9a7bd4e094864ed06ac0159803\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 17 10:33:19.964714 containerd[1593]: time="2025-05-17T10:33:19.964679104Z" level=info msg="Container 80edbc24d5c62541b91b8c1bc3854d43e797b14b45d4f6f268150519f2aa476f: CDI devices from CRI Config.CDIDevices: []" May 17 10:33:19.969492 containerd[1593]: time="2025-05-17T10:33:19.969466327Z" level=info msg="connecting to shim 7b2393a731ea25ad207c36d19eea9573f09e43805033cc6ac6403b3afa6f31c7" address="unix:///run/containerd/s/02e9b1eb51c4dcb2b627831934e4a6a45ae6b4e08c01a42cf1dbc6ed7b753d30" namespace=k8s.io 
protocol=ttrpc version=3 May 17 10:33:19.975883 containerd[1593]: time="2025-05-17T10:33:19.975840598Z" level=info msg="CreateContainer within sandbox \"7a014ebc56b9b7a1baac8ee18ae7ad3fa9ccac381a065419f1ac04acac52d9f9\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"80edbc24d5c62541b91b8c1bc3854d43e797b14b45d4f6f268150519f2aa476f\"" May 17 10:33:19.976567 containerd[1593]: time="2025-05-17T10:33:19.976489916Z" level=info msg="StartContainer for \"80edbc24d5c62541b91b8c1bc3854d43e797b14b45d4f6f268150519f2aa476f\"" May 17 10:33:19.977455 containerd[1593]: time="2025-05-17T10:33:19.977427686Z" level=info msg="connecting to shim 80edbc24d5c62541b91b8c1bc3854d43e797b14b45d4f6f268150519f2aa476f" address="unix:///run/containerd/s/7a0772854e819567dd174ad2b884f7606c2387d62096bdf92f244dff4db2ae7e" protocol=ttrpc version=3 May 17 10:33:19.978710 containerd[1593]: time="2025-05-17T10:33:19.978430598Z" level=info msg="Container 8fa97dfd0c95f143193ee2295442983c386202c9b51edd71cbe4a3fdb0c75641: CDI devices from CRI Config.CDIDevices: []" May 17 10:33:19.987147 containerd[1593]: time="2025-05-17T10:33:19.987108251Z" level=info msg="CreateContainer within sandbox \"488718e7a8145a8256ff1849d15f1f5a79ad4e9a7bd4e094864ed06ac0159803\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"8fa97dfd0c95f143193ee2295442983c386202c9b51edd71cbe4a3fdb0c75641\"" May 17 10:33:19.987831 containerd[1593]: time="2025-05-17T10:33:19.987796342Z" level=info msg="StartContainer for \"8fa97dfd0c95f143193ee2295442983c386202c9b51edd71cbe4a3fdb0c75641\"" May 17 10:33:19.989190 containerd[1593]: time="2025-05-17T10:33:19.989143019Z" level=info msg="connecting to shim 8fa97dfd0c95f143193ee2295442983c386202c9b51edd71cbe4a3fdb0c75641" address="unix:///run/containerd/s/a714d67763143d49d3eaa5ec2aea13439cb1109d5192840192f8f223455cdd3c" protocol=ttrpc version=3 May 17 10:33:19.995535 systemd[1]: Started 
cri-containerd-7b2393a731ea25ad207c36d19eea9573f09e43805033cc6ac6403b3afa6f31c7.scope - libcontainer container 7b2393a731ea25ad207c36d19eea9573f09e43805033cc6ac6403b3afa6f31c7. May 17 10:33:19.999170 systemd[1]: Started cri-containerd-80edbc24d5c62541b91b8c1bc3854d43e797b14b45d4f6f268150519f2aa476f.scope - libcontainer container 80edbc24d5c62541b91b8c1bc3854d43e797b14b45d4f6f268150519f2aa476f. May 17 10:33:20.009506 systemd[1]: Started cri-containerd-8fa97dfd0c95f143193ee2295442983c386202c9b51edd71cbe4a3fdb0c75641.scope - libcontainer container 8fa97dfd0c95f143193ee2295442983c386202c9b51edd71cbe4a3fdb0c75641. May 17 10:33:20.050254 containerd[1593]: time="2025-05-17T10:33:20.050158350Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:ea5884ad3481d5218ff4c8f11f2934d5,Namespace:kube-system,Attempt:0,} returns sandbox id \"7b2393a731ea25ad207c36d19eea9573f09e43805033cc6ac6403b3afa6f31c7\"" May 17 10:33:20.053126 containerd[1593]: time="2025-05-17T10:33:20.052893192Z" level=info msg="CreateContainer within sandbox \"7b2393a731ea25ad207c36d19eea9573f09e43805033cc6ac6403b3afa6f31c7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 17 10:33:20.058928 containerd[1593]: time="2025-05-17T10:33:20.058897528Z" level=info msg="StartContainer for \"80edbc24d5c62541b91b8c1bc3854d43e797b14b45d4f6f268150519f2aa476f\" returns successfully" May 17 10:33:20.063990 containerd[1593]: time="2025-05-17T10:33:20.063959447Z" level=info msg="StartContainer for \"8fa97dfd0c95f143193ee2295442983c386202c9b51edd71cbe4a3fdb0c75641\" returns successfully" May 17 10:33:20.068587 containerd[1593]: time="2025-05-17T10:33:20.067758796Z" level=info msg="Container 49a1111ba626b5fb287d5d6e1b9f5547862d4b7496d1bed11f89abc773fab2f0: CDI devices from CRI Config.CDIDevices: []" May 17 10:33:20.077716 containerd[1593]: time="2025-05-17T10:33:20.077683739Z" level=info msg="CreateContainer within sandbox 
\"7b2393a731ea25ad207c36d19eea9573f09e43805033cc6ac6403b3afa6f31c7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"49a1111ba626b5fb287d5d6e1b9f5547862d4b7496d1bed11f89abc773fab2f0\"" May 17 10:33:20.078366 containerd[1593]: time="2025-05-17T10:33:20.078350200Z" level=info msg="StartContainer for \"49a1111ba626b5fb287d5d6e1b9f5547862d4b7496d1bed11f89abc773fab2f0\"" May 17 10:33:20.079467 containerd[1593]: time="2025-05-17T10:33:20.079447910Z" level=info msg="connecting to shim 49a1111ba626b5fb287d5d6e1b9f5547862d4b7496d1bed11f89abc773fab2f0" address="unix:///run/containerd/s/02e9b1eb51c4dcb2b627831934e4a6a45ae6b4e08c01a42cf1dbc6ed7b753d30" protocol=ttrpc version=3 May 17 10:33:20.111795 systemd[1]: Started cri-containerd-49a1111ba626b5fb287d5d6e1b9f5547862d4b7496d1bed11f89abc773fab2f0.scope - libcontainer container 49a1111ba626b5fb287d5d6e1b9f5547862d4b7496d1bed11f89abc773fab2f0. May 17 10:33:20.164794 containerd[1593]: time="2025-05-17T10:33:20.164463175Z" level=info msg="StartContainer for \"49a1111ba626b5fb287d5d6e1b9f5547862d4b7496d1bed11f89abc773fab2f0\" returns successfully" May 17 10:33:20.993957 kubelet[2362]: E0517 10:33:20.993908 2362 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" May 17 10:33:21.214468 kubelet[2362]: I0517 10:33:21.214439 2362 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 17 10:33:21.220942 kubelet[2362]: I0517 10:33:21.220906 2362 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 17 10:33:21.669779 kubelet[2362]: I0517 10:33:21.669695 2362 apiserver.go:52] "Watching apiserver" May 17 10:33:21.681559 kubelet[2362]: I0517 10:33:21.681505 2362 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" May 17 10:33:23.894882 systemd[1]: Reload requested from client PID 2634 ('systemctl') (unit session-7.scope)... 
May 17 10:33:23.894899 systemd[1]: Reloading... May 17 10:33:23.975458 zram_generator::config[2677]: No configuration found. May 17 10:33:24.284608 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 17 10:33:24.412893 systemd[1]: Reloading finished in 517 ms. May 17 10:33:24.439506 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 17 10:33:24.460620 systemd[1]: kubelet.service: Deactivated successfully. May 17 10:33:24.460897 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 17 10:33:24.460950 systemd[1]: kubelet.service: Consumed 686ms CPU time, 132.1M memory peak. May 17 10:33:24.462766 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 17 10:33:24.655991 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 17 10:33:24.660178 (kubelet)[2722]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 17 10:33:24.694996 kubelet[2722]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 17 10:33:24.694996 kubelet[2722]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 17 10:33:24.694996 kubelet[2722]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 17 10:33:24.695380 kubelet[2722]: I0517 10:33:24.695052 2722 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 17 10:33:24.700627 kubelet[2722]: I0517 10:33:24.700583 2722 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" May 17 10:33:24.700627 kubelet[2722]: I0517 10:33:24.700611 2722 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 17 10:33:24.702179 kubelet[2722]: I0517 10:33:24.702149 2722 server.go:934] "Client rotation is on, will bootstrap in background" May 17 10:33:24.703409 kubelet[2722]: I0517 10:33:24.703285 2722 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 17 10:33:24.706413 kubelet[2722]: I0517 10:33:24.705308 2722 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 17 10:33:24.709244 kubelet[2722]: I0517 10:33:24.709219 2722 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 17 10:33:24.714325 kubelet[2722]: I0517 10:33:24.714297 2722 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 17 10:33:24.714469 kubelet[2722]: I0517 10:33:24.714452 2722 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 17 10:33:24.714628 kubelet[2722]: I0517 10:33:24.714595 2722 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 17 10:33:24.714803 kubelet[2722]: I0517 10:33:24.714628 2722 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyO
ptions":null,"CgroupVersion":2} May 17 10:33:24.714880 kubelet[2722]: I0517 10:33:24.714813 2722 topology_manager.go:138] "Creating topology manager with none policy" May 17 10:33:24.714880 kubelet[2722]: I0517 10:33:24.714821 2722 container_manager_linux.go:300] "Creating device plugin manager" May 17 10:33:24.714880 kubelet[2722]: I0517 10:33:24.714848 2722 state_mem.go:36] "Initialized new in-memory state store" May 17 10:33:24.716409 kubelet[2722]: I0517 10:33:24.714955 2722 kubelet.go:408] "Attempting to sync node with API server" May 17 10:33:24.716409 kubelet[2722]: I0517 10:33:24.714967 2722 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 17 10:33:24.716409 kubelet[2722]: I0517 10:33:24.715686 2722 kubelet.go:314] "Adding apiserver pod source" May 17 10:33:24.716409 kubelet[2722]: I0517 10:33:24.715697 2722 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 17 10:33:24.721052 kubelet[2722]: I0517 10:33:24.721020 2722 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 17 10:33:24.721444 kubelet[2722]: I0517 10:33:24.721426 2722 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 17 10:33:24.721856 kubelet[2722]: I0517 10:33:24.721840 2722 server.go:1274] "Started kubelet" May 17 10:33:24.727769 kubelet[2722]: I0517 10:33:24.727730 2722 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 17 10:33:24.728746 kubelet[2722]: I0517 10:33:24.728730 2722 server.go:449] "Adding debug handlers to kubelet server" May 17 10:33:24.729666 kubelet[2722]: I0517 10:33:24.729638 2722 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 17 10:33:24.729847 kubelet[2722]: I0517 10:33:24.729831 2722 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 17 10:33:24.730707 
kubelet[2722]: E0517 10:33:24.730691 2722 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 17 10:33:24.737415 kubelet[2722]: I0517 10:33:24.735569 2722 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 17 10:33:24.737415 kubelet[2722]: I0517 10:33:24.735678 2722 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 17 10:33:24.737415 kubelet[2722]: I0517 10:33:24.737116 2722 volume_manager.go:289] "Starting Kubelet Volume Manager" May 17 10:33:24.737415 kubelet[2722]: I0517 10:33:24.737193 2722 desired_state_of_world_populator.go:147] "Desired state populator starts to run" May 17 10:33:24.737415 kubelet[2722]: I0517 10:33:24.737311 2722 reconciler.go:26] "Reconciler: start to sync state" May 17 10:33:24.739040 kubelet[2722]: I0517 10:33:24.739014 2722 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 17 10:33:24.740734 kubelet[2722]: I0517 10:33:24.740721 2722 factory.go:221] Registration of the containerd container factory successfully May 17 10:33:24.740807 kubelet[2722]: I0517 10:33:24.740798 2722 factory.go:221] Registration of the systemd container factory successfully May 17 10:33:24.747669 kubelet[2722]: I0517 10:33:24.747642 2722 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 17 10:33:24.748968 kubelet[2722]: I0517 10:33:24.748943 2722 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 17 10:33:24.749044 kubelet[2722]: I0517 10:33:24.749036 2722 status_manager.go:217] "Starting to sync pod status with apiserver" May 17 10:33:24.749108 kubelet[2722]: I0517 10:33:24.749099 2722 kubelet.go:2321] "Starting kubelet main sync loop" May 17 10:33:24.749195 kubelet[2722]: E0517 10:33:24.749178 2722 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 17 10:33:24.776025 kubelet[2722]: I0517 10:33:24.775993 2722 cpu_manager.go:214] "Starting CPU manager" policy="none" May 17 10:33:24.776025 kubelet[2722]: I0517 10:33:24.776013 2722 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 17 10:33:24.776025 kubelet[2722]: I0517 10:33:24.776035 2722 state_mem.go:36] "Initialized new in-memory state store" May 17 10:33:24.776219 kubelet[2722]: I0517 10:33:24.776164 2722 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 17 10:33:24.776219 kubelet[2722]: I0517 10:33:24.776181 2722 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 17 10:33:24.776219 kubelet[2722]: I0517 10:33:24.776198 2722 policy_none.go:49] "None policy: Start" May 17 10:33:24.776665 kubelet[2722]: I0517 10:33:24.776648 2722 memory_manager.go:170] "Starting memorymanager" policy="None" May 17 10:33:24.776665 kubelet[2722]: I0517 10:33:24.776667 2722 state_mem.go:35] "Initializing new in-memory state store" May 17 10:33:24.776779 kubelet[2722]: I0517 10:33:24.776766 2722 state_mem.go:75] "Updated machine memory state" May 17 10:33:24.781720 kubelet[2722]: I0517 10:33:24.781690 2722 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 17 10:33:24.781886 kubelet[2722]: I0517 10:33:24.781871 2722 eviction_manager.go:189] "Eviction manager: starting control loop" May 17 10:33:24.781916 kubelet[2722]: I0517 10:33:24.781886 2722 container_log_manager.go:189] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 17 10:33:24.782087 kubelet[2722]: I0517 10:33:24.782069 2722 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 17 10:33:24.885911 kubelet[2722]: I0517 10:33:24.885883 2722 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 17 10:33:24.971861 kubelet[2722]: E0517 10:33:24.971812 2722 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" May 17 10:33:25.015099 kubelet[2722]: I0517 10:33:25.015070 2722 kubelet_node_status.go:111] "Node was previously registered" node="localhost" May 17 10:33:25.015259 kubelet[2722]: I0517 10:33:25.015147 2722 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 17 10:33:25.038565 kubelet[2722]: I0517 10:33:25.038536 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aa613cf2a3aa1631033356dcc03a04b5-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"aa613cf2a3aa1631033356dcc03a04b5\") " pod="kube-system/kube-apiserver-localhost" May 17 10:33:25.038565 kubelet[2722]: I0517 10:33:25.038565 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aa613cf2a3aa1631033356dcc03a04b5-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"aa613cf2a3aa1631033356dcc03a04b5\") " pod="kube-system/kube-apiserver-localhost" May 17 10:33:25.038718 kubelet[2722]: I0517 10:33:25.038586 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " 
pod="kube-system/kube-controller-manager-localhost" May 17 10:33:25.038718 kubelet[2722]: I0517 10:33:25.038601 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 17 10:33:25.038718 kubelet[2722]: I0517 10:33:25.038616 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 17 10:33:25.038718 kubelet[2722]: I0517 10:33:25.038632 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aa613cf2a3aa1631033356dcc03a04b5-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"aa613cf2a3aa1631033356dcc03a04b5\") " pod="kube-system/kube-apiserver-localhost" May 17 10:33:25.038718 kubelet[2722]: I0517 10:33:25.038647 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 17 10:33:25.038841 kubelet[2722]: I0517 10:33:25.038665 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: 
\"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 17 10:33:25.038841 kubelet[2722]: I0517 10:33:25.038686 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ea5884ad3481d5218ff4c8f11f2934d5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"ea5884ad3481d5218ff4c8f11f2934d5\") " pod="kube-system/kube-scheduler-localhost" May 17 10:33:25.717542 kubelet[2722]: I0517 10:33:25.717492 2722 apiserver.go:52] "Watching apiserver" May 17 10:33:25.738097 kubelet[2722]: I0517 10:33:25.738067 2722 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" May 17 10:33:25.780163 kubelet[2722]: I0517 10:33:25.780101 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.780085116 podStartE2EDuration="3.780085116s" podCreationTimestamp="2025-05-17 10:33:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-17 10:33:25.77785956 +0000 UTC m=+1.114260150" watchObservedRunningTime="2025-05-17 10:33:25.780085116 +0000 UTC m=+1.116485707" May 17 10:33:25.784448 kubelet[2722]: I0517 10:33:25.784371 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.784352143 podStartE2EDuration="1.784352143s" podCreationTimestamp="2025-05-17 10:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-17 10:33:25.784334139 +0000 UTC m=+1.120734719" watchObservedRunningTime="2025-05-17 10:33:25.784352143 +0000 UTC m=+1.120752733" May 17 10:33:29.029712 kubelet[2722]: I0517 10:33:29.029659 2722 kuberuntime_manager.go:1635] "Updating runtime config through cri with 
podcidr" CIDR="192.168.0.0/24" May 17 10:33:29.030147 containerd[1593]: time="2025-05-17T10:33:29.030011150Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 17 10:33:29.030387 kubelet[2722]: I0517 10:33:29.030192 2722 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 17 10:33:29.679508 kubelet[2722]: I0517 10:33:29.679409 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=5.679373201 podStartE2EDuration="5.679373201s" podCreationTimestamp="2025-05-17 10:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-17 10:33:25.793080371 +0000 UTC m=+1.129480961" watchObservedRunningTime="2025-05-17 10:33:29.679373201 +0000 UTC m=+5.015773781" May 17 10:33:29.686653 systemd[1]: Created slice kubepods-besteffort-pod660c180f_cc97_4d42_95c5_5d71925417d7.slice - libcontainer container kubepods-besteffort-pod660c180f_cc97_4d42_95c5_5d71925417d7.slice. 
May 17 10:33:29.771469 kubelet[2722]: I0517 10:33:29.771436 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/660c180f-cc97-4d42-95c5-5d71925417d7-xtables-lock\") pod \"kube-proxy-rh55r\" (UID: \"660c180f-cc97-4d42-95c5-5d71925417d7\") " pod="kube-system/kube-proxy-rh55r" May 17 10:33:29.771469 kubelet[2722]: I0517 10:33:29.771467 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/660c180f-cc97-4d42-95c5-5d71925417d7-lib-modules\") pod \"kube-proxy-rh55r\" (UID: \"660c180f-cc97-4d42-95c5-5d71925417d7\") " pod="kube-system/kube-proxy-rh55r" May 17 10:33:29.771645 kubelet[2722]: I0517 10:33:29.771485 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tcxm\" (UniqueName: \"kubernetes.io/projected/660c180f-cc97-4d42-95c5-5d71925417d7-kube-api-access-4tcxm\") pod \"kube-proxy-rh55r\" (UID: \"660c180f-cc97-4d42-95c5-5d71925417d7\") " pod="kube-system/kube-proxy-rh55r" May 17 10:33:29.771645 kubelet[2722]: I0517 10:33:29.771502 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/660c180f-cc97-4d42-95c5-5d71925417d7-kube-proxy\") pod \"kube-proxy-rh55r\" (UID: \"660c180f-cc97-4d42-95c5-5d71925417d7\") " pod="kube-system/kube-proxy-rh55r" May 17 10:33:29.999063 containerd[1593]: time="2025-05-17T10:33:29.998951547Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rh55r,Uid:660c180f-cc97-4d42-95c5-5d71925417d7,Namespace:kube-system,Attempt:0,}" May 17 10:33:30.027252 containerd[1593]: time="2025-05-17T10:33:30.027181409Z" level=info msg="connecting to shim bad27e32952112341c639c4fa388d901baea885259ab3284b37af4680dbad6e7" 
address="unix:///run/containerd/s/2a8430e3e57d07e4da73518d4a8bb4103233d5227708d6454d0e3f317850edcd" namespace=k8s.io protocol=ttrpc version=3 May 17 10:33:30.066645 systemd[1]: Started cri-containerd-bad27e32952112341c639c4fa388d901baea885259ab3284b37af4680dbad6e7.scope - libcontainer container bad27e32952112341c639c4fa388d901baea885259ab3284b37af4680dbad6e7. May 17 10:33:30.091871 containerd[1593]: time="2025-05-17T10:33:30.091833644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rh55r,Uid:660c180f-cc97-4d42-95c5-5d71925417d7,Namespace:kube-system,Attempt:0,} returns sandbox id \"bad27e32952112341c639c4fa388d901baea885259ab3284b37af4680dbad6e7\"" May 17 10:33:30.094259 containerd[1593]: time="2025-05-17T10:33:30.094221135Z" level=info msg="CreateContainer within sandbox \"bad27e32952112341c639c4fa388d901baea885259ab3284b37af4680dbad6e7\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 17 10:33:30.108963 containerd[1593]: time="2025-05-17T10:33:30.108907022Z" level=info msg="Container c93696c4aefa0a6af95197c4eb44cbe90c44753bf740bdd6ac466328f77962cc: CDI devices from CRI Config.CDIDevices: []" May 17 10:33:30.118644 containerd[1593]: time="2025-05-17T10:33:30.118596484Z" level=info msg="CreateContainer within sandbox \"bad27e32952112341c639c4fa388d901baea885259ab3284b37af4680dbad6e7\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"c93696c4aefa0a6af95197c4eb44cbe90c44753bf740bdd6ac466328f77962cc\"" May 17 10:33:30.119296 containerd[1593]: time="2025-05-17T10:33:30.119247902Z" level=info msg="StartContainer for \"c93696c4aefa0a6af95197c4eb44cbe90c44753bf740bdd6ac466328f77962cc\"" May 17 10:33:30.121348 containerd[1593]: time="2025-05-17T10:33:30.121285653Z" level=info msg="connecting to shim c93696c4aefa0a6af95197c4eb44cbe90c44753bf740bdd6ac466328f77962cc" address="unix:///run/containerd/s/2a8430e3e57d07e4da73518d4a8bb4103233d5227708d6454d0e3f317850edcd" protocol=ttrpc version=3 May 17 
10:33:30.152610 systemd[1]: Started cri-containerd-c93696c4aefa0a6af95197c4eb44cbe90c44753bf740bdd6ac466328f77962cc.scope - libcontainer container c93696c4aefa0a6af95197c4eb44cbe90c44753bf740bdd6ac466328f77962cc. May 17 10:33:30.172855 systemd[1]: Created slice kubepods-besteffort-pod4801f526_9d50_461f_b781_57c928c1318a.slice - libcontainer container kubepods-besteffort-pod4801f526_9d50_461f_b781_57c928c1318a.slice. May 17 10:33:30.173764 kubelet[2722]: I0517 10:33:30.173726 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4801f526-9d50-461f-b781-57c928c1318a-var-lib-calico\") pod \"tigera-operator-7c5755cdcb-4qv4g\" (UID: \"4801f526-9d50-461f-b781-57c928c1318a\") " pod="tigera-operator/tigera-operator-7c5755cdcb-4qv4g" May 17 10:33:30.174150 kubelet[2722]: I0517 10:33:30.173773 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p558\" (UniqueName: \"kubernetes.io/projected/4801f526-9d50-461f-b781-57c928c1318a-kube-api-access-2p558\") pod \"tigera-operator-7c5755cdcb-4qv4g\" (UID: \"4801f526-9d50-461f-b781-57c928c1318a\") " pod="tigera-operator/tigera-operator-7c5755cdcb-4qv4g" May 17 10:33:30.227158 containerd[1593]: time="2025-05-17T10:33:30.226961081Z" level=info msg="StartContainer for \"c93696c4aefa0a6af95197c4eb44cbe90c44753bf740bdd6ac466328f77962cc\" returns successfully" May 17 10:33:30.478008 containerd[1593]: time="2025-05-17T10:33:30.477955100Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7c5755cdcb-4qv4g,Uid:4801f526-9d50-461f-b781-57c928c1318a,Namespace:tigera-operator,Attempt:0,}" May 17 10:33:30.502378 containerd[1593]: time="2025-05-17T10:33:30.501923210Z" level=info msg="connecting to shim 8569f89011427c383b1dd5a7f06a3e40dba02692feb938de67d5e3c4f406b680" 
address="unix:///run/containerd/s/95ee8630f65aeb6e662edbbd3f2627e0a61447f64cb1730e427cd90fd510bb00" namespace=k8s.io protocol=ttrpc version=3 May 17 10:33:30.525584 systemd[1]: Started cri-containerd-8569f89011427c383b1dd5a7f06a3e40dba02692feb938de67d5e3c4f406b680.scope - libcontainer container 8569f89011427c383b1dd5a7f06a3e40dba02692feb938de67d5e3c4f406b680. May 17 10:33:30.571933 containerd[1593]: time="2025-05-17T10:33:30.571889949Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7c5755cdcb-4qv4g,Uid:4801f526-9d50-461f-b781-57c928c1318a,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"8569f89011427c383b1dd5a7f06a3e40dba02692feb938de67d5e3c4f406b680\"" May 17 10:33:30.574758 containerd[1593]: time="2025-05-17T10:33:30.574725518Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 17 10:33:30.779998 kubelet[2722]: I0517 10:33:30.779578 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-rh55r" podStartSLOduration=1.779559758 podStartE2EDuration="1.779559758s" podCreationTimestamp="2025-05-17 10:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-17 10:33:30.77922136 +0000 UTC m=+6.115621960" watchObservedRunningTime="2025-05-17 10:33:30.779559758 +0000 UTC m=+6.115960338" May 17 10:33:31.863011 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount280399977.mount: Deactivated successfully. 
May 17 10:33:32.179680 containerd[1593]: time="2025-05-17T10:33:32.179623512Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:32.180446 containerd[1593]: time="2025-05-17T10:33:32.180418170Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=25055451" May 17 10:33:32.181716 containerd[1593]: time="2025-05-17T10:33:32.181658820Z" level=info msg="ImageCreate event name:\"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:32.183595 containerd[1593]: time="2025-05-17T10:33:32.183565051Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:32.184161 containerd[1593]: time="2025-05-17T10:33:32.184120932Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"25051446\" in 1.609358642s" May 17 10:33:32.184186 containerd[1593]: time="2025-05-17T10:33:32.184159336Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\"" May 17 10:33:32.186141 containerd[1593]: time="2025-05-17T10:33:32.186082779Z" level=info msg="CreateContainer within sandbox \"8569f89011427c383b1dd5a7f06a3e40dba02692feb938de67d5e3c4f406b680\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 17 10:33:32.192732 containerd[1593]: time="2025-05-17T10:33:32.192701070Z" level=info msg="Container 
d6ceba9d303ea6c7f119eb8d54869e6b8567916c7908dcf12d1191ed6c46bb09: CDI devices from CRI Config.CDIDevices: []" May 17 10:33:32.196327 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4224471525.mount: Deactivated successfully. May 17 10:33:32.199051 containerd[1593]: time="2025-05-17T10:33:32.199013827Z" level=info msg="CreateContainer within sandbox \"8569f89011427c383b1dd5a7f06a3e40dba02692feb938de67d5e3c4f406b680\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d6ceba9d303ea6c7f119eb8d54869e6b8567916c7908dcf12d1191ed6c46bb09\"" May 17 10:33:32.199674 containerd[1593]: time="2025-05-17T10:33:32.199499736Z" level=info msg="StartContainer for \"d6ceba9d303ea6c7f119eb8d54869e6b8567916c7908dcf12d1191ed6c46bb09\"" May 17 10:33:32.200281 containerd[1593]: time="2025-05-17T10:33:32.200249077Z" level=info msg="connecting to shim d6ceba9d303ea6c7f119eb8d54869e6b8567916c7908dcf12d1191ed6c46bb09" address="unix:///run/containerd/s/95ee8630f65aeb6e662edbbd3f2627e0a61447f64cb1730e427cd90fd510bb00" protocol=ttrpc version=3 May 17 10:33:32.247515 systemd[1]: Started cri-containerd-d6ceba9d303ea6c7f119eb8d54869e6b8567916c7908dcf12d1191ed6c46bb09.scope - libcontainer container d6ceba9d303ea6c7f119eb8d54869e6b8567916c7908dcf12d1191ed6c46bb09. 
May 17 10:33:32.275885 containerd[1593]: time="2025-05-17T10:33:32.275815749Z" level=info msg="StartContainer for \"d6ceba9d303ea6c7f119eb8d54869e6b8567916c7908dcf12d1191ed6c46bb09\" returns successfully" May 17 10:33:32.785468 kubelet[2722]: I0517 10:33:32.785388 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7c5755cdcb-4qv4g" podStartSLOduration=1.173598706 podStartE2EDuration="2.785373342s" podCreationTimestamp="2025-05-17 10:33:30 +0000 UTC" firstStartedPulling="2025-05-17 10:33:30.573022598 +0000 UTC m=+5.909423188" lastFinishedPulling="2025-05-17 10:33:32.184797234 +0000 UTC m=+7.521197824" observedRunningTime="2025-05-17 10:33:32.785368803 +0000 UTC m=+8.121769393" watchObservedRunningTime="2025-05-17 10:33:32.785373342 +0000 UTC m=+8.121773982" May 17 10:33:37.398153 sudo[1810]: pam_unix(sudo:session): session closed for user root May 17 10:33:37.400658 sshd[1809]: Connection closed by 10.0.0.1 port 34248 May 17 10:33:37.400275 sshd-session[1807]: pam_unix(sshd:session): session closed for user core May 17 10:33:37.406306 systemd[1]: sshd@6-10.0.0.118:22-10.0.0.1:34248.service: Deactivated successfully. May 17 10:33:37.409505 systemd[1]: session-7.scope: Deactivated successfully. May 17 10:33:37.409854 systemd[1]: session-7.scope: Consumed 4.146s CPU time, 218.8M memory peak. May 17 10:33:37.411956 systemd-logind[1585]: Session 7 logged out. Waiting for processes to exit. May 17 10:33:37.416978 systemd-logind[1585]: Removed session 7. May 17 10:33:39.985532 systemd[1]: Created slice kubepods-besteffort-podf50663fb_0e6e_4660_aca1_7052d4533d16.slice - libcontainer container kubepods-besteffort-podf50663fb_0e6e_4660_aca1_7052d4533d16.slice. 
May 17 10:33:40.036333 kubelet[2722]: I0517 10:33:40.036284 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f50663fb-0e6e-4660-aca1-7052d4533d16-tigera-ca-bundle\") pod \"calico-typha-6bbf6c59cd-fmmjt\" (UID: \"f50663fb-0e6e-4660-aca1-7052d4533d16\") " pod="calico-system/calico-typha-6bbf6c59cd-fmmjt" May 17 10:33:40.036333 kubelet[2722]: I0517 10:33:40.036327 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f50663fb-0e6e-4660-aca1-7052d4533d16-typha-certs\") pod \"calico-typha-6bbf6c59cd-fmmjt\" (UID: \"f50663fb-0e6e-4660-aca1-7052d4533d16\") " pod="calico-system/calico-typha-6bbf6c59cd-fmmjt" May 17 10:33:40.036333 kubelet[2722]: I0517 10:33:40.036345 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrcjj\" (UniqueName: \"kubernetes.io/projected/f50663fb-0e6e-4660-aca1-7052d4533d16-kube-api-access-xrcjj\") pod \"calico-typha-6bbf6c59cd-fmmjt\" (UID: \"f50663fb-0e6e-4660-aca1-7052d4533d16\") " pod="calico-system/calico-typha-6bbf6c59cd-fmmjt" May 17 10:33:40.289540 containerd[1593]: time="2025-05-17T10:33:40.289388937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6bbf6c59cd-fmmjt,Uid:f50663fb-0e6e-4660-aca1-7052d4533d16,Namespace:calico-system,Attempt:0,}" May 17 10:33:40.401739 systemd[1]: Created slice kubepods-besteffort-pod255c2745_cd7d_4f7d_ac10_087758f095bd.slice - libcontainer container kubepods-besteffort-pod255c2745_cd7d_4f7d_ac10_087758f095bd.slice. 
May 17 10:33:40.411033 containerd[1593]: time="2025-05-17T10:33:40.409778641Z" level=info msg="connecting to shim 36951d8f6c65b3cdecde0e89b9fe5f4d6d3cca25fb69116ef07c5c67861043c7" address="unix:///run/containerd/s/ec3daec7822cbcc657181ceb4776dc37a3ff02e832512152aa04c45bd57e69a4" namespace=k8s.io protocol=ttrpc version=3 May 17 10:33:40.438724 kubelet[2722]: I0517 10:33:40.438669 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/255c2745-cd7d-4f7d-ac10-087758f095bd-cni-log-dir\") pod \"calico-node-79btc\" (UID: \"255c2745-cd7d-4f7d-ac10-087758f095bd\") " pod="calico-system/calico-node-79btc" May 17 10:33:40.438724 kubelet[2722]: I0517 10:33:40.438715 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/255c2745-cd7d-4f7d-ac10-087758f095bd-flexvol-driver-host\") pod \"calico-node-79btc\" (UID: \"255c2745-cd7d-4f7d-ac10-087758f095bd\") " pod="calico-system/calico-node-79btc" May 17 10:33:40.438922 kubelet[2722]: I0517 10:33:40.438746 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxmcf\" (UniqueName: \"kubernetes.io/projected/255c2745-cd7d-4f7d-ac10-087758f095bd-kube-api-access-zxmcf\") pod \"calico-node-79btc\" (UID: \"255c2745-cd7d-4f7d-ac10-087758f095bd\") " pod="calico-system/calico-node-79btc" May 17 10:33:40.438922 kubelet[2722]: I0517 10:33:40.438760 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/255c2745-cd7d-4f7d-ac10-087758f095bd-var-run-calico\") pod \"calico-node-79btc\" (UID: \"255c2745-cd7d-4f7d-ac10-087758f095bd\") " pod="calico-system/calico-node-79btc" May 17 10:33:40.438922 kubelet[2722]: I0517 10:33:40.438777 2722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/255c2745-cd7d-4f7d-ac10-087758f095bd-cni-net-dir\") pod \"calico-node-79btc\" (UID: \"255c2745-cd7d-4f7d-ac10-087758f095bd\") " pod="calico-system/calico-node-79btc" May 17 10:33:40.438922 kubelet[2722]: I0517 10:33:40.438790 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/255c2745-cd7d-4f7d-ac10-087758f095bd-tigera-ca-bundle\") pod \"calico-node-79btc\" (UID: \"255c2745-cd7d-4f7d-ac10-087758f095bd\") " pod="calico-system/calico-node-79btc" May 17 10:33:40.438922 kubelet[2722]: I0517 10:33:40.438806 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/255c2745-cd7d-4f7d-ac10-087758f095bd-cni-bin-dir\") pod \"calico-node-79btc\" (UID: \"255c2745-cd7d-4f7d-ac10-087758f095bd\") " pod="calico-system/calico-node-79btc" May 17 10:33:40.439034 kubelet[2722]: I0517 10:33:40.438819 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/255c2745-cd7d-4f7d-ac10-087758f095bd-xtables-lock\") pod \"calico-node-79btc\" (UID: \"255c2745-cd7d-4f7d-ac10-087758f095bd\") " pod="calico-system/calico-node-79btc" May 17 10:33:40.439034 kubelet[2722]: I0517 10:33:40.438832 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/255c2745-cd7d-4f7d-ac10-087758f095bd-policysync\") pod \"calico-node-79btc\" (UID: \"255c2745-cd7d-4f7d-ac10-087758f095bd\") " pod="calico-system/calico-node-79btc" May 17 10:33:40.439034 kubelet[2722]: I0517 10:33:40.438843 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/255c2745-cd7d-4f7d-ac10-087758f095bd-var-lib-calico\") pod \"calico-node-79btc\" (UID: \"255c2745-cd7d-4f7d-ac10-087758f095bd\") " pod="calico-system/calico-node-79btc" May 17 10:33:40.439034 kubelet[2722]: I0517 10:33:40.438857 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/255c2745-cd7d-4f7d-ac10-087758f095bd-node-certs\") pod \"calico-node-79btc\" (UID: \"255c2745-cd7d-4f7d-ac10-087758f095bd\") " pod="calico-system/calico-node-79btc" May 17 10:33:40.439034 kubelet[2722]: I0517 10:33:40.438869 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/255c2745-cd7d-4f7d-ac10-087758f095bd-lib-modules\") pod \"calico-node-79btc\" (UID: \"255c2745-cd7d-4f7d-ac10-087758f095bd\") " pod="calico-system/calico-node-79btc" May 17 10:33:40.465548 systemd[1]: Started cri-containerd-36951d8f6c65b3cdecde0e89b9fe5f4d6d3cca25fb69116ef07c5c67861043c7.scope - libcontainer container 36951d8f6c65b3cdecde0e89b9fe5f4d6d3cca25fb69116ef07c5c67861043c7. 
May 17 10:33:40.529250 containerd[1593]: time="2025-05-17T10:33:40.529207853Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6bbf6c59cd-fmmjt,Uid:f50663fb-0e6e-4660-aca1-7052d4533d16,Namespace:calico-system,Attempt:0,} returns sandbox id \"36951d8f6c65b3cdecde0e89b9fe5f4d6d3cca25fb69116ef07c5c67861043c7\"" May 17 10:33:40.530464 containerd[1593]: time="2025-05-17T10:33:40.530425032Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 17 10:33:40.541208 kubelet[2722]: E0517 10:33:40.541084 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.541208 kubelet[2722]: W0517 10:33:40.541106 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.541208 kubelet[2722]: E0517 10:33:40.541136 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.544550 kubelet[2722]: E0517 10:33:40.544530 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.544550 kubelet[2722]: W0517 10:33:40.544544 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.544636 kubelet[2722]: E0517 10:33:40.544554 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.547930 kubelet[2722]: E0517 10:33:40.547904 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.547930 kubelet[2722]: W0517 10:33:40.547925 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.548003 kubelet[2722]: E0517 10:33:40.547949 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.617529 kubelet[2722]: E0517 10:33:40.617485 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4b84" podUID="c9cc419c-75dd-4426-acd6-306f9b76a4b6" May 17 10:33:40.631114 kubelet[2722]: E0517 10:33:40.631079 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.631114 kubelet[2722]: W0517 10:33:40.631103 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.631245 kubelet[2722]: E0517 10:33:40.631133 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.631354 kubelet[2722]: E0517 10:33:40.631339 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.631354 kubelet[2722]: W0517 10:33:40.631350 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.631435 kubelet[2722]: E0517 10:33:40.631358 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.631550 kubelet[2722]: E0517 10:33:40.631535 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.631550 kubelet[2722]: W0517 10:33:40.631547 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.631608 kubelet[2722]: E0517 10:33:40.631556 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.631741 kubelet[2722]: E0517 10:33:40.631719 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.631741 kubelet[2722]: W0517 10:33:40.631730 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.631741 kubelet[2722]: E0517 10:33:40.631737 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.631912 kubelet[2722]: E0517 10:33:40.631897 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.631912 kubelet[2722]: W0517 10:33:40.631907 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.631912 kubelet[2722]: E0517 10:33:40.631915 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.632070 kubelet[2722]: E0517 10:33:40.632056 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.632070 kubelet[2722]: W0517 10:33:40.632066 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.632133 kubelet[2722]: E0517 10:33:40.632073 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.632240 kubelet[2722]: E0517 10:33:40.632225 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.632240 kubelet[2722]: W0517 10:33:40.632235 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.632289 kubelet[2722]: E0517 10:33:40.632242 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.632423 kubelet[2722]: E0517 10:33:40.632386 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.632423 kubelet[2722]: W0517 10:33:40.632415 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.632423 kubelet[2722]: E0517 10:33:40.632424 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.632586 kubelet[2722]: E0517 10:33:40.632572 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.632586 kubelet[2722]: W0517 10:33:40.632583 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.632636 kubelet[2722]: E0517 10:33:40.632591 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.632743 kubelet[2722]: E0517 10:33:40.632729 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.632743 kubelet[2722]: W0517 10:33:40.632739 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.632796 kubelet[2722]: E0517 10:33:40.632747 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.632900 kubelet[2722]: E0517 10:33:40.632887 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.632900 kubelet[2722]: W0517 10:33:40.632896 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.632945 kubelet[2722]: E0517 10:33:40.632904 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.633058 kubelet[2722]: E0517 10:33:40.633044 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.633058 kubelet[2722]: W0517 10:33:40.633054 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.633115 kubelet[2722]: E0517 10:33:40.633063 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.633233 kubelet[2722]: E0517 10:33:40.633218 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.633233 kubelet[2722]: W0517 10:33:40.633228 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.633284 kubelet[2722]: E0517 10:33:40.633236 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.633388 kubelet[2722]: E0517 10:33:40.633375 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.633388 kubelet[2722]: W0517 10:33:40.633384 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.633388 kubelet[2722]: E0517 10:33:40.633403 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.633562 kubelet[2722]: E0517 10:33:40.633548 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.633562 kubelet[2722]: W0517 10:33:40.633558 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.633612 kubelet[2722]: E0517 10:33:40.633565 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.633724 kubelet[2722]: E0517 10:33:40.633710 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.633724 kubelet[2722]: W0517 10:33:40.633720 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.633778 kubelet[2722]: E0517 10:33:40.633728 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.633889 kubelet[2722]: E0517 10:33:40.633876 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.633889 kubelet[2722]: W0517 10:33:40.633886 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.633932 kubelet[2722]: E0517 10:33:40.633893 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.634047 kubelet[2722]: E0517 10:33:40.634033 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.634047 kubelet[2722]: W0517 10:33:40.634043 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.634098 kubelet[2722]: E0517 10:33:40.634050 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.634218 kubelet[2722]: E0517 10:33:40.634205 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.634218 kubelet[2722]: W0517 10:33:40.634215 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.634266 kubelet[2722]: E0517 10:33:40.634222 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.634376 kubelet[2722]: E0517 10:33:40.634363 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.634376 kubelet[2722]: W0517 10:33:40.634372 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.634456 kubelet[2722]: E0517 10:33:40.634380 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.640815 kubelet[2722]: E0517 10:33:40.640778 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.640815 kubelet[2722]: W0517 10:33:40.640798 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.640815 kubelet[2722]: E0517 10:33:40.640819 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.640994 kubelet[2722]: I0517 10:33:40.640847 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c9cc419c-75dd-4426-acd6-306f9b76a4b6-socket-dir\") pod \"csi-node-driver-b4b84\" (UID: \"c9cc419c-75dd-4426-acd6-306f9b76a4b6\") " pod="calico-system/csi-node-driver-b4b84" May 17 10:33:40.641053 kubelet[2722]: E0517 10:33:40.641027 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.641053 kubelet[2722]: W0517 10:33:40.641042 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.641103 kubelet[2722]: E0517 10:33:40.641056 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.641103 kubelet[2722]: I0517 10:33:40.641072 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c9cc419c-75dd-4426-acd6-306f9b76a4b6-registration-dir\") pod \"csi-node-driver-b4b84\" (UID: \"c9cc419c-75dd-4426-acd6-306f9b76a4b6\") " pod="calico-system/csi-node-driver-b4b84" May 17 10:33:40.641348 kubelet[2722]: E0517 10:33:40.641330 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.641348 kubelet[2722]: W0517 10:33:40.641343 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.641439 kubelet[2722]: E0517 10:33:40.641358 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.641551 kubelet[2722]: E0517 10:33:40.641536 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.641551 kubelet[2722]: W0517 10:33:40.641546 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.641596 kubelet[2722]: E0517 10:33:40.641560 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.641767 kubelet[2722]: E0517 10:33:40.641738 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.641767 kubelet[2722]: W0517 10:33:40.641753 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.641767 kubelet[2722]: E0517 10:33:40.641770 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.641922 kubelet[2722]: I0517 10:33:40.641798 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/c9cc419c-75dd-4426-acd6-306f9b76a4b6-varrun\") pod \"csi-node-driver-b4b84\" (UID: \"c9cc419c-75dd-4426-acd6-306f9b76a4b6\") " pod="calico-system/csi-node-driver-b4b84" May 17 10:33:40.642023 kubelet[2722]: E0517 10:33:40.642005 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.642023 kubelet[2722]: W0517 10:33:40.642018 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.642085 kubelet[2722]: E0517 10:33:40.642031 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.642085 kubelet[2722]: I0517 10:33:40.642057 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9cc419c-75dd-4426-acd6-306f9b76a4b6-kubelet-dir\") pod \"csi-node-driver-b4b84\" (UID: \"c9cc419c-75dd-4426-acd6-306f9b76a4b6\") " pod="calico-system/csi-node-driver-b4b84" May 17 10:33:40.642314 kubelet[2722]: E0517 10:33:40.642281 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.642314 kubelet[2722]: W0517 10:33:40.642293 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.642375 kubelet[2722]: E0517 10:33:40.642317 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.642375 kubelet[2722]: I0517 10:33:40.642352 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44lzh\" (UniqueName: \"kubernetes.io/projected/c9cc419c-75dd-4426-acd6-306f9b76a4b6-kube-api-access-44lzh\") pod \"csi-node-driver-b4b84\" (UID: \"c9cc419c-75dd-4426-acd6-306f9b76a4b6\") " pod="calico-system/csi-node-driver-b4b84" May 17 10:33:40.642474 kubelet[2722]: E0517 10:33:40.642459 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.642474 kubelet[2722]: W0517 10:33:40.642469 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.642524 kubelet[2722]: E0517 10:33:40.642495 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.642742 kubelet[2722]: E0517 10:33:40.642665 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.642742 kubelet[2722]: W0517 10:33:40.642679 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.642742 kubelet[2722]: E0517 10:33:40.642698 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.643027 kubelet[2722]: E0517 10:33:40.642979 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.643027 kubelet[2722]: W0517 10:33:40.642990 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.643147 kubelet[2722]: E0517 10:33:40.643112 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.643353 kubelet[2722]: E0517 10:33:40.643316 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.643353 kubelet[2722]: W0517 10:33:40.643339 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.643451 kubelet[2722]: E0517 10:33:40.643366 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.643556 kubelet[2722]: E0517 10:33:40.643538 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.643556 kubelet[2722]: W0517 10:33:40.643549 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.643556 kubelet[2722]: E0517 10:33:40.643557 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.643762 kubelet[2722]: E0517 10:33:40.643747 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.643762 kubelet[2722]: W0517 10:33:40.643757 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.643813 kubelet[2722]: E0517 10:33:40.643765 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.644006 kubelet[2722]: E0517 10:33:40.643980 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.644006 kubelet[2722]: W0517 10:33:40.643994 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.644006 kubelet[2722]: E0517 10:33:40.644005 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.644280 kubelet[2722]: E0517 10:33:40.644234 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.644280 kubelet[2722]: W0517 10:33:40.644245 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.644280 kubelet[2722]: E0517 10:33:40.644254 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.714229 containerd[1593]: time="2025-05-17T10:33:40.714116745Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-79btc,Uid:255c2745-cd7d-4f7d-ac10-087758f095bd,Namespace:calico-system,Attempt:0,}" May 17 10:33:40.735159 containerd[1593]: time="2025-05-17T10:33:40.735101201Z" level=info msg="connecting to shim 2a019efa51aa217ab22a8276fde07ca57b594e9940ae3955a623cc963fd54681" address="unix:///run/containerd/s/adf07145e5045271c74b1f2ad5294a33c01be0ad51169b8dc883d2ca554e5f25" namespace=k8s.io protocol=ttrpc version=3 May 17 10:33:40.743775 kubelet[2722]: E0517 10:33:40.743733 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.743775 kubelet[2722]: W0517 10:33:40.743779 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.743930 kubelet[2722]: E0517 10:33:40.743800 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.744081 kubelet[2722]: E0517 10:33:40.744057 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.744081 kubelet[2722]: W0517 10:33:40.744070 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.744081 kubelet[2722]: E0517 10:33:40.744087 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.744300 kubelet[2722]: E0517 10:33:40.744282 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.744300 kubelet[2722]: W0517 10:33:40.744294 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.744377 kubelet[2722]: E0517 10:33:40.744311 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.744495 kubelet[2722]: E0517 10:33:40.744472 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.744495 kubelet[2722]: W0517 10:33:40.744483 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.744495 kubelet[2722]: E0517 10:33:40.744491 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.745029 kubelet[2722]: E0517 10:33:40.745001 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.745074 kubelet[2722]: W0517 10:33:40.745027 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.745074 kubelet[2722]: E0517 10:33:40.745056 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.745267 kubelet[2722]: E0517 10:33:40.745250 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.745267 kubelet[2722]: W0517 10:33:40.745262 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.745332 kubelet[2722]: E0517 10:33:40.745322 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.745473 kubelet[2722]: E0517 10:33:40.745455 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.745473 kubelet[2722]: W0517 10:33:40.745470 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.745552 kubelet[2722]: E0517 10:33:40.745536 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.746081 kubelet[2722]: E0517 10:33:40.746061 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.746081 kubelet[2722]: W0517 10:33:40.746074 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.746164 kubelet[2722]: E0517 10:33:40.746108 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.746322 kubelet[2722]: E0517 10:33:40.746276 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.746322 kubelet[2722]: W0517 10:33:40.746316 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.746488 kubelet[2722]: E0517 10:33:40.746465 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.746737 kubelet[2722]: E0517 10:33:40.746699 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.746737 kubelet[2722]: W0517 10:33:40.746714 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.746992 kubelet[2722]: E0517 10:33:40.746873 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.747414 kubelet[2722]: E0517 10:33:40.747047 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.747414 kubelet[2722]: W0517 10:33:40.747059 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.747414 kubelet[2722]: E0517 10:33:40.747106 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.747414 kubelet[2722]: E0517 10:33:40.747259 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.747414 kubelet[2722]: W0517 10:33:40.747267 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.747414 kubelet[2722]: E0517 10:33:40.747383 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.747571 kubelet[2722]: E0517 10:33:40.747432 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.747571 kubelet[2722]: W0517 10:33:40.747441 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.747571 kubelet[2722]: E0517 10:33:40.747504 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.748166 kubelet[2722]: E0517 10:33:40.748116 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.748166 kubelet[2722]: W0517 10:33:40.748141 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.748166 kubelet[2722]: E0517 10:33:40.748154 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.749673 kubelet[2722]: E0517 10:33:40.749609 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.749673 kubelet[2722]: W0517 10:33:40.749636 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.749788 kubelet[2722]: E0517 10:33:40.749763 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.750522 kubelet[2722]: E0517 10:33:40.750482 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.750522 kubelet[2722]: W0517 10:33:40.750495 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.750879 kubelet[2722]: E0517 10:33:40.750852 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.750879 kubelet[2722]: E0517 10:33:40.750854 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.750879 kubelet[2722]: W0517 10:33:40.750863 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.751005 kubelet[2722]: E0517 10:33:40.750992 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.751566 kubelet[2722]: E0517 10:33:40.751541 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.751566 kubelet[2722]: W0517 10:33:40.751561 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.751744 kubelet[2722]: E0517 10:33:40.751719 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.752119 kubelet[2722]: E0517 10:33:40.752095 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.752119 kubelet[2722]: W0517 10:33:40.752109 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.752341 kubelet[2722]: E0517 10:33:40.752308 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.752629 kubelet[2722]: E0517 10:33:40.752606 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.752629 kubelet[2722]: W0517 10:33:40.752621 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.752728 kubelet[2722]: E0517 10:33:40.752705 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.753423 kubelet[2722]: E0517 10:33:40.753148 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.753423 kubelet[2722]: W0517 10:33:40.753161 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.753423 kubelet[2722]: E0517 10:33:40.753279 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.753775 kubelet[2722]: E0517 10:33:40.753637 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.753775 kubelet[2722]: W0517 10:33:40.753754 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.753939 kubelet[2722]: E0517 10:33:40.753841 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.754084 kubelet[2722]: E0517 10:33:40.754033 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.754084 kubelet[2722]: W0517 10:33:40.754045 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.754185 kubelet[2722]: E0517 10:33:40.754146 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:40.754592 kubelet[2722]: E0517 10:33:40.754532 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.754592 kubelet[2722]: W0517 10:33:40.754549 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.754702 kubelet[2722]: E0517 10:33:40.754688 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.754994 kubelet[2722]: E0517 10:33:40.754961 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.754994 kubelet[2722]: W0517 10:33:40.754976 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.754994 kubelet[2722]: E0517 10:33:40.754987 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.765324 systemd[1]: Started cri-containerd-2a019efa51aa217ab22a8276fde07ca57b594e9940ae3955a623cc963fd54681.scope - libcontainer container 2a019efa51aa217ab22a8276fde07ca57b594e9940ae3955a623cc963fd54681. 
May 17 10:33:40.772160 kubelet[2722]: E0517 10:33:40.769930 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:40.772160 kubelet[2722]: W0517 10:33:40.769956 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:40.772160 kubelet[2722]: E0517 10:33:40.769978 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:40.848349 containerd[1593]: time="2025-05-17T10:33:40.848216320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-79btc,Uid:255c2745-cd7d-4f7d-ac10-087758f095bd,Namespace:calico-system,Attempt:0,} returns sandbox id \"2a019efa51aa217ab22a8276fde07ca57b594e9940ae3955a623cc963fd54681\"" May 17 10:33:41.850138 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount509801006.mount: Deactivated successfully. May 17 10:33:42.393502 update_engine[1588]: I20250517 10:33:42.393443 1588 update_attempter.cc:509] Updating boot flags... 
May 17 10:33:42.656667 containerd[1593]: time="2025-05-17T10:33:42.656558554Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:42.657311 containerd[1593]: time="2025-05-17T10:33:42.657262868Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=35158669" May 17 10:33:42.658349 containerd[1593]: time="2025-05-17T10:33:42.658318037Z" level=info msg="ImageCreate event name:\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:42.660064 containerd[1593]: time="2025-05-17T10:33:42.660035630Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:42.660605 containerd[1593]: time="2025-05-17T10:33:42.660573319Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"35158523\" in 2.130113111s" May 17 10:33:42.660644 containerd[1593]: time="2025-05-17T10:33:42.660609717Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\"" May 17 10:33:42.661721 containerd[1593]: time="2025-05-17T10:33:42.661473103Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 17 10:33:42.669458 containerd[1593]: time="2025-05-17T10:33:42.669428911Z" level=info msg="CreateContainer within sandbox \"36951d8f6c65b3cdecde0e89b9fe5f4d6d3cca25fb69116ef07c5c67861043c7\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 17 10:33:42.677686 containerd[1593]: time="2025-05-17T10:33:42.677652878Z" level=info msg="Container 8d582555036b147c55d64df7b306c2e1bf8390643e18cb4b6ad681e89c41b8d5: CDI devices from CRI Config.CDIDevices: []" May 17 10:33:42.685562 containerd[1593]: time="2025-05-17T10:33:42.685524447Z" level=info msg="CreateContainer within sandbox \"36951d8f6c65b3cdecde0e89b9fe5f4d6d3cca25fb69116ef07c5c67861043c7\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"8d582555036b147c55d64df7b306c2e1bf8390643e18cb4b6ad681e89c41b8d5\"" May 17 10:33:42.685996 containerd[1593]: time="2025-05-17T10:33:42.685940955Z" level=info msg="StartContainer for \"8d582555036b147c55d64df7b306c2e1bf8390643e18cb4b6ad681e89c41b8d5\"" May 17 10:33:42.686871 containerd[1593]: time="2025-05-17T10:33:42.686822726Z" level=info msg="connecting to shim 8d582555036b147c55d64df7b306c2e1bf8390643e18cb4b6ad681e89c41b8d5" address="unix:///run/containerd/s/ec3daec7822cbcc657181ceb4776dc37a3ff02e832512152aa04c45bd57e69a4" protocol=ttrpc version=3 May 17 10:33:42.707521 systemd[1]: Started cri-containerd-8d582555036b147c55d64df7b306c2e1bf8390643e18cb4b6ad681e89c41b8d5.scope - libcontainer container 8d582555036b147c55d64df7b306c2e1bf8390643e18cb4b6ad681e89c41b8d5. 
May 17 10:33:42.750616 kubelet[2722]: E0517 10:33:42.750568 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4b84" podUID="c9cc419c-75dd-4426-acd6-306f9b76a4b6" May 17 10:33:42.756613 containerd[1593]: time="2025-05-17T10:33:42.756560582Z" level=info msg="StartContainer for \"8d582555036b147c55d64df7b306c2e1bf8390643e18cb4b6ad681e89c41b8d5\" returns successfully" May 17 10:33:42.850540 kubelet[2722]: E0517 10:33:42.850491 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.850540 kubelet[2722]: W0517 10:33:42.850515 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.850740 kubelet[2722]: E0517 10:33:42.850656 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.851077 kubelet[2722]: E0517 10:33:42.851054 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.851077 kubelet[2722]: W0517 10:33:42.851068 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.851159 kubelet[2722]: E0517 10:33:42.851106 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.851450 kubelet[2722]: E0517 10:33:42.851433 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.851450 kubelet[2722]: W0517 10:33:42.851448 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.851560 kubelet[2722]: E0517 10:33:42.851458 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.851799 kubelet[2722]: E0517 10:33:42.851780 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.851799 kubelet[2722]: W0517 10:33:42.851797 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.851888 kubelet[2722]: E0517 10:33:42.851810 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.852426 kubelet[2722]: E0517 10:33:42.852309 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.852426 kubelet[2722]: W0517 10:33:42.852323 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.852426 kubelet[2722]: E0517 10:33:42.852333 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.853071 kubelet[2722]: E0517 10:33:42.852548 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.853071 kubelet[2722]: W0517 10:33:42.852565 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.853071 kubelet[2722]: E0517 10:33:42.852577 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.853222 kubelet[2722]: E0517 10:33:42.853198 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.853222 kubelet[2722]: W0517 10:33:42.853216 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.853274 kubelet[2722]: E0517 10:33:42.853229 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.854417 kubelet[2722]: E0517 10:33:42.853542 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.854417 kubelet[2722]: W0517 10:33:42.853555 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.854417 kubelet[2722]: E0517 10:33:42.853564 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.854417 kubelet[2722]: E0517 10:33:42.853721 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.854417 kubelet[2722]: W0517 10:33:42.853728 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.854417 kubelet[2722]: E0517 10:33:42.853735 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.854417 kubelet[2722]: E0517 10:33:42.853903 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.854417 kubelet[2722]: W0517 10:33:42.853910 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.854417 kubelet[2722]: E0517 10:33:42.853919 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.854417 kubelet[2722]: E0517 10:33:42.854057 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.854669 kubelet[2722]: W0517 10:33:42.854064 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.854669 kubelet[2722]: E0517 10:33:42.854071 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.854669 kubelet[2722]: E0517 10:33:42.854356 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.854669 kubelet[2722]: W0517 10:33:42.854365 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.854669 kubelet[2722]: E0517 10:33:42.854373 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.854778 kubelet[2722]: E0517 10:33:42.854678 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.854778 kubelet[2722]: W0517 10:33:42.854687 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.854778 kubelet[2722]: E0517 10:33:42.854696 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.854904 kubelet[2722]: E0517 10:33:42.854883 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.854904 kubelet[2722]: W0517 10:33:42.854897 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.854961 kubelet[2722]: E0517 10:33:42.854906 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.855093 kubelet[2722]: E0517 10:33:42.855067 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.855093 kubelet[2722]: W0517 10:33:42.855085 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.855093 kubelet[2722]: E0517 10:33:42.855093 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.862833 kubelet[2722]: E0517 10:33:42.862711 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.862833 kubelet[2722]: W0517 10:33:42.862735 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.862833 kubelet[2722]: E0517 10:33:42.862752 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.863433 kubelet[2722]: E0517 10:33:42.863015 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.863433 kubelet[2722]: W0517 10:33:42.863027 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.863433 kubelet[2722]: E0517 10:33:42.863047 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.863433 kubelet[2722]: E0517 10:33:42.863262 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.863433 kubelet[2722]: W0517 10:33:42.863269 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.863433 kubelet[2722]: E0517 10:33:42.863291 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.863656 kubelet[2722]: E0517 10:33:42.863632 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.863656 kubelet[2722]: W0517 10:33:42.863649 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.863717 kubelet[2722]: E0517 10:33:42.863662 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.863896 kubelet[2722]: E0517 10:33:42.863874 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.863896 kubelet[2722]: W0517 10:33:42.863888 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.863953 kubelet[2722]: E0517 10:33:42.863906 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.864370 kubelet[2722]: E0517 10:33:42.864346 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.864370 kubelet[2722]: W0517 10:33:42.864361 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.864449 kubelet[2722]: E0517 10:33:42.864440 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.864630 kubelet[2722]: E0517 10:33:42.864608 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.864630 kubelet[2722]: W0517 10:33:42.864621 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.864769 kubelet[2722]: E0517 10:33:42.864725 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.864904 kubelet[2722]: E0517 10:33:42.864798 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.864904 kubelet[2722]: W0517 10:33:42.864807 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.864904 kubelet[2722]: E0517 10:33:42.864831 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.865114 kubelet[2722]: E0517 10:33:42.865094 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.865166 kubelet[2722]: W0517 10:33:42.865123 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.865166 kubelet[2722]: E0517 10:33:42.865156 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.865650 kubelet[2722]: E0517 10:33:42.865631 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.865650 kubelet[2722]: W0517 10:33:42.865643 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.865721 kubelet[2722]: E0517 10:33:42.865656 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.866035 kubelet[2722]: E0517 10:33:42.865936 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.866035 kubelet[2722]: W0517 10:33:42.865960 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.866035 kubelet[2722]: E0517 10:33:42.866000 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.866370 kubelet[2722]: E0517 10:33:42.866358 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.866523 kubelet[2722]: W0517 10:33:42.866433 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.866523 kubelet[2722]: E0517 10:33:42.866466 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.866752 kubelet[2722]: E0517 10:33:42.866633 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.866752 kubelet[2722]: W0517 10:33:42.866647 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.866752 kubelet[2722]: E0517 10:33:42.866669 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.867032 kubelet[2722]: E0517 10:33:42.867008 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.867032 kubelet[2722]: W0517 10:33:42.867025 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.867092 kubelet[2722]: E0517 10:33:42.867039 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.868499 kubelet[2722]: E0517 10:33:42.868477 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.868499 kubelet[2722]: W0517 10:33:42.868494 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.868584 kubelet[2722]: E0517 10:33:42.868508 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.868712 kubelet[2722]: E0517 10:33:42.868683 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.868712 kubelet[2722]: W0517 10:33:42.868699 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.868712 kubelet[2722]: E0517 10:33:42.868706 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:42.868935 kubelet[2722]: E0517 10:33:42.868920 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.868935 kubelet[2722]: W0517 10:33:42.868931 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.868994 kubelet[2722]: E0517 10:33:42.868939 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:42.869233 kubelet[2722]: E0517 10:33:42.869219 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:42.869233 kubelet[2722]: W0517 10:33:42.869229 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:42.869289 kubelet[2722]: E0517 10:33:42.869237 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 17 10:33:43.808207 kubelet[2722]: I0517 10:33:43.808164 2722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 17 10:33:43.863641 kubelet[2722]: E0517 10:33:43.863604 2722 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 17 10:33:43.863641 kubelet[2722]: W0517 10:33:43.863632 2722 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 17 10:33:43.863641 kubelet[2722]: E0517 10:33:43.863653 2722 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 17 10:33:43.946121 containerd[1593]: time="2025-05-17T10:33:43.946057939Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:43.946791 containerd[1593]: time="2025-05-17T10:33:43.946758594Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4441619" May 17 10:33:43.947797 containerd[1593]: time="2025-05-17T10:33:43.947767875Z" level=info msg="ImageCreate event name:\"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:43.949622 containerd[1593]: time="2025-05-17T10:33:43.949586788Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:43.950084 containerd[1593]: time="2025-05-17T10:33:43.950050536Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5934282\" in 1.288544412s" May 17 10:33:43.950129 containerd[1593]: time="2025-05-17T10:33:43.950080703Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\"" May 17 10:33:43.952027 containerd[1593]: time="2025-05-17T10:33:43.952001319Z" level=info msg="CreateContainer within sandbox \"2a019efa51aa217ab22a8276fde07ca57b594e9940ae3955a623cc963fd54681\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 17 10:33:43.961413 containerd[1593]: time="2025-05-17T10:33:43.960670069Z" level=info msg="Container 814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30: CDI devices from CRI Config.CDIDevices: []" May 17 10:33:43.968607 containerd[1593]: time="2025-05-17T10:33:43.968553772Z" level=info msg="CreateContainer within sandbox \"2a019efa51aa217ab22a8276fde07ca57b594e9940ae3955a623cc963fd54681\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30\"" May 17 10:33:43.969074 containerd[1593]: time="2025-05-17T10:33:43.969049390Z" level=info msg="StartContainer for \"814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30\"" May 17 10:33:43.970342 containerd[1593]: time="2025-05-17T10:33:43.970314645Z" level=info msg="connecting to shim 814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30" address="unix:///run/containerd/s/adf07145e5045271c74b1f2ad5294a33c01be0ad51169b8dc883d2ca554e5f25" protocol=ttrpc version=3 May 17 10:33:43.988526 systemd[1]: Started cri-containerd-814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30.scope - libcontainer container 814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30. May 17 10:33:44.028948 containerd[1593]: time="2025-05-17T10:33:44.028801759Z" level=info msg="StartContainer for \"814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30\" returns successfully" May 17 10:33:44.040717 systemd[1]: cri-containerd-814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30.scope: Deactivated successfully. May 17 10:33:44.041154 systemd[1]: cri-containerd-814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30.scope: Consumed 35ms CPU time, 6.4M memory peak, 4.6M written to disk. 
May 17 10:33:44.043860 containerd[1593]: time="2025-05-17T10:33:44.043825052Z" level=info msg="TaskExit event in podsandbox handler container_id:\"814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30\" id:\"814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30\" pid:3454 exited_at:{seconds:1747478024 nanos:43219896}" May 17 10:33:44.044368 containerd[1593]: time="2025-05-17T10:33:44.044327773Z" level=info msg="received exit event container_id:\"814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30\" id:\"814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30\" pid:3454 exited_at:{seconds:1747478024 nanos:43219896}" May 17 10:33:44.067053 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-814094f2b069c47572d3eb66b20931badd74e24ba391be0a1086f09fe8ccbb30-rootfs.mount: Deactivated successfully. May 17 10:33:44.749820 kubelet[2722]: E0517 10:33:44.749773 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4b84" podUID="c9cc419c-75dd-4426-acd6-306f9b76a4b6" May 17 10:33:44.812058 containerd[1593]: time="2025-05-17T10:33:44.812021364Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 17 10:33:44.839844 kubelet[2722]: I0517 10:33:44.839709 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6bbf6c59cd-fmmjt" podStartSLOduration=3.708496532 podStartE2EDuration="5.839691282s" podCreationTimestamp="2025-05-17 10:33:39 +0000 UTC" firstStartedPulling="2025-05-17 10:33:40.530055611 +0000 UTC m=+15.866456201" lastFinishedPulling="2025-05-17 10:33:42.661250361 +0000 UTC m=+17.997650951" observedRunningTime="2025-05-17 10:33:42.825366202 +0000 UTC m=+18.161766782" watchObservedRunningTime="2025-05-17 10:33:44.839691282 +0000 UTC 
m=+20.176091873" May 17 10:33:46.749588 kubelet[2722]: E0517 10:33:46.749530 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4b84" podUID="c9cc419c-75dd-4426-acd6-306f9b76a4b6" May 17 10:33:48.751529 kubelet[2722]: E0517 10:33:48.751480 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4b84" podUID="c9cc419c-75dd-4426-acd6-306f9b76a4b6" May 17 10:33:49.258510 containerd[1593]: time="2025-05-17T10:33:49.258454597Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:49.259352 containerd[1593]: time="2025-05-17T10:33:49.259312207Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=70300568" May 17 10:33:49.260493 containerd[1593]: time="2025-05-17T10:33:49.260469642Z" level=info msg="ImageCreate event name:\"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:49.262357 containerd[1593]: time="2025-05-17T10:33:49.262327008Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:49.262922 containerd[1593]: time="2025-05-17T10:33:49.262879070Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\", repo tag 
\"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"71793271\" in 4.450826468s" May 17 10:33:49.262922 containerd[1593]: time="2025-05-17T10:33:49.262915678Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\"" May 17 10:33:49.264882 containerd[1593]: time="2025-05-17T10:33:49.264839520Z" level=info msg="CreateContainer within sandbox \"2a019efa51aa217ab22a8276fde07ca57b594e9940ae3955a623cc963fd54681\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 17 10:33:49.274033 containerd[1593]: time="2025-05-17T10:33:49.273991185Z" level=info msg="Container a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0: CDI devices from CRI Config.CDIDevices: []" May 17 10:33:49.283626 containerd[1593]: time="2025-05-17T10:33:49.283587920Z" level=info msg="CreateContainer within sandbox \"2a019efa51aa217ab22a8276fde07ca57b594e9940ae3955a623cc963fd54681\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0\"" May 17 10:33:49.284108 containerd[1593]: time="2025-05-17T10:33:49.284046866Z" level=info msg="StartContainer for \"a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0\"" May 17 10:33:49.285351 containerd[1593]: time="2025-05-17T10:33:49.285324438Z" level=info msg="connecting to shim a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0" address="unix:///run/containerd/s/adf07145e5045271c74b1f2ad5294a33c01be0ad51169b8dc883d2ca554e5f25" protocol=ttrpc version=3 May 17 10:33:49.307636 systemd[1]: Started cri-containerd-a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0.scope - libcontainer container a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0. 
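[Editor's note] The "Pulled image" entry above reports both the image size and the elapsed pull time; dividing the two (numbers copied from the log) gives the effective pull throughput, roughly 16 MB/s:

```go
package main

import "fmt"

func main() {
	// Numbers from the "Pulled image .../calico/cni:v3.30.0" line above.
	const sizeBytes = 71793271  // reported image size in bytes
	const elapsed = 4.450826468 // reported duration in seconds

	mbps := float64(sizeBytes) / elapsed / 1e6
	fmt.Printf("effective pull throughput: %.1f MB/s\n", mbps)
}
```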
May 17 10:33:49.351438 containerd[1593]: time="2025-05-17T10:33:49.351383112Z" level=info msg="StartContainer for \"a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0\" returns successfully" May 17 10:33:50.751139 kubelet[2722]: E0517 10:33:50.751099 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4b84" podUID="c9cc419c-75dd-4426-acd6-306f9b76a4b6" May 17 10:33:51.929546 containerd[1593]: time="2025-05-17T10:33:51.929497111Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 17 10:33:51.932996 systemd[1]: cri-containerd-a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0.scope: Deactivated successfully. May 17 10:33:51.933354 systemd[1]: cri-containerd-a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0.scope: Consumed 577ms CPU time, 176.7M memory peak, 4M read from disk, 170.9M written to disk. 
May 17 10:33:51.934749 containerd[1593]: time="2025-05-17T10:33:51.934714442Z" level=info msg="received exit event container_id:\"a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0\" id:\"a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0\" pid:3513 exited_at:{seconds:1747478031 nanos:934444713}" May 17 10:33:51.934818 containerd[1593]: time="2025-05-17T10:33:51.934768995Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0\" id:\"a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0\" pid:3513 exited_at:{seconds:1747478031 nanos:934444713}" May 17 10:33:51.953973 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a19edfab529432774e96c0264f345cdf9e187dd969b72288c9e985be83d868b0-rootfs.mount: Deactivated successfully. May 17 10:33:51.982539 kubelet[2722]: I0517 10:33:51.982515 2722 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 17 10:33:52.027298 kubelet[2722]: I0517 10:33:52.027270 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmtjn\" (UniqueName: \"kubernetes.io/projected/dec0eedc-8551-4c18-a24b-b1f66271e3a3-kube-api-access-hmtjn\") pod \"coredns-7c65d6cfc9-gqhvf\" (UID: \"dec0eedc-8551-4c18-a24b-b1f66271e3a3\") " pod="kube-system/coredns-7c65d6cfc9-gqhvf" May 17 10:33:52.027952 kubelet[2722]: I0517 10:33:52.027642 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/128445f0-2892-4b37-8390-5c9d48c00340-whisker-ca-bundle\") pod \"whisker-5cbfc6f858-vwgx7\" (UID: \"128445f0-2892-4b37-8390-5c9d48c00340\") " pod="calico-system/whisker-5cbfc6f858-vwgx7" May 17 10:33:52.027952 kubelet[2722]: I0517 10:33:52.027664 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-v7dn7\" (UniqueName: \"kubernetes.io/projected/a7e72c27-70ce-4980-bd76-76af4f838036-kube-api-access-v7dn7\") pod \"calico-apiserver-58bc8d76c5-8mjx6\" (UID: \"a7e72c27-70ce-4980-bd76-76af4f838036\") " pod="calico-apiserver/calico-apiserver-58bc8d76c5-8mjx6" May 17 10:33:52.027952 kubelet[2722]: I0517 10:33:52.027677 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/065bd371-4a25-4865-bbfe-0d2b88d6ea40-goldmane-key-pair\") pod \"goldmane-8f77d7b6c-b6dz4\" (UID: \"065bd371-4a25-4865-bbfe-0d2b88d6ea40\") " pod="calico-system/goldmane-8f77d7b6c-b6dz4" May 17 10:33:52.027952 kubelet[2722]: I0517 10:33:52.027692 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dec0eedc-8551-4c18-a24b-b1f66271e3a3-config-volume\") pod \"coredns-7c65d6cfc9-gqhvf\" (UID: \"dec0eedc-8551-4c18-a24b-b1f66271e3a3\") " pod="kube-system/coredns-7c65d6cfc9-gqhvf" May 17 10:33:52.027952 kubelet[2722]: I0517 10:33:52.027706 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31caddae-b106-4cda-83c6-e178d2c2a47b-tigera-ca-bundle\") pod \"calico-kube-controllers-9f7d64749-9p6j6\" (UID: \"31caddae-b106-4cda-83c6-e178d2c2a47b\") " pod="calico-system/calico-kube-controllers-9f7d64749-9p6j6" May 17 10:33:52.028120 kubelet[2722]: I0517 10:33:52.027720 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjdp\" (UniqueName: \"kubernetes.io/projected/31caddae-b106-4cda-83c6-e178d2c2a47b-kube-api-access-fjjdp\") pod \"calico-kube-controllers-9f7d64749-9p6j6\" (UID: \"31caddae-b106-4cda-83c6-e178d2c2a47b\") " pod="calico-system/calico-kube-controllers-9f7d64749-9p6j6" May 17 10:33:52.028120 kubelet[2722]: 
I0517 10:33:52.027735 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/065bd371-4a25-4865-bbfe-0d2b88d6ea40-goldmane-ca-bundle\") pod \"goldmane-8f77d7b6c-b6dz4\" (UID: \"065bd371-4a25-4865-bbfe-0d2b88d6ea40\") " pod="calico-system/goldmane-8f77d7b6c-b6dz4" May 17 10:33:52.028120 kubelet[2722]: I0517 10:33:52.027749 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae-config-volume\") pod \"coredns-7c65d6cfc9-8tc7w\" (UID: \"7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae\") " pod="kube-system/coredns-7c65d6cfc9-8tc7w" May 17 10:33:52.028120 kubelet[2722]: I0517 10:33:52.027765 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/128445f0-2892-4b37-8390-5c9d48c00340-whisker-backend-key-pair\") pod \"whisker-5cbfc6f858-vwgx7\" (UID: \"128445f0-2892-4b37-8390-5c9d48c00340\") " pod="calico-system/whisker-5cbfc6f858-vwgx7" May 17 10:33:52.028120 kubelet[2722]: I0517 10:33:52.027779 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dng6m\" (UniqueName: \"kubernetes.io/projected/7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae-kube-api-access-dng6m\") pod \"coredns-7c65d6cfc9-8tc7w\" (UID: \"7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae\") " pod="kube-system/coredns-7c65d6cfc9-8tc7w" May 17 10:33:52.028236 kubelet[2722]: I0517 10:33:52.027794 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8p5n\" (UniqueName: \"kubernetes.io/projected/065bd371-4a25-4865-bbfe-0d2b88d6ea40-kube-api-access-t8p5n\") pod \"goldmane-8f77d7b6c-b6dz4\" (UID: \"065bd371-4a25-4865-bbfe-0d2b88d6ea40\") " 
pod="calico-system/goldmane-8f77d7b6c-b6dz4" May 17 10:33:52.028236 kubelet[2722]: I0517 10:33:52.027808 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk8lz\" (UniqueName: \"kubernetes.io/projected/128445f0-2892-4b37-8390-5c9d48c00340-kube-api-access-zk8lz\") pod \"whisker-5cbfc6f858-vwgx7\" (UID: \"128445f0-2892-4b37-8390-5c9d48c00340\") " pod="calico-system/whisker-5cbfc6f858-vwgx7" May 17 10:33:52.028236 kubelet[2722]: I0517 10:33:52.027823 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5ll8\" (UniqueName: \"kubernetes.io/projected/0ecfc371-19f9-4a1e-ad70-e6bcdd45e502-kube-api-access-g5ll8\") pod \"calico-apiserver-58bc8d76c5-t7rpr\" (UID: \"0ecfc371-19f9-4a1e-ad70-e6bcdd45e502\") " pod="calico-apiserver/calico-apiserver-58bc8d76c5-t7rpr" May 17 10:33:52.028236 kubelet[2722]: I0517 10:33:52.027837 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a7e72c27-70ce-4980-bd76-76af4f838036-calico-apiserver-certs\") pod \"calico-apiserver-58bc8d76c5-8mjx6\" (UID: \"a7e72c27-70ce-4980-bd76-76af4f838036\") " pod="calico-apiserver/calico-apiserver-58bc8d76c5-8mjx6" May 17 10:33:52.028236 kubelet[2722]: I0517 10:33:52.027851 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0ecfc371-19f9-4a1e-ad70-e6bcdd45e502-calico-apiserver-certs\") pod \"calico-apiserver-58bc8d76c5-t7rpr\" (UID: \"0ecfc371-19f9-4a1e-ad70-e6bcdd45e502\") " pod="calico-apiserver/calico-apiserver-58bc8d76c5-t7rpr" May 17 10:33:52.028347 kubelet[2722]: I0517 10:33:52.027864 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/065bd371-4a25-4865-bbfe-0d2b88d6ea40-config\") pod \"goldmane-8f77d7b6c-b6dz4\" (UID: \"065bd371-4a25-4865-bbfe-0d2b88d6ea40\") " pod="calico-system/goldmane-8f77d7b6c-b6dz4" May 17 10:33:52.033606 systemd[1]: Created slice kubepods-burstable-poddec0eedc_8551_4c18_a24b_b1f66271e3a3.slice - libcontainer container kubepods-burstable-poddec0eedc_8551_4c18_a24b_b1f66271e3a3.slice. May 17 10:33:52.043264 systemd[1]: Created slice kubepods-besteffort-pod31caddae_b106_4cda_83c6_e178d2c2a47b.slice - libcontainer container kubepods-besteffort-pod31caddae_b106_4cda_83c6_e178d2c2a47b.slice. May 17 10:33:52.051352 systemd[1]: Created slice kubepods-burstable-pod7dd6efa3_b7d1_4b38_8155_4bdfa4d50cae.slice - libcontainer container kubepods-burstable-pod7dd6efa3_b7d1_4b38_8155_4bdfa4d50cae.slice. May 17 10:33:52.058985 systemd[1]: Created slice kubepods-besteffort-pod0ecfc371_19f9_4a1e_ad70_e6bcdd45e502.slice - libcontainer container kubepods-besteffort-pod0ecfc371_19f9_4a1e_ad70_e6bcdd45e502.slice. May 17 10:33:52.064969 systemd[1]: Created slice kubepods-besteffort-poda7e72c27_70ce_4980_bd76_76af4f838036.slice - libcontainer container kubepods-besteffort-poda7e72c27_70ce_4980_bd76_76af4f838036.slice. May 17 10:33:52.071548 systemd[1]: Created slice kubepods-besteffort-pod128445f0_2892_4b37_8390_5c9d48c00340.slice - libcontainer container kubepods-besteffort-pod128445f0_2892_4b37_8390_5c9d48c00340.slice. May 17 10:33:52.076596 systemd[1]: Created slice kubepods-besteffort-pod065bd371_4a25_4865_bbfe_0d2b88d6ea40.slice - libcontainer container kubepods-besteffort-pod065bd371_4a25_4865_bbfe_0d2b88d6ea40.slice. 
May 17 10:33:52.340062 containerd[1593]: time="2025-05-17T10:33:52.339933018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-gqhvf,Uid:dec0eedc-8551-4c18-a24b-b1f66271e3a3,Namespace:kube-system,Attempt:0,}" May 17 10:33:52.348559 containerd[1593]: time="2025-05-17T10:33:52.348524629Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9f7d64749-9p6j6,Uid:31caddae-b106-4cda-83c6-e178d2c2a47b,Namespace:calico-system,Attempt:0,}" May 17 10:33:52.355283 containerd[1593]: time="2025-05-17T10:33:52.355247747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8tc7w,Uid:7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae,Namespace:kube-system,Attempt:0,}" May 17 10:33:52.365639 containerd[1593]: time="2025-05-17T10:33:52.365328746Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58bc8d76c5-t7rpr,Uid:0ecfc371-19f9-4a1e-ad70-e6bcdd45e502,Namespace:calico-apiserver,Attempt:0,}" May 17 10:33:52.369092 containerd[1593]: time="2025-05-17T10:33:52.369070973Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58bc8d76c5-8mjx6,Uid:a7e72c27-70ce-4980-bd76-76af4f838036,Namespace:calico-apiserver,Attempt:0,}" May 17 10:33:52.374669 containerd[1593]: time="2025-05-17T10:33:52.374632890Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cbfc6f858-vwgx7,Uid:128445f0-2892-4b37-8390-5c9d48c00340,Namespace:calico-system,Attempt:0,}" May 17 10:33:52.379293 containerd[1593]: time="2025-05-17T10:33:52.379238806Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-b6dz4,Uid:065bd371-4a25-4865-bbfe-0d2b88d6ea40,Namespace:calico-system,Attempt:0,}" May 17 10:33:52.451134 containerd[1593]: time="2025-05-17T10:33:52.451001266Z" level=error msg="Failed to destroy network for sandbox \"36d03c54228a65db56fdaef717923a487ad331503b262db6f696cf6bfbaaffc5\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.462099 containerd[1593]: time="2025-05-17T10:33:52.462016106Z" level=error msg="Failed to destroy network for sandbox \"16d61a175966a761e2d98b01214b55da4ee6137a6ecfc6424e01d06113929a65\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.463496 containerd[1593]: time="2025-05-17T10:33:52.463454558Z" level=error msg="Failed to destroy network for sandbox \"8b313fe1aff524c922049ba01ff3737bc0bec7ee52f1658f13acc1cfd3b8540f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.479667 containerd[1593]: time="2025-05-17T10:33:52.479616825Z" level=error msg="Failed to destroy network for sandbox \"12b1c0808b35997ad9a6f770ccf0765490f2f0b73e7b3dee047c8f0e48385321\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.484663 containerd[1593]: time="2025-05-17T10:33:52.484428317Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9f7d64749-9p6j6,Uid:31caddae-b106-4cda-83c6-e178d2c2a47b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"36d03c54228a65db56fdaef717923a487ad331503b262db6f696cf6bfbaaffc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.484663 containerd[1593]: time="2025-05-17T10:33:52.484499161Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8tc7w,Uid:7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b313fe1aff524c922049ba01ff3737bc0bec7ee52f1658f13acc1cfd3b8540f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.484663 containerd[1593]: time="2025-05-17T10:33:52.484520922Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-gqhvf,Uid:dec0eedc-8551-4c18-a24b-b1f66271e3a3,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"16d61a175966a761e2d98b01214b55da4ee6137a6ecfc6424e01d06113929a65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.485018 containerd[1593]: time="2025-05-17T10:33:52.484447814Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58bc8d76c5-8mjx6,Uid:a7e72c27-70ce-4980-bd76-76af4f838036,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"12b1c0808b35997ad9a6f770ccf0765490f2f0b73e7b3dee047c8f0e48385321\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.485564 containerd[1593]: time="2025-05-17T10:33:52.485524224Z" level=error msg="Failed to destroy network for sandbox \"e3ee331f398b75b497c7ae74dbf2a0c7ebee0ee4d323346c32745ccde2b496e7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" May 17 10:33:52.487677 containerd[1593]: time="2025-05-17T10:33:52.487650063Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58bc8d76c5-t7rpr,Uid:0ecfc371-19f9-4a1e-ad70-e6bcdd45e502,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3ee331f398b75b497c7ae74dbf2a0c7ebee0ee4d323346c32745ccde2b496e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.488250 containerd[1593]: time="2025-05-17T10:33:52.488221350Z" level=error msg="Failed to destroy network for sandbox \"e976ebd11045ed82a79b34df810793a2bf3ab13128693513671b46360f7689b5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.489497 containerd[1593]: time="2025-05-17T10:33:52.489453874Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-b6dz4,Uid:065bd371-4a25-4865-bbfe-0d2b88d6ea40,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e976ebd11045ed82a79b34df810793a2bf3ab13128693513671b46360f7689b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.495005 kubelet[2722]: E0517 10:33:52.494626 2722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36d03c54228a65db56fdaef717923a487ad331503b262db6f696cf6bfbaaffc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" May 17 10:33:52.495005 kubelet[2722]: E0517 10:33:52.494663 2722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12b1c0808b35997ad9a6f770ccf0765490f2f0b73e7b3dee047c8f0e48385321\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.495005 kubelet[2722]: E0517 10:33:52.494646 2722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16d61a175966a761e2d98b01214b55da4ee6137a6ecfc6424e01d06113929a65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.495005 kubelet[2722]: E0517 10:33:52.494686 2722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3ee331f398b75b497c7ae74dbf2a0c7ebee0ee4d323346c32745ccde2b496e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.495269 kubelet[2722]: E0517 10:33:52.494708 2722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36d03c54228a65db56fdaef717923a487ad331503b262db6f696cf6bfbaaffc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-9f7d64749-9p6j6" May 17 10:33:52.495269 kubelet[2722]: E0517 10:33:52.494715 2722 kuberuntime_sandbox.go:72] "Failed to create 
sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3ee331f398b75b497c7ae74dbf2a0c7ebee0ee4d323346c32745ccde2b496e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-58bc8d76c5-t7rpr" May 17 10:33:52.495269 kubelet[2722]: E0517 10:33:52.494727 2722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36d03c54228a65db56fdaef717923a487ad331503b262db6f696cf6bfbaaffc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-9f7d64749-9p6j6" May 17 10:33:52.495269 kubelet[2722]: E0517 10:33:52.494731 2722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3ee331f398b75b497c7ae74dbf2a0c7ebee0ee4d323346c32745ccde2b496e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-58bc8d76c5-t7rpr" May 17 10:33:52.495366 kubelet[2722]: E0517 10:33:52.494767 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-58bc8d76c5-t7rpr_calico-apiserver(0ecfc371-19f9-4a1e-ad70-e6bcdd45e502)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-58bc8d76c5-t7rpr_calico-apiserver(0ecfc371-19f9-4a1e-ad70-e6bcdd45e502)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e3ee331f398b75b497c7ae74dbf2a0c7ebee0ee4d323346c32745ccde2b496e7\\\": plugin type=\\\"calico\\\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-58bc8d76c5-t7rpr" podUID="0ecfc371-19f9-4a1e-ad70-e6bcdd45e502" May 17 10:33:52.495366 kubelet[2722]: E0517 10:33:52.494782 2722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16d61a175966a761e2d98b01214b55da4ee6137a6ecfc6424e01d06113929a65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-gqhvf" May 17 10:33:52.495366 kubelet[2722]: E0517 10:33:52.494818 2722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b313fe1aff524c922049ba01ff3737bc0bec7ee52f1658f13acc1cfd3b8540f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.495474 kubelet[2722]: E0517 10:33:52.494848 2722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16d61a175966a761e2d98b01214b55da4ee6137a6ecfc6424e01d06113929a65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-gqhvf" May 17 10:33:52.495474 kubelet[2722]: E0517 10:33:52.494856 2722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b313fe1aff524c922049ba01ff3737bc0bec7ee52f1658f13acc1cfd3b8540f\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8tc7w" May 17 10:33:52.495474 kubelet[2722]: E0517 10:33:52.494767 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-9f7d64749-9p6j6_calico-system(31caddae-b106-4cda-83c6-e178d2c2a47b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-9f7d64749-9p6j6_calico-system(31caddae-b106-4cda-83c6-e178d2c2a47b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"36d03c54228a65db56fdaef717923a487ad331503b262db6f696cf6bfbaaffc5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-9f7d64749-9p6j6" podUID="31caddae-b106-4cda-83c6-e178d2c2a47b" May 17 10:33:52.495563 kubelet[2722]: E0517 10:33:52.494626 2722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e976ebd11045ed82a79b34df810793a2bf3ab13128693513671b46360f7689b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.495563 kubelet[2722]: E0517 10:33:52.494803 2722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12b1c0808b35997ad9a6f770ccf0765490f2f0b73e7b3dee047c8f0e48385321\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-58bc8d76c5-8mjx6" May 17 10:33:52.495563 
kubelet[2722]: E0517 10:33:52.494884 2722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12b1c0808b35997ad9a6f770ccf0765490f2f0b73e7b3dee047c8f0e48385321\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-58bc8d76c5-8mjx6" May 17 10:33:52.495563 kubelet[2722]: E0517 10:33:52.494890 2722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e976ebd11045ed82a79b34df810793a2bf3ab13128693513671b46360f7689b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-8f77d7b6c-b6dz4" May 17 10:33:52.495659 kubelet[2722]: E0517 10:33:52.494905 2722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e976ebd11045ed82a79b34df810793a2bf3ab13128693513671b46360f7689b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-8f77d7b6c-b6dz4" May 17 10:33:52.495659 kubelet[2722]: E0517 10:33:52.494923 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-gqhvf_kube-system(dec0eedc-8551-4c18-a24b-b1f66271e3a3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-gqhvf_kube-system(dec0eedc-8551-4c18-a24b-b1f66271e3a3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"16d61a175966a761e2d98b01214b55da4ee6137a6ecfc6424e01d06113929a65\\\": 
plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-gqhvf" podUID="dec0eedc-8551-4c18-a24b-b1f66271e3a3" May 17 10:33:52.495659 kubelet[2722]: E0517 10:33:52.494942 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-8f77d7b6c-b6dz4_calico-system(065bd371-4a25-4865-bbfe-0d2b88d6ea40)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-8f77d7b6c-b6dz4_calico-system(065bd371-4a25-4865-bbfe-0d2b88d6ea40)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e976ebd11045ed82a79b34df810793a2bf3ab13128693513671b46360f7689b5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-8f77d7b6c-b6dz4" podUID="065bd371-4a25-4865-bbfe-0d2b88d6ea40" May 17 10:33:52.495756 kubelet[2722]: E0517 10:33:52.494920 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-58bc8d76c5-8mjx6_calico-apiserver(a7e72c27-70ce-4980-bd76-76af4f838036)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-58bc8d76c5-8mjx6_calico-apiserver(a7e72c27-70ce-4980-bd76-76af4f838036)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"12b1c0808b35997ad9a6f770ccf0765490f2f0b73e7b3dee047c8f0e48385321\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-58bc8d76c5-8mjx6" podUID="a7e72c27-70ce-4980-bd76-76af4f838036" May 17 10:33:52.495756 kubelet[2722]: E0517 10:33:52.494871 2722 
kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b313fe1aff524c922049ba01ff3737bc0bec7ee52f1658f13acc1cfd3b8540f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8tc7w" May 17 10:33:52.495756 kubelet[2722]: E0517 10:33:52.494999 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-8tc7w_kube-system(7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-8tc7w_kube-system(7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8b313fe1aff524c922049ba01ff3737bc0bec7ee52f1658f13acc1cfd3b8540f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-8tc7w" podUID="7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae" May 17 10:33:52.497326 containerd[1593]: time="2025-05-17T10:33:52.497282496Z" level=error msg="Failed to destroy network for sandbox \"a9de4f1892d3673e07ee30ca321b4fa80736eba589da93d1b4abf38d19d545de\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.499090 containerd[1593]: time="2025-05-17T10:33:52.498958447Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cbfc6f858-vwgx7,Uid:128445f0-2892-4b37-8390-5c9d48c00340,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"a9de4f1892d3673e07ee30ca321b4fa80736eba589da93d1b4abf38d19d545de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.499263 kubelet[2722]: E0517 10:33:52.499173 2722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9de4f1892d3673e07ee30ca321b4fa80736eba589da93d1b4abf38d19d545de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.499263 kubelet[2722]: E0517 10:33:52.499213 2722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9de4f1892d3673e07ee30ca321b4fa80736eba589da93d1b4abf38d19d545de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5cbfc6f858-vwgx7" May 17 10:33:52.499263 kubelet[2722]: E0517 10:33:52.499231 2722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9de4f1892d3673e07ee30ca321b4fa80736eba589da93d1b4abf38d19d545de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5cbfc6f858-vwgx7" May 17 10:33:52.499349 kubelet[2722]: E0517 10:33:52.499271 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5cbfc6f858-vwgx7_calico-system(128445f0-2892-4b37-8390-5c9d48c00340)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"whisker-5cbfc6f858-vwgx7_calico-system(128445f0-2892-4b37-8390-5c9d48c00340)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a9de4f1892d3673e07ee30ca321b4fa80736eba589da93d1b4abf38d19d545de\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5cbfc6f858-vwgx7" podUID="128445f0-2892-4b37-8390-5c9d48c00340" May 17 10:33:52.755969 systemd[1]: Created slice kubepods-besteffort-podc9cc419c_75dd_4426_acd6_306f9b76a4b6.slice - libcontainer container kubepods-besteffort-podc9cc419c_75dd_4426_acd6_306f9b76a4b6.slice. May 17 10:33:52.758276 containerd[1593]: time="2025-05-17T10:33:52.758240404Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b4b84,Uid:c9cc419c-75dd-4426-acd6-306f9b76a4b6,Namespace:calico-system,Attempt:0,}" May 17 10:33:52.805545 containerd[1593]: time="2025-05-17T10:33:52.805484651Z" level=error msg="Failed to destroy network for sandbox \"c7de29e9c796da4302d15746a47cec94b0ce770187f1e7385979641637b686c1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.807058 containerd[1593]: time="2025-05-17T10:33:52.806997644Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b4b84,Uid:c9cc419c-75dd-4426-acd6-306f9b76a4b6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7de29e9c796da4302d15746a47cec94b0ce770187f1e7385979641637b686c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.807281 kubelet[2722]: E0517 10:33:52.807236 2722 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7de29e9c796da4302d15746a47cec94b0ce770187f1e7385979641637b686c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 17 10:33:52.807345 kubelet[2722]: E0517 10:33:52.807293 2722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7de29e9c796da4302d15746a47cec94b0ce770187f1e7385979641637b686c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-b4b84" May 17 10:33:52.807345 kubelet[2722]: E0517 10:33:52.807315 2722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c7de29e9c796da4302d15746a47cec94b0ce770187f1e7385979641637b686c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-b4b84" May 17 10:33:52.807429 kubelet[2722]: E0517 10:33:52.807355 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-b4b84_calico-system(c9cc419c-75dd-4426-acd6-306f9b76a4b6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-b4b84_calico-system(c9cc419c-75dd-4426-acd6-306f9b76a4b6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c7de29e9c796da4302d15746a47cec94b0ce770187f1e7385979641637b686c1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-b4b84" podUID="c9cc419c-75dd-4426-acd6-306f9b76a4b6" May 17 10:33:52.830367 containerd[1593]: time="2025-05-17T10:33:52.830331684Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 17 10:33:56.850565 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3663519557.mount: Deactivated successfully. May 17 10:33:57.547851 containerd[1593]: time="2025-05-17T10:33:57.547792409Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:57.548567 containerd[1593]: time="2025-05-17T10:33:57.548455758Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=156396372" May 17 10:33:57.549638 containerd[1593]: time="2025-05-17T10:33:57.549585656Z" level=info msg="ImageCreate event name:\"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:57.551587 containerd[1593]: time="2025-05-17T10:33:57.551534736Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 17 10:33:57.552151 containerd[1593]: time="2025-05-17T10:33:57.551988861Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"156396234\" in 4.721624967s" May 17 10:33:57.552151 containerd[1593]: time="2025-05-17T10:33:57.552016343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference 
\"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\"" May 17 10:33:57.560667 containerd[1593]: time="2025-05-17T10:33:57.560613532Z" level=info msg="CreateContainer within sandbox \"2a019efa51aa217ab22a8276fde07ca57b594e9940ae3955a623cc963fd54681\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 17 10:33:57.569140 containerd[1593]: time="2025-05-17T10:33:57.569107016Z" level=info msg="Container ade0c9b7468e87fe2ae253d2c5bc98c41ef41381dc5e4572b64defa8e3f8730d: CDI devices from CRI Config.CDIDevices: []" May 17 10:33:57.632066 containerd[1593]: time="2025-05-17T10:33:57.631744951Z" level=info msg="CreateContainer within sandbox \"2a019efa51aa217ab22a8276fde07ca57b594e9940ae3955a623cc963fd54681\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ade0c9b7468e87fe2ae253d2c5bc98c41ef41381dc5e4572b64defa8e3f8730d\"" May 17 10:33:57.635126 containerd[1593]: time="2025-05-17T10:33:57.635091172Z" level=info msg="StartContainer for \"ade0c9b7468e87fe2ae253d2c5bc98c41ef41381dc5e4572b64defa8e3f8730d\"" May 17 10:33:57.636528 containerd[1593]: time="2025-05-17T10:33:57.636490387Z" level=info msg="connecting to shim ade0c9b7468e87fe2ae253d2c5bc98c41ef41381dc5e4572b64defa8e3f8730d" address="unix:///run/containerd/s/adf07145e5045271c74b1f2ad5294a33c01be0ad51169b8dc883d2ca554e5f25" protocol=ttrpc version=3 May 17 10:33:57.661518 systemd[1]: Started cri-containerd-ade0c9b7468e87fe2ae253d2c5bc98c41ef41381dc5e4572b64defa8e3f8730d.scope - libcontainer container ade0c9b7468e87fe2ae253d2c5bc98c41ef41381dc5e4572b64defa8e3f8730d. May 17 10:33:57.706172 containerd[1593]: time="2025-05-17T10:33:57.706130988Z" level=info msg="StartContainer for \"ade0c9b7468e87fe2ae253d2c5bc98c41ef41381dc5e4572b64defa8e3f8730d\" returns successfully" May 17 10:33:57.780140 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 17 10:33:57.780343 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. May 17 10:33:57.868385 kubelet[2722]: I0517 10:33:57.867626 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-79btc" podStartSLOduration=1.164135149 podStartE2EDuration="17.867606796s" podCreationTimestamp="2025-05-17 10:33:40 +0000 UTC" firstStartedPulling="2025-05-17 10:33:40.84956617 +0000 UTC m=+16.185966760" lastFinishedPulling="2025-05-17 10:33:57.553037816 +0000 UTC m=+32.889438407" observedRunningTime="2025-05-17 10:33:57.866779418 +0000 UTC m=+33.203179998" watchObservedRunningTime="2025-05-17 10:33:57.867606796 +0000 UTC m=+33.204007386" May 17 10:33:57.966414 kubelet[2722]: I0517 10:33:57.966278 2722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/128445f0-2892-4b37-8390-5c9d48c00340-whisker-backend-key-pair\") pod \"128445f0-2892-4b37-8390-5c9d48c00340\" (UID: \"128445f0-2892-4b37-8390-5c9d48c00340\") " May 17 10:33:57.966414 kubelet[2722]: I0517 10:33:57.966327 2722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk8lz\" (UniqueName: \"kubernetes.io/projected/128445f0-2892-4b37-8390-5c9d48c00340-kube-api-access-zk8lz\") pod \"128445f0-2892-4b37-8390-5c9d48c00340\" (UID: \"128445f0-2892-4b37-8390-5c9d48c00340\") " May 17 10:33:57.966414 kubelet[2722]: I0517 10:33:57.966344 2722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/128445f0-2892-4b37-8390-5c9d48c00340-whisker-ca-bundle\") pod \"128445f0-2892-4b37-8390-5c9d48c00340\" (UID: \"128445f0-2892-4b37-8390-5c9d48c00340\") " May 17 10:33:57.973415 kubelet[2722]: I0517 10:33:57.970328 2722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/128445f0-2892-4b37-8390-5c9d48c00340-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod 
"128445f0-2892-4b37-8390-5c9d48c00340" (UID: "128445f0-2892-4b37-8390-5c9d48c00340"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" May 17 10:33:57.976515 systemd[1]: var-lib-kubelet-pods-128445f0\x2d2892\x2d4b37\x2d8390\x2d5c9d48c00340-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzk8lz.mount: Deactivated successfully. May 17 10:33:57.979429 kubelet[2722]: I0517 10:33:57.978901 2722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128445f0-2892-4b37-8390-5c9d48c00340-kube-api-access-zk8lz" (OuterVolumeSpecName: "kube-api-access-zk8lz") pod "128445f0-2892-4b37-8390-5c9d48c00340" (UID: "128445f0-2892-4b37-8390-5c9d48c00340"). InnerVolumeSpecName "kube-api-access-zk8lz". PluginName "kubernetes.io/projected", VolumeGidValue "" May 17 10:33:57.980535 kubelet[2722]: I0517 10:33:57.980501 2722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/128445f0-2892-4b37-8390-5c9d48c00340-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "128445f0-2892-4b37-8390-5c9d48c00340" (UID: "128445f0-2892-4b37-8390-5c9d48c00340"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" May 17 10:33:57.980785 systemd[1]: var-lib-kubelet-pods-128445f0\x2d2892\x2d4b37\x2d8390\x2d5c9d48c00340-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
May 17 10:33:58.070493 kubelet[2722]: I0517 10:33:58.070437 2722 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/128445f0-2892-4b37-8390-5c9d48c00340-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" May 17 10:33:58.070493 kubelet[2722]: I0517 10:33:58.070469 2722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk8lz\" (UniqueName: \"kubernetes.io/projected/128445f0-2892-4b37-8390-5c9d48c00340-kube-api-access-zk8lz\") on node \"localhost\" DevicePath \"\"" May 17 10:33:58.070493 kubelet[2722]: I0517 10:33:58.070478 2722 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/128445f0-2892-4b37-8390-5c9d48c00340-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" May 17 10:33:58.765096 systemd[1]: Removed slice kubepods-besteffort-pod128445f0_2892_4b37_8390_5c9d48c00340.slice - libcontainer container kubepods-besteffort-pod128445f0_2892_4b37_8390_5c9d48c00340.slice. May 17 10:33:58.904682 systemd[1]: Created slice kubepods-besteffort-pod6f72511d_6a3c_4ed6_87bd_7f2878038cc5.slice - libcontainer container kubepods-besteffort-pod6f72511d_6a3c_4ed6_87bd_7f2878038cc5.slice. 
May 17 10:33:58.974043 kubelet[2722]: I0517 10:33:58.973991 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r95h\" (UniqueName: \"kubernetes.io/projected/6f72511d-6a3c-4ed6-87bd-7f2878038cc5-kube-api-access-7r95h\") pod \"whisker-6b77d4f8d-qhcgn\" (UID: \"6f72511d-6a3c-4ed6-87bd-7f2878038cc5\") " pod="calico-system/whisker-6b77d4f8d-qhcgn" May 17 10:33:58.974043 kubelet[2722]: I0517 10:33:58.974038 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6f72511d-6a3c-4ed6-87bd-7f2878038cc5-whisker-backend-key-pair\") pod \"whisker-6b77d4f8d-qhcgn\" (UID: \"6f72511d-6a3c-4ed6-87bd-7f2878038cc5\") " pod="calico-system/whisker-6b77d4f8d-qhcgn" May 17 10:33:58.974520 kubelet[2722]: I0517 10:33:58.974059 2722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f72511d-6a3c-4ed6-87bd-7f2878038cc5-whisker-ca-bundle\") pod \"whisker-6b77d4f8d-qhcgn\" (UID: \"6f72511d-6a3c-4ed6-87bd-7f2878038cc5\") " pod="calico-system/whisker-6b77d4f8d-qhcgn" May 17 10:33:59.211013 containerd[1593]: time="2025-05-17T10:33:59.210971339Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b77d4f8d-qhcgn,Uid:6f72511d-6a3c-4ed6-87bd-7f2878038cc5,Namespace:calico-system,Attempt:0,}" May 17 10:33:59.437782 systemd-networkd[1506]: cali2e3f0508ab7: Link UP May 17 10:33:59.437996 systemd-networkd[1506]: cali2e3f0508ab7: Gained carrier May 17 10:33:59.450680 containerd[1593]: 2025-05-17 10:33:59.318 [INFO][3997] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 17 10:33:59.450680 containerd[1593]: 2025-05-17 10:33:59.334 [INFO][3997] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0 
whisker-6b77d4f8d- calico-system 6f72511d-6a3c-4ed6-87bd-7f2878038cc5 850 0 2025-05-17 10:33:58 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6b77d4f8d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-6b77d4f8d-qhcgn eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali2e3f0508ab7 [] [] }} ContainerID="6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" Namespace="calico-system" Pod="whisker-6b77d4f8d-qhcgn" WorkloadEndpoint="localhost-k8s-whisker--6b77d4f8d--qhcgn-" May 17 10:33:59.450680 containerd[1593]: 2025-05-17 10:33:59.334 [INFO][3997] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" Namespace="calico-system" Pod="whisker-6b77d4f8d-qhcgn" WorkloadEndpoint="localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0" May 17 10:33:59.450680 containerd[1593]: 2025-05-17 10:33:59.396 [INFO][4012] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" HandleID="k8s-pod-network.6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" Workload="localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0" May 17 10:33:59.450890 containerd[1593]: 2025-05-17 10:33:59.397 [INFO][4012] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" HandleID="k8s-pod-network.6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" Workload="localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003df640), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-6b77d4f8d-qhcgn", "timestamp":"2025-05-17 10:33:59.396441522 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 17 10:33:59.450890 containerd[1593]: 2025-05-17 10:33:59.397 [INFO][4012] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 17 10:33:59.450890 containerd[1593]: 2025-05-17 10:33:59.398 [INFO][4012] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 17 10:33:59.450890 containerd[1593]: 2025-05-17 10:33:59.398 [INFO][4012] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 17 10:33:59.450890 containerd[1593]: 2025-05-17 10:33:59.406 [INFO][4012] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" host="localhost" May 17 10:33:59.450890 containerd[1593]: 2025-05-17 10:33:59.411 [INFO][4012] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 17 10:33:59.450890 containerd[1593]: 2025-05-17 10:33:59.414 [INFO][4012] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 17 10:33:59.450890 containerd[1593]: 2025-05-17 10:33:59.416 [INFO][4012] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 17 10:33:59.450890 containerd[1593]: 2025-05-17 10:33:59.417 [INFO][4012] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 17 10:33:59.450890 containerd[1593]: 2025-05-17 10:33:59.417 [INFO][4012] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" host="localhost" May 17 10:33:59.451112 containerd[1593]: 2025-05-17 10:33:59.418 [INFO][4012] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49 May 17 10:33:59.451112 
containerd[1593]: 2025-05-17 10:33:59.421 [INFO][4012] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" host="localhost" May 17 10:33:59.451112 containerd[1593]: 2025-05-17 10:33:59.426 [INFO][4012] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" host="localhost" May 17 10:33:59.451112 containerd[1593]: 2025-05-17 10:33:59.426 [INFO][4012] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" host="localhost" May 17 10:33:59.451112 containerd[1593]: 2025-05-17 10:33:59.426 [INFO][4012] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 17 10:33:59.451112 containerd[1593]: 2025-05-17 10:33:59.426 [INFO][4012] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" HandleID="k8s-pod-network.6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" Workload="localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0" May 17 10:33:59.451235 containerd[1593]: 2025-05-17 10:33:59.430 [INFO][3997] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" Namespace="calico-system" Pod="whisker-6b77d4f8d-qhcgn" WorkloadEndpoint="localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0", GenerateName:"whisker-6b77d4f8d-", Namespace:"calico-system", SelfLink:"", UID:"6f72511d-6a3c-4ed6-87bd-7f2878038cc5", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, 
time.May, 17, 10, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6b77d4f8d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-6b77d4f8d-qhcgn", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2e3f0508ab7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:33:59.451235 containerd[1593]: 2025-05-17 10:33:59.430 [INFO][3997] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" Namespace="calico-system" Pod="whisker-6b77d4f8d-qhcgn" WorkloadEndpoint="localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0" May 17 10:33:59.451305 containerd[1593]: 2025-05-17 10:33:59.430 [INFO][3997] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2e3f0508ab7 ContainerID="6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" Namespace="calico-system" Pod="whisker-6b77d4f8d-qhcgn" WorkloadEndpoint="localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0" May 17 10:33:59.451305 containerd[1593]: 2025-05-17 10:33:59.439 [INFO][3997] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" Namespace="calico-system" Pod="whisker-6b77d4f8d-qhcgn" 
WorkloadEndpoint="localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0" May 17 10:33:59.451348 containerd[1593]: 2025-05-17 10:33:59.439 [INFO][3997] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" Namespace="calico-system" Pod="whisker-6b77d4f8d-qhcgn" WorkloadEndpoint="localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0", GenerateName:"whisker-6b77d4f8d-", Namespace:"calico-system", SelfLink:"", UID:"6f72511d-6a3c-4ed6-87bd-7f2878038cc5", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6b77d4f8d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49", Pod:"whisker-6b77d4f8d-qhcgn", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2e3f0508ab7", MAC:"4a:dc:ad:b7:15:db", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:33:59.451427 containerd[1593]: 2025-05-17 10:33:59.447 [INFO][3997] cni-plugin/k8s.go 532: Wrote 
updated endpoint to datastore ContainerID="6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" Namespace="calico-system" Pod="whisker-6b77d4f8d-qhcgn" WorkloadEndpoint="localhost-k8s-whisker--6b77d4f8d--qhcgn-eth0" May 17 10:33:59.542893 containerd[1593]: time="2025-05-17T10:33:59.542768523Z" level=info msg="connecting to shim 6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49" address="unix:///run/containerd/s/dc232c45e8b3ae68073e592614179df4da8f3cb1b771dac4759ee17cd13d6fdb" namespace=k8s.io protocol=ttrpc version=3 May 17 10:33:59.577586 systemd[1]: Started cri-containerd-6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49.scope - libcontainer container 6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49. May 17 10:33:59.591201 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 17 10:33:59.623171 containerd[1593]: time="2025-05-17T10:33:59.623113737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b77d4f8d-qhcgn,Uid:6f72511d-6a3c-4ed6-87bd-7f2878038cc5,Namespace:calico-system,Attempt:0,} returns sandbox id \"6b442c45f93393fe7989a384fd2b05a5995811065ac32c7ec081b4d8dc88de49\"" May 17 10:33:59.624916 containerd[1593]: time="2025-05-17T10:33:59.624881455Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 17 10:33:59.908718 containerd[1593]: time="2025-05-17T10:33:59.908574923Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 17 10:33:59.930796 containerd[1593]: time="2025-05-17T10:33:59.930730807Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 17 10:33:59.938541 containerd[1593]: time="2025-05-17T10:33:59.938468162Z" level=error 
msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 17 10:33:59.938742 kubelet[2722]: E0517 10:33:59.938694 2722 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 17 10:33:59.938854 kubelet[2722]: E0517 10:33:59.938746 2722 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 17 10:33:59.940867 kubelet[2722]: E0517 10:33:59.940813 2722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:265c6d0b859b42aa9e883fb177ea14f8,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7r95h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b77d4f8d-qhcgn_calico-system(6f72511d-6a3c-4ed6-87bd-7f2878038cc5): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 17 10:33:59.942913 containerd[1593]: 
time="2025-05-17T10:33:59.942877241Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 17 10:34:00.188605 containerd[1593]: time="2025-05-17T10:34:00.188562948Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 17 10:34:00.189774 containerd[1593]: time="2025-05-17T10:34:00.189738190Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 17 10:34:00.189832 containerd[1593]: time="2025-05-17T10:34:00.189808753Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 17 10:34:00.190067 kubelet[2722]: E0517 10:34:00.190005 2722 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 17 10:34:00.190358 kubelet[2722]: E0517 10:34:00.190074 2722 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 17 10:34:00.190425 kubelet[2722]: E0517 10:34:00.190258 2722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7r95h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMou
nt:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b77d4f8d-qhcgn_calico-system(6f72511d-6a3c-4ed6-87bd-7f2878038cc5): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 17 10:34:00.192971 kubelet[2722]: E0517 10:34:00.192927 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6b77d4f8d-qhcgn" podUID="6f72511d-6a3c-4ed6-87bd-7f2878038cc5" May 17 10:34:00.752590 kubelet[2722]: I0517 10:34:00.752546 2722 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128445f0-2892-4b37-8390-5c9d48c00340" path="/var/lib/kubelet/pods/128445f0-2892-4b37-8390-5c9d48c00340/volumes" May 17 10:34:00.758590 systemd-networkd[1506]: cali2e3f0508ab7: Gained IPv6LL May 17 10:34:00.859816 kubelet[2722]: E0517 10:34:00.859778 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-6b77d4f8d-qhcgn" podUID="6f72511d-6a3c-4ed6-87bd-7f2878038cc5" May 17 10:34:01.434274 systemd[1]: Started sshd@7-10.0.0.118:22-10.0.0.1:57784.service - OpenSSH per-connection server daemon (10.0.0.1:57784). May 17 10:34:01.474220 sshd[4127]: Accepted publickey for core from 10.0.0.1 port 57784 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:01.475522 sshd-session[4127]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:01.479692 systemd-logind[1585]: New session 8 of user core. May 17 10:34:01.489516 systemd[1]: Started session-8.scope - Session 8 of User core. May 17 10:34:01.612828 sshd[4129]: Connection closed by 10.0.0.1 port 57784 May 17 10:34:01.613111 sshd-session[4127]: pam_unix(sshd:session): session closed for user core May 17 10:34:01.617324 systemd[1]: sshd@7-10.0.0.118:22-10.0.0.1:57784.service: Deactivated successfully. May 17 10:34:01.619474 systemd[1]: session-8.scope: Deactivated successfully. May 17 10:34:01.620474 systemd-logind[1585]: Session 8 logged out. Waiting for processes to exit. May 17 10:34:01.621804 systemd-logind[1585]: Removed session 8. 
May 17 10:34:02.750319 containerd[1593]: time="2025-05-17T10:34:02.750278585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-gqhvf,Uid:dec0eedc-8551-4c18-a24b-b1f66271e3a3,Namespace:kube-system,Attempt:0,}" May 17 10:34:02.864545 kubelet[2722]: I0517 10:34:02.864417 2722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 17 10:34:02.921482 systemd-networkd[1506]: calid80119ca0f1: Link UP May 17 10:34:02.922226 systemd-networkd[1506]: calid80119ca0f1: Gained carrier May 17 10:34:02.936355 containerd[1593]: 2025-05-17 10:34:02.848 [INFO][4172] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 17 10:34:02.936355 containerd[1593]: 2025-05-17 10:34:02.858 [INFO][4172] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0 coredns-7c65d6cfc9- kube-system dec0eedc-8551-4c18-a24b-b1f66271e3a3 777 0 2025-05-17 10:33:30 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-gqhvf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid80119ca0f1 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gqhvf" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gqhvf-" May 17 10:34:02.936355 containerd[1593]: 2025-05-17 10:34:02.858 [INFO][4172] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gqhvf" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0" May 17 10:34:02.936355 containerd[1593]: 2025-05-17 10:34:02.885 [INFO][4189] ipam/ipam_plugin.go 
225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" HandleID="k8s-pod-network.4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" Workload="localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0" May 17 10:34:02.936625 containerd[1593]: 2025-05-17 10:34:02.886 [INFO][4189] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" HandleID="k8s-pod-network.4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" Workload="localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003a71b0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-gqhvf", "timestamp":"2025-05-17 10:34:02.885887372 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 17 10:34:02.936625 containerd[1593]: 2025-05-17 10:34:02.886 [INFO][4189] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 17 10:34:02.936625 containerd[1593]: 2025-05-17 10:34:02.886 [INFO][4189] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 17 10:34:02.936625 containerd[1593]: 2025-05-17 10:34:02.886 [INFO][4189] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 17 10:34:02.936625 containerd[1593]: 2025-05-17 10:34:02.893 [INFO][4189] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" host="localhost" May 17 10:34:02.936625 containerd[1593]: 2025-05-17 10:34:02.899 [INFO][4189] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 17 10:34:02.936625 containerd[1593]: 2025-05-17 10:34:02.902 [INFO][4189] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 17 10:34:02.936625 containerd[1593]: 2025-05-17 10:34:02.904 [INFO][4189] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 17 10:34:02.936625 containerd[1593]: 2025-05-17 10:34:02.905 [INFO][4189] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 17 10:34:02.936625 containerd[1593]: 2025-05-17 10:34:02.905 [INFO][4189] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" host="localhost" May 17 10:34:02.936919 containerd[1593]: 2025-05-17 10:34:02.907 [INFO][4189] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464 May 17 10:34:02.936919 containerd[1593]: 2025-05-17 10:34:02.911 [INFO][4189] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" host="localhost" May 17 10:34:02.936919 containerd[1593]: 2025-05-17 10:34:02.915 [INFO][4189] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" host="localhost" May 17 10:34:02.936919 containerd[1593]: 2025-05-17 10:34:02.915 [INFO][4189] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" host="localhost" May 17 10:34:02.936919 containerd[1593]: 2025-05-17 10:34:02.915 [INFO][4189] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 17 10:34:02.936919 containerd[1593]: 2025-05-17 10:34:02.915 [INFO][4189] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" HandleID="k8s-pod-network.4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" Workload="localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0" May 17 10:34:02.937080 containerd[1593]: 2025-05-17 10:34:02.918 [INFO][4172] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gqhvf" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"dec0eedc-8551-4c18-a24b-b1f66271e3a3", ResourceVersion:"777", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-gqhvf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid80119ca0f1", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:02.937163 containerd[1593]: 2025-05-17 10:34:02.918 [INFO][4172] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gqhvf" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0" May 17 10:34:02.937163 containerd[1593]: 2025-05-17 10:34:02.918 [INFO][4172] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid80119ca0f1 ContainerID="4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gqhvf" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0" May 17 10:34:02.937163 containerd[1593]: 2025-05-17 10:34:02.922 [INFO][4172] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gqhvf" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0" May 17 10:34:02.937250 containerd[1593]: 2025-05-17 10:34:02.922 [INFO][4172] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gqhvf" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"dec0eedc-8551-4c18-a24b-b1f66271e3a3", ResourceVersion:"777", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464", Pod:"coredns-7c65d6cfc9-gqhvf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid80119ca0f1", MAC:"8e:f1:e6:74:ca:f1", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:02.937250 containerd[1593]: 2025-05-17 10:34:02.931 [INFO][4172] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gqhvf" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gqhvf-eth0" May 17 10:34:02.966647 containerd[1593]: time="2025-05-17T10:34:02.966605625Z" level=info msg="connecting to shim 4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464" address="unix:///run/containerd/s/a1efe11cda763d42bae78b3b05ccb004a878a918ce67191546b5bbc16ac21ae2" namespace=k8s.io protocol=ttrpc version=3 May 17 10:34:02.995598 systemd[1]: Started cri-containerd-4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464.scope - libcontainer container 4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464. 
May 17 10:34:03.009267 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 17 10:34:03.039818 containerd[1593]: time="2025-05-17T10:34:03.039771037Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-gqhvf,Uid:dec0eedc-8551-4c18-a24b-b1f66271e3a3,Namespace:kube-system,Attempt:0,} returns sandbox id \"4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464\"" May 17 10:34:03.042450 containerd[1593]: time="2025-05-17T10:34:03.042385374Z" level=info msg="CreateContainer within sandbox \"4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 17 10:34:03.054410 containerd[1593]: time="2025-05-17T10:34:03.054338869Z" level=info msg="Container 99dc053746ba9c64d9b4e688ee2923af07497728cda0a801a41da96e1758e691: CDI devices from CRI Config.CDIDevices: []" May 17 10:34:03.057930 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3035632572.mount: Deactivated successfully. 
May 17 10:34:03.061057 containerd[1593]: time="2025-05-17T10:34:03.061013564Z" level=info msg="CreateContainer within sandbox \"4212889c95404bdc48bd7d3c72d6379e81158f57493c5d92c435c3e526a08464\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"99dc053746ba9c64d9b4e688ee2923af07497728cda0a801a41da96e1758e691\"" May 17 10:34:03.061584 containerd[1593]: time="2025-05-17T10:34:03.061535806Z" level=info msg="StartContainer for \"99dc053746ba9c64d9b4e688ee2923af07497728cda0a801a41da96e1758e691\"" May 17 10:34:03.062558 containerd[1593]: time="2025-05-17T10:34:03.062530417Z" level=info msg="connecting to shim 99dc053746ba9c64d9b4e688ee2923af07497728cda0a801a41da96e1758e691" address="unix:///run/containerd/s/a1efe11cda763d42bae78b3b05ccb004a878a918ce67191546b5bbc16ac21ae2" protocol=ttrpc version=3 May 17 10:34:03.087647 systemd[1]: Started cri-containerd-99dc053746ba9c64d9b4e688ee2923af07497728cda0a801a41da96e1758e691.scope - libcontainer container 99dc053746ba9c64d9b4e688ee2923af07497728cda0a801a41da96e1758e691. 
May 17 10:34:03.123559 containerd[1593]: time="2025-05-17T10:34:03.123521569Z" level=info msg="StartContainer for \"99dc053746ba9c64d9b4e688ee2923af07497728cda0a801a41da96e1758e691\" returns successfully" May 17 10:34:03.648276 systemd-networkd[1506]: vxlan.calico: Link UP May 17 10:34:03.648286 systemd-networkd[1506]: vxlan.calico: Gained carrier May 17 10:34:03.750923 containerd[1593]: time="2025-05-17T10:34:03.750871635Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9f7d64749-9p6j6,Uid:31caddae-b106-4cda-83c6-e178d2c2a47b,Namespace:calico-system,Attempt:0,}" May 17 10:34:03.848608 systemd-networkd[1506]: calic8ccd94864e: Link UP May 17 10:34:03.848832 systemd-networkd[1506]: calic8ccd94864e: Gained carrier May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.787 [INFO][4392] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0 calico-kube-controllers-9f7d64749- calico-system 31caddae-b106-4cda-83c6-e178d2c2a47b 785 0 2025-05-17 10:33:40 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:9f7d64749 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-9f7d64749-9p6j6 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic8ccd94864e [] [] }} ContainerID="037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" Namespace="calico-system" Pod="calico-kube-controllers-9f7d64749-9p6j6" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-" May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.788 [INFO][4392] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" Namespace="calico-system" Pod="calico-kube-controllers-9f7d64749-9p6j6" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0" May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.812 [INFO][4406] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" HandleID="k8s-pod-network.037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" Workload="localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0" May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.812 [INFO][4406] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" HandleID="k8s-pod-network.037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" Workload="localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000139b90), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-9f7d64749-9p6j6", "timestamp":"2025-05-17 10:34:03.812222264 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.812 [INFO][4406] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.812 [INFO][4406] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.812 [INFO][4406] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.817 [INFO][4406] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" host="localhost" May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.821 [INFO][4406] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.826 [INFO][4406] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.827 [INFO][4406] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.829 [INFO][4406] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.829 [INFO][4406] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" host="localhost" May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.834 [INFO][4406] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61 May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.837 [INFO][4406] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" host="localhost" May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.842 [INFO][4406] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" host="localhost" May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.842 [INFO][4406] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" host="localhost" May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.842 [INFO][4406] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 17 10:34:03.864149 containerd[1593]: 2025-05-17 10:34:03.842 [INFO][4406] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" HandleID="k8s-pod-network.037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" Workload="localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0" May 17 10:34:03.864723 containerd[1593]: 2025-05-17 10:34:03.846 [INFO][4392] cni-plugin/k8s.go 418: Populated endpoint ContainerID="037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" Namespace="calico-system" Pod="calico-kube-controllers-9f7d64749-9p6j6" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0", GenerateName:"calico-kube-controllers-9f7d64749-", Namespace:"calico-system", SelfLink:"", UID:"31caddae-b106-4cda-83c6-e178d2c2a47b", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"9f7d64749", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-9f7d64749-9p6j6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic8ccd94864e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:03.864723 containerd[1593]: 2025-05-17 10:34:03.846 [INFO][4392] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" Namespace="calico-system" Pod="calico-kube-controllers-9f7d64749-9p6j6" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0" May 17 10:34:03.864723 containerd[1593]: 2025-05-17 10:34:03.846 [INFO][4392] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic8ccd94864e ContainerID="037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" Namespace="calico-system" Pod="calico-kube-controllers-9f7d64749-9p6j6" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0" May 17 10:34:03.864723 containerd[1593]: 2025-05-17 10:34:03.848 [INFO][4392] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" Namespace="calico-system" Pod="calico-kube-controllers-9f7d64749-9p6j6" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0" May 17 10:34:03.864723 containerd[1593]: 2025-05-17 
10:34:03.848 [INFO][4392] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" Namespace="calico-system" Pod="calico-kube-controllers-9f7d64749-9p6j6" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0", GenerateName:"calico-kube-controllers-9f7d64749-", Namespace:"calico-system", SelfLink:"", UID:"31caddae-b106-4cda-83c6-e178d2c2a47b", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"9f7d64749", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61", Pod:"calico-kube-controllers-9f7d64749-9p6j6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic8ccd94864e", MAC:"a6:63:7b:15:a2:de", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:03.864723 containerd[1593]: 2025-05-17 
10:34:03.860 [INFO][4392] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" Namespace="calico-system" Pod="calico-kube-controllers-9f7d64749-9p6j6" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9f7d64749--9p6j6-eth0" May 17 10:34:03.875411 kubelet[2722]: I0517 10:34:03.875027 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-gqhvf" podStartSLOduration=33.875008854 podStartE2EDuration="33.875008854s" podCreationTimestamp="2025-05-17 10:33:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-17 10:34:03.874682339 +0000 UTC m=+39.211082939" watchObservedRunningTime="2025-05-17 10:34:03.875008854 +0000 UTC m=+39.211409444" May 17 10:34:03.894294 containerd[1593]: time="2025-05-17T10:34:03.894030973Z" level=info msg="connecting to shim 037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61" address="unix:///run/containerd/s/58df4d70793d9ff46c00e9699f080eac1a4d6b1f33784bd8ecf3eef6228176f3" namespace=k8s.io protocol=ttrpc version=3 May 17 10:34:03.922572 systemd[1]: Started cri-containerd-037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61.scope - libcontainer container 037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61. 
May 17 10:34:03.942501 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 17 10:34:03.976903 containerd[1593]: time="2025-05-17T10:34:03.976863439Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9f7d64749-9p6j6,Uid:31caddae-b106-4cda-83c6-e178d2c2a47b,Namespace:calico-system,Attempt:0,} returns sandbox id \"037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61\"" May 17 10:34:03.978376 containerd[1593]: time="2025-05-17T10:34:03.978349654Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 17 10:34:04.750997 containerd[1593]: time="2025-05-17T10:34:04.750941912Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58bc8d76c5-8mjx6,Uid:a7e72c27-70ce-4980-bd76-76af4f838036,Namespace:calico-apiserver,Attempt:0,}" May 17 10:34:04.750997 containerd[1593]: time="2025-05-17T10:34:04.750995583Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b4b84,Uid:c9cc419c-75dd-4426-acd6-306f9b76a4b6,Namespace:calico-system,Attempt:0,}" May 17 10:34:04.751457 containerd[1593]: time="2025-05-17T10:34:04.751304364Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58bc8d76c5-t7rpr,Uid:0ecfc371-19f9-4a1e-ad70-e6bcdd45e502,Namespace:calico-apiserver,Attempt:0,}" May 17 10:34:04.793421 systemd-networkd[1506]: calid80119ca0f1: Gained IPv6LL May 17 10:34:04.880929 systemd-networkd[1506]: cali87da3555bb3: Link UP May 17 10:34:04.881498 systemd-networkd[1506]: cali87da3555bb3: Gained carrier May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.802 [INFO][4508] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--b4b84-eth0 csi-node-driver- calico-system c9cc419c-75dd-4426-acd6-306f9b76a4b6 676 0 2025-05-17 10:33:40 +0000 UTC map[app.kubernetes.io/name:csi-node-driver 
controller-revision-hash:68bf44dd5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-b4b84 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali87da3555bb3 [] [] }} ContainerID="af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" Namespace="calico-system" Pod="csi-node-driver-b4b84" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4b84-" May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.802 [INFO][4508] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" Namespace="calico-system" Pod="csi-node-driver-b4b84" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4b84-eth0" May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.844 [INFO][4552] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" HandleID="k8s-pod-network.af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" Workload="localhost-k8s-csi--node--driver--b4b84-eth0" May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.844 [INFO][4552] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" HandleID="k8s-pod-network.af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" Workload="localhost-k8s-csi--node--driver--b4b84-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00046d1a0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-b4b84", "timestamp":"2025-05-17 10:34:04.84457725 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.844 [INFO][4552] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.844 [INFO][4552] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.844 [INFO][4552] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.854 [INFO][4552] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" host="localhost" May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.858 [INFO][4552] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.861 [INFO][4552] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.862 [INFO][4552] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.864 [INFO][4552] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.864 [INFO][4552] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" host="localhost" May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.866 [INFO][4552] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92 May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.869 [INFO][4552] 
ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" host="localhost" May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.875 [INFO][4552] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" host="localhost" May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.875 [INFO][4552] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" host="localhost" May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.875 [INFO][4552] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 17 10:34:04.894893 containerd[1593]: 2025-05-17 10:34:04.875 [INFO][4552] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" HandleID="k8s-pod-network.af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" Workload="localhost-k8s-csi--node--driver--b4b84-eth0" May 17 10:34:04.895494 containerd[1593]: 2025-05-17 10:34:04.878 [INFO][4508] cni-plugin/k8s.go 418: Populated endpoint ContainerID="af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" Namespace="calico-system" Pod="csi-node-driver-b4b84" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4b84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--b4b84-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c9cc419c-75dd-4426-acd6-306f9b76a4b6", ResourceVersion:"676", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 40, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"68bf44dd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-b4b84", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali87da3555bb3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:04.895494 containerd[1593]: 2025-05-17 10:34:04.878 [INFO][4508] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" Namespace="calico-system" Pod="csi-node-driver-b4b84" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4b84-eth0" May 17 10:34:04.895494 containerd[1593]: 2025-05-17 10:34:04.878 [INFO][4508] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali87da3555bb3 ContainerID="af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" Namespace="calico-system" Pod="csi-node-driver-b4b84" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4b84-eth0" May 17 10:34:04.895494 containerd[1593]: 2025-05-17 10:34:04.882 [INFO][4508] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" Namespace="calico-system" 
Pod="csi-node-driver-b4b84" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4b84-eth0" May 17 10:34:04.895494 containerd[1593]: 2025-05-17 10:34:04.882 [INFO][4508] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" Namespace="calico-system" Pod="csi-node-driver-b4b84" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4b84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--b4b84-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c9cc419c-75dd-4426-acd6-306f9b76a4b6", ResourceVersion:"676", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"68bf44dd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92", Pod:"csi-node-driver-b4b84", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali87da3555bb3", MAC:"6e:5f:e0:9a:0f:8f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), 
QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:04.895494 containerd[1593]: 2025-05-17 10:34:04.890 [INFO][4508] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" Namespace="calico-system" Pod="csi-node-driver-b4b84" WorkloadEndpoint="localhost-k8s-csi--node--driver--b4b84-eth0" May 17 10:34:04.918683 containerd[1593]: time="2025-05-17T10:34:04.918625101Z" level=info msg="connecting to shim af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92" address="unix:///run/containerd/s/0d7062d80e4f19481ef187dc54801b4906a22ff33936f631c1e3e706ccb2700c" namespace=k8s.io protocol=ttrpc version=3 May 17 10:34:04.943525 systemd[1]: Started cri-containerd-af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92.scope - libcontainer container af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92. May 17 10:34:04.956272 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 17 10:34:04.973727 containerd[1593]: time="2025-05-17T10:34:04.973683329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b4b84,Uid:c9cc419c-75dd-4426-acd6-306f9b76a4b6,Namespace:calico-system,Attempt:0,} returns sandbox id \"af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92\"" May 17 10:34:04.986298 systemd-networkd[1506]: calic8eb74824af: Link UP May 17 10:34:04.987080 systemd-networkd[1506]: calic8eb74824af: Gained carrier May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.806 [INFO][4530] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0 calico-apiserver-58bc8d76c5- calico-apiserver 0ecfc371-19f9-4a1e-ad70-e6bcdd45e502 786 0 2025-05-17 10:33:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver 
pod-template-hash:58bc8d76c5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-58bc8d76c5-t7rpr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic8eb74824af [] [] }} ContainerID="2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-t7rpr" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-" May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.806 [INFO][4530] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-t7rpr" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0" May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.847 [INFO][4558] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" HandleID="k8s-pod-network.2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" Workload="localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0" May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.847 [INFO][4558] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" HandleID="k8s-pod-network.2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" Workload="localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00042e640), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-58bc8d76c5-t7rpr", "timestamp":"2025-05-17 10:34:04.847717727 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.847 [INFO][4558] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.875 [INFO][4558] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.875 [INFO][4558] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.956 [INFO][4558] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" host="localhost" May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.962 [INFO][4558] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.967 [INFO][4558] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.969 [INFO][4558] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.971 [INFO][4558] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.971 [INFO][4558] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" host="localhost" May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.973 [INFO][4558] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a May 17 10:34:04.998628 containerd[1593]: 2025-05-17 
10:34:04.976 [INFO][4558] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" host="localhost" May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.981 [INFO][4558] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" host="localhost" May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.981 [INFO][4558] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" host="localhost" May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.981 [INFO][4558] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 17 10:34:04.998628 containerd[1593]: 2025-05-17 10:34:04.981 [INFO][4558] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" HandleID="k8s-pod-network.2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" Workload="localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0" May 17 10:34:04.999234 containerd[1593]: 2025-05-17 10:34:04.984 [INFO][4530] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-t7rpr" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0", GenerateName:"calico-apiserver-58bc8d76c5-", Namespace:"calico-apiserver", SelfLink:"", UID:"0ecfc371-19f9-4a1e-ad70-e6bcdd45e502", ResourceVersion:"786", Generation:0, 
CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58bc8d76c5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-58bc8d76c5-t7rpr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic8eb74824af", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:04.999234 containerd[1593]: 2025-05-17 10:34:04.984 [INFO][4530] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-t7rpr" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0" May 17 10:34:04.999234 containerd[1593]: 2025-05-17 10:34:04.984 [INFO][4530] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic8eb74824af ContainerID="2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-t7rpr" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0" May 17 10:34:04.999234 containerd[1593]: 2025-05-17 10:34:04.987 [INFO][4530] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-t7rpr" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0" May 17 10:34:04.999234 containerd[1593]: 2025-05-17 10:34:04.987 [INFO][4530] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-t7rpr" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0", GenerateName:"calico-apiserver-58bc8d76c5-", Namespace:"calico-apiserver", SelfLink:"", UID:"0ecfc371-19f9-4a1e-ad70-e6bcdd45e502", ResourceVersion:"786", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58bc8d76c5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a", Pod:"calico-apiserver-58bc8d76c5-t7rpr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", 
"ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic8eb74824af", MAC:"96:54:57:5d:07:2a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:04.999234 containerd[1593]: 2025-05-17 10:34:04.995 [INFO][4530] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-t7rpr" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--t7rpr-eth0" May 17 10:34:05.020588 containerd[1593]: time="2025-05-17T10:34:05.020507104Z" level=info msg="connecting to shim 2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a" address="unix:///run/containerd/s/05dcccb5a1a6b1b15705fbe42c82d0d2b9624d41f652ffc5bf974d458e0385e7" namespace=k8s.io protocol=ttrpc version=3 May 17 10:34:05.052565 systemd[1]: Started cri-containerd-2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a.scope - libcontainer container 2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a. 
May 17 10:34:05.068150 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 17 10:34:05.094338 systemd-networkd[1506]: cali8e54d5fc08d: Link UP May 17 10:34:05.094872 systemd-networkd[1506]: cali8e54d5fc08d: Gained carrier May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:04.819 [INFO][4519] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0 calico-apiserver-58bc8d76c5- calico-apiserver a7e72c27-70ce-4980-bd76-76af4f838036 788 0 2025-05-17 10:33:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:58bc8d76c5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-58bc8d76c5-8mjx6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8e54d5fc08d [] [] }} ContainerID="8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-8mjx6" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-" May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:04.820 [INFO][4519] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-8mjx6" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0" May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:04.852 [INFO][4568] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" HandleID="k8s-pod-network.8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" 
Workload="localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0" May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:04.852 [INFO][4568] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" HandleID="k8s-pod-network.8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" Workload="localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00023d0c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-58bc8d76c5-8mjx6", "timestamp":"2025-05-17 10:34:04.852019026 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:04.852 [INFO][4568] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:04.981 [INFO][4568] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:04.982 [INFO][4568] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:05.055 [INFO][4568] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" host="localhost" May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:05.064 [INFO][4568] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:05.070 [INFO][4568] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:05.071 [INFO][4568] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:05.073 [INFO][4568] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:05.073 [INFO][4568] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" host="localhost" May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:05.075 [INFO][4568] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915 May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:05.079 [INFO][4568] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" host="localhost" May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:05.086 [INFO][4568] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" host="localhost" May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:05.086 [INFO][4568] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" host="localhost" May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:05.086 [INFO][4568] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 17 10:34:05.117502 containerd[1593]: 2025-05-17 10:34:05.086 [INFO][4568] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" HandleID="k8s-pod-network.8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" Workload="localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0" May 17 10:34:05.118058 containerd[1593]: 2025-05-17 10:34:05.092 [INFO][4519] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-8mjx6" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0", GenerateName:"calico-apiserver-58bc8d76c5-", Namespace:"calico-apiserver", SelfLink:"", UID:"a7e72c27-70ce-4980-bd76-76af4f838036", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58bc8d76c5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-58bc8d76c5-8mjx6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8e54d5fc08d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:05.118058 containerd[1593]: 2025-05-17 10:34:05.092 [INFO][4519] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-8mjx6" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0" May 17 10:34:05.118058 containerd[1593]: 2025-05-17 10:34:05.092 [INFO][4519] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8e54d5fc08d ContainerID="8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-8mjx6" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0" May 17 10:34:05.118058 containerd[1593]: 2025-05-17 10:34:05.094 [INFO][4519] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-8mjx6" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0" May 17 10:34:05.118058 containerd[1593]: 2025-05-17 10:34:05.094 [INFO][4519] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-8mjx6" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0", GenerateName:"calico-apiserver-58bc8d76c5-", Namespace:"calico-apiserver", SelfLink:"", UID:"a7e72c27-70ce-4980-bd76-76af4f838036", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58bc8d76c5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915", Pod:"calico-apiserver-58bc8d76c5-8mjx6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8e54d5fc08d", MAC:"1a:e3:f3:cf:f5:9f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:05.118058 containerd[1593]: 2025-05-17 10:34:05.106 [INFO][4519] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" Namespace="calico-apiserver" Pod="calico-apiserver-58bc8d76c5-8mjx6" WorkloadEndpoint="localhost-k8s-calico--apiserver--58bc8d76c5--8mjx6-eth0" May 17 10:34:05.120013 containerd[1593]: time="2025-05-17T10:34:05.119968573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58bc8d76c5-t7rpr,Uid:0ecfc371-19f9-4a1e-ad70-e6bcdd45e502,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a\"" May 17 10:34:05.151160 containerd[1593]: time="2025-05-17T10:34:05.151109612Z" level=info msg="connecting to shim 8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915" address="unix:///run/containerd/s/b87c840817f1f468e9a2118459bef2d0026eb99c3dd22e50e6c345e2aaabf18e" namespace=k8s.io protocol=ttrpc version=3 May 17 10:34:05.175714 systemd[1]: Started cri-containerd-8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915.scope - libcontainer container 8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915. 
May 17 10:34:05.188622 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 17 10:34:05.218506 containerd[1593]: time="2025-05-17T10:34:05.218476059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58bc8d76c5-8mjx6,Uid:a7e72c27-70ce-4980-bd76-76af4f838036,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915\"" May 17 10:34:05.237564 systemd-networkd[1506]: vxlan.calico: Gained IPv6LL May 17 10:34:05.493543 systemd-networkd[1506]: calic8ccd94864e: Gained IPv6LL May 17 10:34:05.750766 containerd[1593]: time="2025-05-17T10:34:05.750516538Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8tc7w,Uid:7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae,Namespace:kube-system,Attempt:0,}" May 17 10:34:05.751535 containerd[1593]: time="2025-05-17T10:34:05.751492833Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-b6dz4,Uid:065bd371-4a25-4865-bbfe-0d2b88d6ea40,Namespace:calico-system,Attempt:0,}" May 17 10:34:05.869415 systemd-networkd[1506]: cali6fd47652d31: Link UP May 17 10:34:05.870093 systemd-networkd[1506]: cali6fd47652d31: Gained carrier May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.800 [INFO][4761] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0 goldmane-8f77d7b6c- calico-system 065bd371-4a25-4865-bbfe-0d2b88d6ea40 787 0 2025-05-17 10:33:39 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:8f77d7b6c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-8f77d7b6c-b6dz4 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali6fd47652d31 [] [] }} 
ContainerID="1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" Namespace="calico-system" Pod="goldmane-8f77d7b6c-b6dz4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--b6dz4-" May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.800 [INFO][4761] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" Namespace="calico-system" Pod="goldmane-8f77d7b6c-b6dz4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0" May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.830 [INFO][4784] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" HandleID="k8s-pod-network.1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" Workload="localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0" May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.830 [INFO][4784] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" HandleID="k8s-pod-network.1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" Workload="localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000495860), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-8f77d7b6c-b6dz4", "timestamp":"2025-05-17 10:34:05.830494289 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.830 [INFO][4784] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.830 [INFO][4784] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.831 [INFO][4784] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.839 [INFO][4784] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" host="localhost" May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.845 [INFO][4784] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.849 [INFO][4784] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.851 [INFO][4784] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.853 [INFO][4784] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.853 [INFO][4784] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" host="localhost" May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.854 [INFO][4784] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6 May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.858 [INFO][4784] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" host="localhost" May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.863 [INFO][4784] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" host="localhost" May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.863 [INFO][4784] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" host="localhost" May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.864 [INFO][4784] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 17 10:34:05.890698 containerd[1593]: 2025-05-17 10:34:05.864 [INFO][4784] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" HandleID="k8s-pod-network.1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" Workload="localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0" May 17 10:34:05.891304 containerd[1593]: 2025-05-17 10:34:05.867 [INFO][4761] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" Namespace="calico-system" Pod="goldmane-8f77d7b6c-b6dz4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0", GenerateName:"goldmane-8f77d7b6c-", Namespace:"calico-system", SelfLink:"", UID:"065bd371-4a25-4865-bbfe-0d2b88d6ea40", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"8f77d7b6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-8f77d7b6c-b6dz4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali6fd47652d31", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:05.891304 containerd[1593]: 2025-05-17 10:34:05.867 [INFO][4761] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" Namespace="calico-system" Pod="goldmane-8f77d7b6c-b6dz4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0" May 17 10:34:05.891304 containerd[1593]: 2025-05-17 10:34:05.867 [INFO][4761] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6fd47652d31 ContainerID="1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" Namespace="calico-system" Pod="goldmane-8f77d7b6c-b6dz4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0" May 17 10:34:05.891304 containerd[1593]: 2025-05-17 10:34:05.869 [INFO][4761] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" Namespace="calico-system" Pod="goldmane-8f77d7b6c-b6dz4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0" May 17 10:34:05.891304 containerd[1593]: 2025-05-17 10:34:05.869 [INFO][4761] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" Namespace="calico-system" Pod="goldmane-8f77d7b6c-b6dz4" 
WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0", GenerateName:"goldmane-8f77d7b6c-", Namespace:"calico-system", SelfLink:"", UID:"065bd371-4a25-4865-bbfe-0d2b88d6ea40", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"8f77d7b6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6", Pod:"goldmane-8f77d7b6c-b6dz4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali6fd47652d31", MAC:"12:44:6b:85:4c:1c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:05.891304 containerd[1593]: 2025-05-17 10:34:05.884 [INFO][4761] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" Namespace="calico-system" Pod="goldmane-8f77d7b6c-b6dz4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--b6dz4-eth0" May 17 10:34:05.922784 containerd[1593]: time="2025-05-17T10:34:05.922685086Z" level=info msg="connecting to shim 
1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6" address="unix:///run/containerd/s/48c34c9e3c6a2336853ed2512b4db1541639c306dc0907155017dd5f2c248866" namespace=k8s.io protocol=ttrpc version=3 May 17 10:34:05.950526 systemd[1]: Started cri-containerd-1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6.scope - libcontainer container 1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6. May 17 10:34:05.967079 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 17 10:34:05.986300 systemd-networkd[1506]: caliafecb506efb: Link UP May 17 10:34:05.987144 systemd-networkd[1506]: caliafecb506efb: Gained carrier May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.800 [INFO][4749] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0 coredns-7c65d6cfc9- kube-system 7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae 784 0 2025-05-17 10:33:30 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-8tc7w eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliafecb506efb [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8tc7w" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8tc7w-" May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.801 [INFO][4749] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8tc7w" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0" May 17 10:34:06.009839 containerd[1593]: 
2025-05-17 10:34:05.831 [INFO][4785] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" HandleID="k8s-pod-network.b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" Workload="localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0" May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.831 [INFO][4785] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" HandleID="k8s-pod-network.b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" Workload="localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ad520), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-8tc7w", "timestamp":"2025-05-17 10:34:05.831049123 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.831 [INFO][4785] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.864 [INFO][4785] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.864 [INFO][4785] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.940 [INFO][4785] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" host="localhost" May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.948 [INFO][4785] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.953 [INFO][4785] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.955 [INFO][4785] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.958 [INFO][4785] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.958 [INFO][4785] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" host="localhost" May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.960 [INFO][4785] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9 May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.965 [INFO][4785] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" host="localhost" May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.976 [INFO][4785] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" host="localhost" May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.976 [INFO][4785] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" host="localhost" May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.976 [INFO][4785] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 17 10:34:06.009839 containerd[1593]: 2025-05-17 10:34:05.976 [INFO][4785] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" HandleID="k8s-pod-network.b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" Workload="localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0" May 17 10:34:06.010699 containerd[1593]: 2025-05-17 10:34:05.982 [INFO][4749] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8tc7w" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-8tc7w", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliafecb506efb", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:06.010699 containerd[1593]: 2025-05-17 10:34:05.982 [INFO][4749] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8tc7w" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0" May 17 10:34:06.010699 containerd[1593]: 2025-05-17 10:34:05.982 [INFO][4749] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliafecb506efb ContainerID="b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8tc7w" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0" May 17 10:34:06.010699 containerd[1593]: 2025-05-17 10:34:05.987 [INFO][4749] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8tc7w" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0" May 17 10:34:06.010699 containerd[1593]: 2025-05-17 10:34:05.987 [INFO][4749] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8tc7w" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.May, 17, 10, 33, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9", Pod:"coredns-7c65d6cfc9-8tc7w", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliafecb506efb", MAC:"ea:32:42:75:a0:eb", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 17 10:34:06.010699 containerd[1593]: 2025-05-17 10:34:06.001 [INFO][4749] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8tc7w" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--8tc7w-eth0" May 17 10:34:06.051088 containerd[1593]: time="2025-05-17T10:34:06.050872710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-b6dz4,Uid:065bd371-4a25-4865-bbfe-0d2b88d6ea40,Namespace:calico-system,Attempt:0,} returns sandbox id \"1c0237438f69520371e6cd12e2e2fec907b56cb12416c0a9d893bf7668c5eef6\"" May 17 10:34:06.078618 containerd[1593]: time="2025-05-17T10:34:06.078570978Z" level=info msg="connecting to shim b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9" address="unix:///run/containerd/s/7d1522f8390bc5020aafe2591757cfa176f735ae7a75d2cb54be4af1b8b938c0" namespace=k8s.io protocol=ttrpc version=3 May 17 10:34:06.105530 systemd[1]: Started cri-containerd-b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9.scope - libcontainer container b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9. 
May 17 10:34:06.119098 systemd-resolved[1420]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 17 10:34:06.133541 systemd-networkd[1506]: calic8eb74824af: Gained IPv6LL May 17 10:34:06.148709 containerd[1593]: time="2025-05-17T10:34:06.148669465Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8tc7w,Uid:7dd6efa3-b7d1-4b38-8155-4bdfa4d50cae,Namespace:kube-system,Attempt:0,} returns sandbox id \"b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9\"" May 17 10:34:06.151889 containerd[1593]: time="2025-05-17T10:34:06.151860254Z" level=info msg="CreateContainer within sandbox \"b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 17 10:34:06.168747 containerd[1593]: time="2025-05-17T10:34:06.168713361Z" level=info msg="Container 4d1a980aa3490323b1dd8cafe06d8db48d8103a5c9168bfaaa80c0a2cf5c919a: CDI devices from CRI Config.CDIDevices: []" May 17 10:34:06.175340 containerd[1593]: time="2025-05-17T10:34:06.175302109Z" level=info msg="CreateContainer within sandbox \"b172d99e603f2185a84e3e548076320ef0580f0cf6ad6de1d0a0d1784cfa78d9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4d1a980aa3490323b1dd8cafe06d8db48d8103a5c9168bfaaa80c0a2cf5c919a\"" May 17 10:34:06.176182 containerd[1593]: time="2025-05-17T10:34:06.175871630Z" level=info msg="StartContainer for \"4d1a980aa3490323b1dd8cafe06d8db48d8103a5c9168bfaaa80c0a2cf5c919a\"" May 17 10:34:06.176970 containerd[1593]: time="2025-05-17T10:34:06.176936873Z" level=info msg="connecting to shim 4d1a980aa3490323b1dd8cafe06d8db48d8103a5c9168bfaaa80c0a2cf5c919a" address="unix:///run/containerd/s/7d1522f8390bc5020aafe2591757cfa176f735ae7a75d2cb54be4af1b8b938c0" protocol=ttrpc version=3 May 17 10:34:06.197526 systemd-networkd[1506]: cali8e54d5fc08d: Gained IPv6LL May 17 10:34:06.202590 systemd[1]: Started 
cri-containerd-4d1a980aa3490323b1dd8cafe06d8db48d8103a5c9168bfaaa80c0a2cf5c919a.scope - libcontainer container 4d1a980aa3490323b1dd8cafe06d8db48d8103a5c9168bfaaa80c0a2cf5c919a.
May 17 10:34:06.243061 containerd[1593]: time="2025-05-17T10:34:06.243017836Z" level=info msg="StartContainer for \"4d1a980aa3490323b1dd8cafe06d8db48d8103a5c9168bfaaa80c0a2cf5c919a\" returns successfully"
May 17 10:34:06.422505 containerd[1593]: time="2025-05-17T10:34:06.422371622Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:06.423252 containerd[1593]: time="2025-05-17T10:34:06.423198606Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=51178512"
May 17 10:34:06.424448 containerd[1593]: time="2025-05-17T10:34:06.424415544Z" level=info msg="ImageCreate event name:\"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:06.426357 containerd[1593]: time="2025-05-17T10:34:06.426307601Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:06.426758 containerd[1593]: time="2025-05-17T10:34:06.426730457Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"52671183\" in 2.448350726s"
May 17 10:34:06.426793 containerd[1593]: time="2025-05-17T10:34:06.426758279Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\""
May 17 10:34:06.427607 containerd[1593]: time="2025-05-17T10:34:06.427583089Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\""
May 17 10:34:06.434372 containerd[1593]: time="2025-05-17T10:34:06.434335105Z" level=info msg="CreateContainer within sandbox \"037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
May 17 10:34:06.442320 containerd[1593]: time="2025-05-17T10:34:06.442281555Z" level=info msg="Container 5daa495876a1152e843814b68915f3cc9bafc5c0c3b022afcadf3b2b51b48c4c: CDI devices from CRI Config.CDIDevices: []"
May 17 10:34:06.449757 containerd[1593]: time="2025-05-17T10:34:06.449716604Z" level=info msg="CreateContainer within sandbox \"037ef0cd594d1f3375d8f514f60f00e5ffb37fe910c855413e41b06b8008dd61\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"5daa495876a1152e843814b68915f3cc9bafc5c0c3b022afcadf3b2b51b48c4c\""
May 17 10:34:06.450410 containerd[1593]: time="2025-05-17T10:34:06.450078375Z" level=info msg="StartContainer for \"5daa495876a1152e843814b68915f3cc9bafc5c0c3b022afcadf3b2b51b48c4c\""
May 17 10:34:06.451036 containerd[1593]: time="2025-05-17T10:34:06.450981563Z" level=info msg="connecting to shim 5daa495876a1152e843814b68915f3cc9bafc5c0c3b022afcadf3b2b51b48c4c" address="unix:///run/containerd/s/58df4d70793d9ff46c00e9699f080eac1a4d6b1f33784bd8ecf3eef6228176f3" protocol=ttrpc version=3
May 17 10:34:06.476743 systemd[1]: Started cri-containerd-5daa495876a1152e843814b68915f3cc9bafc5c0c3b022afcadf3b2b51b48c4c.scope - libcontainer container 5daa495876a1152e843814b68915f3cc9bafc5c0c3b022afcadf3b2b51b48c4c.
May 17 10:34:06.572730 containerd[1593]: time="2025-05-17T10:34:06.572680207Z" level=info msg="StartContainer for \"5daa495876a1152e843814b68915f3cc9bafc5c0c3b022afcadf3b2b51b48c4c\" returns successfully"
May 17 10:34:06.626792 systemd[1]: Started sshd@8-10.0.0.118:22-10.0.0.1:42216.service - OpenSSH per-connection server daemon (10.0.0.1:42216).
May 17 10:34:06.688017 sshd[4993]: Accepted publickey for core from 10.0.0.1 port 42216 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE
May 17 10:34:06.689722 sshd-session[4993]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 17 10:34:06.694539 systemd-logind[1585]: New session 9 of user core.
May 17 10:34:06.704821 systemd[1]: Started session-9.scope - Session 9 of User core.
May 17 10:34:06.873120 sshd[4996]: Connection closed by 10.0.0.1 port 42216
May 17 10:34:06.873402 sshd-session[4993]: pam_unix(sshd:session): session closed for user core
May 17 10:34:06.876319 systemd[1]: sshd@8-10.0.0.118:22-10.0.0.1:42216.service: Deactivated successfully.
May 17 10:34:06.878380 systemd[1]: session-9.scope: Deactivated successfully.
May 17 10:34:06.880826 systemd-logind[1585]: Session 9 logged out. Waiting for processes to exit.
May 17 10:34:06.881748 systemd-logind[1585]: Removed session 9.
May 17 10:34:06.900963 kubelet[2722]: I0517 10:34:06.900894 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-9f7d64749-9p6j6" podStartSLOduration=24.451530804 podStartE2EDuration="26.900877052s" podCreationTimestamp="2025-05-17 10:33:40 +0000 UTC" firstStartedPulling="2025-05-17 10:34:03.978126785 +0000 UTC m=+39.314527365" lastFinishedPulling="2025-05-17 10:34:06.427473023 +0000 UTC m=+41.763873613" observedRunningTime="2025-05-17 10:34:06.899225838 +0000 UTC m=+42.235626428" watchObservedRunningTime="2025-05-17 10:34:06.900877052 +0000 UTC m=+42.237277642"
May 17 10:34:06.902521 systemd-networkd[1506]: cali87da3555bb3: Gained IPv6LL
May 17 10:34:06.913748 kubelet[2722]: I0517 10:34:06.913673 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-8tc7w" podStartSLOduration=36.913655047 podStartE2EDuration="36.913655047s" podCreationTimestamp="2025-05-17 10:33:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-17 10:34:06.912634769 +0000 UTC m=+42.249035359" watchObservedRunningTime="2025-05-17 10:34:06.913655047 +0000 UTC m=+42.250055637"
May 17 10:34:07.093577 systemd-networkd[1506]: caliafecb506efb: Gained IPv6LL
May 17 10:34:07.891709 kubelet[2722]: I0517 10:34:07.891661 2722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 17 10:34:07.926901 systemd-networkd[1506]: cali6fd47652d31: Gained IPv6LL
May 17 10:34:08.023912 containerd[1593]: time="2025-05-17T10:34:08.023863293Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:08.024594 containerd[1593]: time="2025-05-17T10:34:08.024572376Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8758390"
May 17 10:34:08.025740 containerd[1593]: time="2025-05-17T10:34:08.025688163Z" level=info msg="ImageCreate event name:\"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:08.027407 containerd[1593]: time="2025-05-17T10:34:08.027371357Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:08.028303 containerd[1593]: time="2025-05-17T10:34:08.027885493Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"10251093\" in 1.600280112s"
May 17 10:34:08.028303 containerd[1593]: time="2025-05-17T10:34:08.027910701Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\""
May 17 10:34:08.028742 containerd[1593]: time="2025-05-17T10:34:08.028719611Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\""
May 17 10:34:08.030502 containerd[1593]: time="2025-05-17T10:34:08.030481703Z" level=info msg="CreateContainer within sandbox \"af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
May 17 10:34:08.041054 containerd[1593]: time="2025-05-17T10:34:08.041009372Z" level=info msg="Container 5e634848789d7fd29bcfea0f91a0baa5e2baf23e14d025860dde40f2fcf0beb7: CDI devices from CRI Config.CDIDevices: []"
May 17 10:34:08.057664 containerd[1593]: time="2025-05-17T10:34:08.057626308Z" level=info msg="CreateContainer within sandbox \"af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"5e634848789d7fd29bcfea0f91a0baa5e2baf23e14d025860dde40f2fcf0beb7\""
May 17 10:34:08.058082 containerd[1593]: time="2025-05-17T10:34:08.058057818Z" level=info msg="StartContainer for \"5e634848789d7fd29bcfea0f91a0baa5e2baf23e14d025860dde40f2fcf0beb7\""
May 17 10:34:08.059357 containerd[1593]: time="2025-05-17T10:34:08.059331492Z" level=info msg="connecting to shim 5e634848789d7fd29bcfea0f91a0baa5e2baf23e14d025860dde40f2fcf0beb7" address="unix:///run/containerd/s/0d7062d80e4f19481ef187dc54801b4906a22ff33936f631c1e3e706ccb2700c" protocol=ttrpc version=3
May 17 10:34:08.082623 systemd[1]: Started cri-containerd-5e634848789d7fd29bcfea0f91a0baa5e2baf23e14d025860dde40f2fcf0beb7.scope - libcontainer container 5e634848789d7fd29bcfea0f91a0baa5e2baf23e14d025860dde40f2fcf0beb7.
May 17 10:34:08.121075 containerd[1593]: time="2025-05-17T10:34:08.121040619Z" level=info msg="StartContainer for \"5e634848789d7fd29bcfea0f91a0baa5e2baf23e14d025860dde40f2fcf0beb7\" returns successfully"
May 17 10:34:09.176804 kubelet[2722]: I0517 10:34:09.176742 2722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 17 10:34:09.300901 containerd[1593]: time="2025-05-17T10:34:09.300789439Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ade0c9b7468e87fe2ae253d2c5bc98c41ef41381dc5e4572b64defa8e3f8730d\" id:\"facf250112371e4d730e1ba08a4ce955ac55058986e875d90cede07d69092895\" pid:5064 exited_at:{seconds:1747478049 nanos:300170756}"
May 17 10:34:09.412842 containerd[1593]: time="2025-05-17T10:34:09.412786328Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ade0c9b7468e87fe2ae253d2c5bc98c41ef41381dc5e4572b64defa8e3f8730d\" id:\"bd1e09b18e949bb74b41077ec8e1cef8d4758623e981dd6ece219941edf97ed1\" pid:5088 exited_at:{seconds:1747478049 nanos:412411614}"
May 17 10:34:10.284418 containerd[1593]: time="2025-05-17T10:34:10.284340870Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:10.285066 containerd[1593]: time="2025-05-17T10:34:10.285018403Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=47252431"
May 17 10:34:10.286131 containerd[1593]: time="2025-05-17T10:34:10.286097210Z" level=info msg="ImageCreate event name:\"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:10.288350 containerd[1593]: time="2025-05-17T10:34:10.288316290Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:10.288924 containerd[1593]: time="2025-05-17T10:34:10.288897302Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 2.260151161s"
May 17 10:34:10.288974 containerd[1593]: time="2025-05-17T10:34:10.288923311Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\""
May 17 10:34:10.290108 containerd[1593]: time="2025-05-17T10:34:10.290043256Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\""
May 17 10:34:10.291003 containerd[1593]: time="2025-05-17T10:34:10.290969627Z" level=info msg="CreateContainer within sandbox \"2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 17 10:34:10.301143 containerd[1593]: time="2025-05-17T10:34:10.301091519Z" level=info msg="Container 53922cd8e2fff8de02771534d57be8e526ecdf827296e002b625e60e6d36ffac: CDI devices from CRI Config.CDIDevices: []"
May 17 10:34:10.309510 containerd[1593]: time="2025-05-17T10:34:10.309467040Z" level=info msg="CreateContainer within sandbox \"2a88ec8c4167f5a1750c8aede3e3e68ccadad7bed857dd6b5e1eb4c3efc1a17a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"53922cd8e2fff8de02771534d57be8e526ecdf827296e002b625e60e6d36ffac\""
May 17 10:34:10.310099 containerd[1593]: time="2025-05-17T10:34:10.310066205Z" level=info msg="StartContainer for \"53922cd8e2fff8de02771534d57be8e526ecdf827296e002b625e60e6d36ffac\""
May 17 10:34:10.311133 containerd[1593]: time="2025-05-17T10:34:10.311112702Z" level=info msg="connecting to shim 53922cd8e2fff8de02771534d57be8e526ecdf827296e002b625e60e6d36ffac" address="unix:///run/containerd/s/05dcccb5a1a6b1b15705fbe42c82d0d2b9624d41f652ffc5bf974d458e0385e7" protocol=ttrpc version=3
May 17 10:34:10.373542 systemd[1]: Started cri-containerd-53922cd8e2fff8de02771534d57be8e526ecdf827296e002b625e60e6d36ffac.scope - libcontainer container 53922cd8e2fff8de02771534d57be8e526ecdf827296e002b625e60e6d36ffac.
May 17 10:34:10.486436 containerd[1593]: time="2025-05-17T10:34:10.485571056Z" level=info msg="StartContainer for \"53922cd8e2fff8de02771534d57be8e526ecdf827296e002b625e60e6d36ffac\" returns successfully"
May 17 10:34:10.749670 containerd[1593]: time="2025-05-17T10:34:10.749604718Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:10.751453 containerd[1593]: time="2025-05-17T10:34:10.750430660Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77"
May 17 10:34:10.752966 containerd[1593]: time="2025-05-17T10:34:10.752926732Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 462.796903ms"
May 17 10:34:10.753040 containerd[1593]: time="2025-05-17T10:34:10.752971346Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\""
May 17 10:34:10.754574 containerd[1593]: time="2025-05-17T10:34:10.754514996Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\""
May 17 10:34:10.756989 containerd[1593]: time="2025-05-17T10:34:10.756769734Z" level=info msg="CreateContainer within sandbox \"8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 17 10:34:10.769418 containerd[1593]: time="2025-05-17T10:34:10.767583456Z" level=info msg="Container e1f94ee9767a12c3136007549b1ebb503968c646fe596d9c1eecf271fbfb7751: CDI devices from CRI Config.CDIDevices: []"
May 17 10:34:10.778570 containerd[1593]: time="2025-05-17T10:34:10.778516162Z" level=info msg="CreateContainer within sandbox \"8bc6e7159e7fafe36ade03618a718e8f2db00cfc556755127f46c3f4dcb4a915\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e1f94ee9767a12c3136007549b1ebb503968c646fe596d9c1eecf271fbfb7751\""
May 17 10:34:10.779148 containerd[1593]: time="2025-05-17T10:34:10.779114727Z" level=info msg="StartContainer for \"e1f94ee9767a12c3136007549b1ebb503968c646fe596d9c1eecf271fbfb7751\""
May 17 10:34:10.780280 containerd[1593]: time="2025-05-17T10:34:10.780068239Z" level=info msg="connecting to shim e1f94ee9767a12c3136007549b1ebb503968c646fe596d9c1eecf271fbfb7751" address="unix:///run/containerd/s/b87c840817f1f468e9a2118459bef2d0026eb99c3dd22e50e6c345e2aaabf18e" protocol=ttrpc version=3
May 17 10:34:10.812656 systemd[1]: Started cri-containerd-e1f94ee9767a12c3136007549b1ebb503968c646fe596d9c1eecf271fbfb7751.scope - libcontainer container e1f94ee9767a12c3136007549b1ebb503968c646fe596d9c1eecf271fbfb7751.
May 17 10:34:10.877454 containerd[1593]: time="2025-05-17T10:34:10.877380139Z" level=info msg="StartContainer for \"e1f94ee9767a12c3136007549b1ebb503968c646fe596d9c1eecf271fbfb7751\" returns successfully"
May 17 10:34:10.922293 kubelet[2722]: I0517 10:34:10.922215 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-58bc8d76c5-t7rpr" podStartSLOduration=28.754005923 podStartE2EDuration="33.922197175s" podCreationTimestamp="2025-05-17 10:33:37 +0000 UTC" firstStartedPulling="2025-05-17 10:34:05.121626551 +0000 UTC m=+40.458027141" lastFinishedPulling="2025-05-17 10:34:10.289817803 +0000 UTC m=+45.626218393" observedRunningTime="2025-05-17 10:34:10.920604792 +0000 UTC m=+46.257005382" watchObservedRunningTime="2025-05-17 10:34:10.922197175 +0000 UTC m=+46.258597765"
May 17 10:34:11.014982 containerd[1593]: time="2025-05-17T10:34:11.014861668Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 17 10:34:11.016595 containerd[1593]: time="2025-05-17T10:34:11.016564297Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden"
May 17 10:34:11.016687 containerd[1593]: time="2025-05-17T10:34:11.016644778Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86"
May 17 10:34:11.016940 kubelet[2722]: E0517 10:34:11.016864 2722 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 17 10:34:11.017005 kubelet[2722]: E0517 10:34:11.016956 2722 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 17 10:34:11.018368 kubelet[2722]: E0517 10:34:11.018276 2722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t8p5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-b6dz4_calico-system(065bd371-4a25-4865-bbfe-0d2b88d6ea40): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 17 10:34:11.018748 containerd[1593]: time="2025-05-17T10:34:11.018700362Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\""
May 17 10:34:11.019537 kubelet[2722]: E0517 10:34:11.019477 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-b6dz4" podUID="065bd371-4a25-4865-bbfe-0d2b88d6ea40"
May 17 10:34:11.891027 systemd[1]: Started sshd@9-10.0.0.118:22-10.0.0.1:42218.service - OpenSSH per-connection server daemon (10.0.0.1:42218).
May 17 10:34:11.909293 kubelet[2722]: I0517 10:34:11.909260 2722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 17 10:34:11.909801 kubelet[2722]: I0517 10:34:11.909261 2722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 17 10:34:11.910140 kubelet[2722]: E0517 10:34:11.910109 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-b6dz4" podUID="065bd371-4a25-4865-bbfe-0d2b88d6ea40"
May 17 10:34:11.919953 kubelet[2722]: I0517 10:34:11.919798 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-58bc8d76c5-8mjx6" podStartSLOduration=29.385357446 podStartE2EDuration="34.919754626s" podCreationTimestamp="2025-05-17 10:33:37 +0000 UTC" firstStartedPulling="2025-05-17 10:34:05.219478383 +0000 UTC m=+40.555878964" lastFinishedPulling="2025-05-17 10:34:10.753875424 +0000 UTC m=+46.090276144" observedRunningTime="2025-05-17 10:34:10.935790881 +0000 UTC m=+46.272191471" watchObservedRunningTime="2025-05-17 10:34:11.919754626 +0000 UTC m=+47.256155216"
May 17 10:34:11.954298 sshd[5188]: Accepted publickey for core from 10.0.0.1 port 42218 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE
May 17 10:34:11.955926 sshd-session[5188]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 17 10:34:11.963445 systemd-logind[1585]: New session 10 of user core.
May 17 10:34:11.970518 systemd[1]: Started session-10.scope - Session 10 of User core.
May 17 10:34:12.102136 sshd[5190]: Connection closed by 10.0.0.1 port 42218
May 17 10:34:12.102494 sshd-session[5188]: pam_unix(sshd:session): session closed for user core
May 17 10:34:12.111223 systemd[1]: sshd@9-10.0.0.118:22-10.0.0.1:42218.service: Deactivated successfully.
May 17 10:34:12.113516 systemd[1]: session-10.scope: Deactivated successfully.
May 17 10:34:12.114443 systemd-logind[1585]: Session 10 logged out. Waiting for processes to exit.
May 17 10:34:12.118794 systemd[1]: Started sshd@10-10.0.0.118:22-10.0.0.1:42230.service - OpenSSH per-connection server daemon (10.0.0.1:42230).
May 17 10:34:12.119413 systemd-logind[1585]: Removed session 10.
May 17 10:34:12.170587 sshd[5207]: Accepted publickey for core from 10.0.0.1 port 42230 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE
May 17 10:34:12.172039 sshd-session[5207]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 17 10:34:12.176489 systemd-logind[1585]: New session 11 of user core.
May 17 10:34:12.186543 systemd[1]: Started session-11.scope - Session 11 of User core.
May 17 10:34:12.369521 sshd[5209]: Connection closed by 10.0.0.1 port 42230
May 17 10:34:12.369369 sshd-session[5207]: pam_unix(sshd:session): session closed for user core
May 17 10:34:12.382926 systemd[1]: sshd@10-10.0.0.118:22-10.0.0.1:42230.service: Deactivated successfully.
May 17 10:34:12.386712 systemd[1]: session-11.scope: Deactivated successfully.
May 17 10:34:12.390604 systemd-logind[1585]: Session 11 logged out. Waiting for processes to exit.
May 17 10:34:12.395705 systemd-logind[1585]: Removed session 11.
May 17 10:34:12.397982 systemd[1]: Started sshd@11-10.0.0.118:22-10.0.0.1:42240.service - OpenSSH per-connection server daemon (10.0.0.1:42240).
May 17 10:34:12.448675 sshd[5225]: Accepted publickey for core from 10.0.0.1 port 42240 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE
May 17 10:34:12.450290 sshd-session[5225]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 17 10:34:12.455711 systemd-logind[1585]: New session 12 of user core.
May 17 10:34:12.459577 systemd[1]: Started session-12.scope - Session 12 of User core.
May 17 10:34:12.564735 containerd[1593]: time="2025-05-17T10:34:12.564659666Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:12.565489 containerd[1593]: time="2025-05-17T10:34:12.565452696Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=14705639"
May 17 10:34:12.566481 containerd[1593]: time="2025-05-17T10:34:12.566442194Z" level=info msg="ImageCreate event name:\"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:12.568295 containerd[1593]: time="2025-05-17T10:34:12.568261313Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 17 10:34:12.568796 containerd[1593]: time="2025-05-17T10:34:12.568771371Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"16198294\" in 1.550045852s"
May 17 10:34:12.568854 containerd[1593]: time="2025-05-17T10:34:12.568798963Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\""
May 17 10:34:12.573191 containerd[1593]: time="2025-05-17T10:34:12.573076350Z" level=info msg="CreateContainer within sandbox \"af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
May 17 10:34:12.586435 containerd[1593]: time="2025-05-17T10:34:12.586092877Z" level=info msg="Container c1982f554bd5c4d549f9d7838c5901d8ee8f1b72ad16027191cf4d559f9da4c9: CDI devices from CRI Config.CDIDevices: []"
May 17 10:34:12.590252 sshd[5227]: Connection closed by 10.0.0.1 port 42240
May 17 10:34:12.590053 sshd-session[5225]: pam_unix(sshd:session): session closed for user core
May 17 10:34:12.597106 systemd[1]: sshd@11-10.0.0.118:22-10.0.0.1:42240.service: Deactivated successfully.
May 17 10:34:12.599665 systemd[1]: session-12.scope: Deactivated successfully.
May 17 10:34:12.600567 systemd-logind[1585]: Session 12 logged out. Waiting for processes to exit.
May 17 10:34:12.600732 containerd[1593]: time="2025-05-17T10:34:12.600638558Z" level=info msg="CreateContainer within sandbox \"af983269312bbacd798d81174fb4b820ed2ff37085db24fb9328330ac5f4fd92\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"c1982f554bd5c4d549f9d7838c5901d8ee8f1b72ad16027191cf4d559f9da4c9\""
May 17 10:34:12.601277 containerd[1593]: time="2025-05-17T10:34:12.601201146Z" level=info msg="StartContainer for \"c1982f554bd5c4d549f9d7838c5901d8ee8f1b72ad16027191cf4d559f9da4c9\""
May 17 10:34:12.602714 systemd-logind[1585]: Removed session 12.
May 17 10:34:12.602931 containerd[1593]: time="2025-05-17T10:34:12.602899637Z" level=info msg="connecting to shim c1982f554bd5c4d549f9d7838c5901d8ee8f1b72ad16027191cf4d559f9da4c9" address="unix:///run/containerd/s/0d7062d80e4f19481ef187dc54801b4906a22ff33936f631c1e3e706ccb2700c" protocol=ttrpc version=3
May 17 10:34:12.626549 systemd[1]: Started cri-containerd-c1982f554bd5c4d549f9d7838c5901d8ee8f1b72ad16027191cf4d559f9da4c9.scope - libcontainer container c1982f554bd5c4d549f9d7838c5901d8ee8f1b72ad16027191cf4d559f9da4c9.
May 17 10:34:12.675429 containerd[1593]: time="2025-05-17T10:34:12.675361851Z" level=info msg="StartContainer for \"c1982f554bd5c4d549f9d7838c5901d8ee8f1b72ad16027191cf4d559f9da4c9\" returns successfully"
May 17 10:34:12.806419 kubelet[2722]: I0517 10:34:12.806302 2722 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
May 17 10:34:12.806419 kubelet[2722]: I0517 10:34:12.806348 2722 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
May 17 10:34:12.927624 kubelet[2722]: I0517 10:34:12.927519 2722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-b4b84" podStartSLOduration=25.332220811 podStartE2EDuration="32.927500311s" podCreationTimestamp="2025-05-17 10:33:40 +0000 UTC" firstStartedPulling="2025-05-17 10:34:04.97470504 +0000 UTC m=+40.311105620" lastFinishedPulling="2025-05-17 10:34:12.56998453 +0000 UTC m=+47.906385120" observedRunningTime="2025-05-17 10:34:12.926732899 +0000 UTC m=+48.263133499" watchObservedRunningTime="2025-05-17 10:34:12.927500311 +0000 UTC m=+48.263900901"
May 17 10:34:15.751149 containerd[1593]: time="2025-05-17T10:34:15.751107166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\""
May 17 10:34:16.013582 containerd[1593]: time="2025-05-17T10:34:16.013457706Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 17 10:34:16.014709 containerd[1593]: time="2025-05-17T10:34:16.014678208Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden"
May 17 10:34:16.014764 containerd[1593]: time="2025-05-17T10:34:16.014747669Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86"
May 17 10:34:16.014990 kubelet[2722]: E0517 10:34:16.014934 2722 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 17 10:34:16.015315 kubelet[2722]: E0517 10:34:16.015001 2722 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 17 10:34:16.015315 kubelet[2722]: E0517 10:34:16.015120 2722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:265c6d0b859b42aa9e883fb177ea14f8,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7r95h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b77d4f8d-qhcgn_calico-system(6f72511d-6a3c-4ed6-87bd-7f2878038cc5): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 17 10:34:16.016975 containerd[1593]: time="2025-05-17T10:34:16.016941721Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\""
May 17 10:34:16.293948 containerd[1593]: time="2025-05-17T10:34:16.293813423Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 17 10:34:16.295087 containerd[1593]: time="2025-05-17T10:34:16.295035619Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden"
May 17 10:34:16.295260 containerd[1593]: time="2025-05-17T10:34:16.295121000Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86"
May 17 10:34:16.295310 kubelet[2722]: E0517 10:34:16.295261 2722 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403
Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 17 10:34:16.295424 kubelet[2722]: E0517 10:34:16.295320 2722 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 17 10:34:16.295521 kubelet[2722]: E0517 10:34:16.295470 2722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7r95h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&Secu
rityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b77d4f8d-qhcgn_calico-system(6f72511d-6a3c-4ed6-87bd-7f2878038cc5): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 17 10:34:16.296642 kubelet[2722]: E0517 10:34:16.296586 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6b77d4f8d-qhcgn" podUID="6f72511d-6a3c-4ed6-87bd-7f2878038cc5" May 17 10:34:17.615541 systemd[1]: Started sshd@12-10.0.0.118:22-10.0.0.1:60990.service - OpenSSH per-connection server daemon (10.0.0.1:60990). May 17 10:34:17.670938 sshd[5293]: Accepted publickey for core from 10.0.0.1 port 60990 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:17.672241 sshd-session[5293]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:17.676525 systemd-logind[1585]: New session 13 of user core. May 17 10:34:17.690645 systemd[1]: Started session-13.scope - Session 13 of User core. May 17 10:34:17.820324 sshd[5295]: Connection closed by 10.0.0.1 port 60990 May 17 10:34:17.820656 sshd-session[5293]: pam_unix(sshd:session): session closed for user core May 17 10:34:17.825006 systemd[1]: sshd@12-10.0.0.118:22-10.0.0.1:60990.service: Deactivated successfully. May 17 10:34:17.827112 systemd[1]: session-13.scope: Deactivated successfully. May 17 10:34:17.828016 systemd-logind[1585]: Session 13 logged out. Waiting for processes to exit. May 17 10:34:17.829434 systemd-logind[1585]: Removed session 13. 
May 17 10:34:19.131924 kubelet[2722]: I0517 10:34:19.131876 2722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 17 10:34:19.191838 containerd[1593]: time="2025-05-17T10:34:19.191783135Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5daa495876a1152e843814b68915f3cc9bafc5c0c3b022afcadf3b2b51b48c4c\" id:\"0119beca896862ff896459ba7574e37fcb88ef9f99b802d435728b6004fedf93\" pid:5319 exited_at:{seconds:1747478059 nanos:190818183}" May 17 10:34:19.239525 containerd[1593]: time="2025-05-17T10:34:19.239474192Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5daa495876a1152e843814b68915f3cc9bafc5c0c3b022afcadf3b2b51b48c4c\" id:\"816fd16f1e417a6ecc60d148670acd60ea15b9c29f20ba0667929ba4c9dddd66\" pid:5341 exited_at:{seconds:1747478059 nanos:239264457}" May 17 10:34:19.992779 kubelet[2722]: I0517 10:34:19.992742 2722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 17 10:34:22.837305 systemd[1]: Started sshd@13-10.0.0.118:22-10.0.0.1:32774.service - OpenSSH per-connection server daemon (10.0.0.1:32774). May 17 10:34:22.899420 sshd[5353]: Accepted publickey for core from 10.0.0.1 port 32774 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:22.900829 sshd-session[5353]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:22.904863 systemd-logind[1585]: New session 14 of user core. May 17 10:34:22.913634 systemd[1]: Started session-14.scope - Session 14 of User core. May 17 10:34:23.031324 sshd[5355]: Connection closed by 10.0.0.1 port 32774 May 17 10:34:23.031676 sshd-session[5353]: pam_unix(sshd:session): session closed for user core May 17 10:34:23.036280 systemd[1]: sshd@13-10.0.0.118:22-10.0.0.1:32774.service: Deactivated successfully. May 17 10:34:23.038338 systemd[1]: session-14.scope: Deactivated successfully. May 17 10:34:23.039043 systemd-logind[1585]: Session 14 logged out. Waiting for processes to exit. 
May 17 10:34:23.040367 systemd-logind[1585]: Removed session 14. May 17 10:34:26.751474 containerd[1593]: time="2025-05-17T10:34:26.750829940Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 17 10:34:27.011276 containerd[1593]: time="2025-05-17T10:34:27.011170398Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 17 10:34:27.013644 containerd[1593]: time="2025-05-17T10:34:27.013599800Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 17 10:34:27.013744 containerd[1593]: time="2025-05-17T10:34:27.013666405Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 17 10:34:27.013837 kubelet[2722]: E0517 10:34:27.013780 2722 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 17 10:34:27.013837 kubelet[2722]: E0517 10:34:27.013831 2722 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 17 10:34:27.014257 kubelet[2722]: E0517 10:34:27.013957 2722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t8p5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Comma
nd:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-b6dz4_calico-system(065bd371-4a25-4865-bbfe-0d2b88d6ea40): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 17 10:34:27.015146 kubelet[2722]: E0517 10:34:27.015107 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status 
from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-b6dz4" podUID="065bd371-4a25-4865-bbfe-0d2b88d6ea40" May 17 10:34:28.049212 systemd[1]: Started sshd@14-10.0.0.118:22-10.0.0.1:44550.service - OpenSSH per-connection server daemon (10.0.0.1:44550). May 17 10:34:28.109550 sshd[5379]: Accepted publickey for core from 10.0.0.1 port 44550 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:28.110933 sshd-session[5379]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:28.115262 systemd-logind[1585]: New session 15 of user core. May 17 10:34:28.126518 systemd[1]: Started session-15.scope - Session 15 of User core. May 17 10:34:28.258362 sshd[5381]: Connection closed by 10.0.0.1 port 44550 May 17 10:34:28.258669 sshd-session[5379]: pam_unix(sshd:session): session closed for user core May 17 10:34:28.263174 systemd[1]: sshd@14-10.0.0.118:22-10.0.0.1:44550.service: Deactivated successfully. May 17 10:34:28.265471 systemd[1]: session-15.scope: Deactivated successfully. May 17 10:34:28.266322 systemd-logind[1585]: Session 15 logged out. Waiting for processes to exit. May 17 10:34:28.267734 systemd-logind[1585]: Removed session 15. 
May 17 10:34:29.750787 kubelet[2722]: E0517 10:34:29.750739 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-6b77d4f8d-qhcgn" podUID="6f72511d-6a3c-4ed6-87bd-7f2878038cc5" May 17 10:34:29.978705 kubelet[2722]: I0517 10:34:29.978646 2722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 17 10:34:33.271914 systemd[1]: Started sshd@15-10.0.0.118:22-10.0.0.1:44552.service - OpenSSH per-connection server daemon (10.0.0.1:44552). May 17 10:34:33.310211 sshd[5401]: Accepted publickey for core from 10.0.0.1 port 44552 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:33.311948 sshd-session[5401]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:33.316375 systemd-logind[1585]: New session 16 of user core. May 17 10:34:33.329511 systemd[1]: Started session-16.scope - Session 16 of User core. May 17 10:34:33.441227 sshd[5403]: Connection closed by 10.0.0.1 port 44552 May 17 10:34:33.441689 sshd-session[5401]: pam_unix(sshd:session): session closed for user core May 17 10:34:33.451217 systemd[1]: sshd@15-10.0.0.118:22-10.0.0.1:44552.service: Deactivated successfully. May 17 10:34:33.453473 systemd[1]: session-16.scope: Deactivated successfully. May 17 10:34:33.454214 systemd-logind[1585]: Session 16 logged out. Waiting for processes to exit. May 17 10:34:33.457797 systemd[1]: Started sshd@16-10.0.0.118:22-10.0.0.1:44560.service - OpenSSH per-connection server daemon (10.0.0.1:44560). May 17 10:34:33.458376 systemd-logind[1585]: Removed session 16. 
May 17 10:34:33.519271 sshd[5416]: Accepted publickey for core from 10.0.0.1 port 44560 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:33.520890 sshd-session[5416]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:33.525489 systemd-logind[1585]: New session 17 of user core. May 17 10:34:33.534515 systemd[1]: Started session-17.scope - Session 17 of User core. May 17 10:34:33.735992 sshd[5418]: Connection closed by 10.0.0.1 port 44560 May 17 10:34:33.736312 sshd-session[5416]: pam_unix(sshd:session): session closed for user core May 17 10:34:33.745101 systemd[1]: sshd@16-10.0.0.118:22-10.0.0.1:44560.service: Deactivated successfully. May 17 10:34:33.746794 systemd[1]: session-17.scope: Deactivated successfully. May 17 10:34:33.747622 systemd-logind[1585]: Session 17 logged out. Waiting for processes to exit. May 17 10:34:33.750332 systemd[1]: Started sshd@17-10.0.0.118:22-10.0.0.1:44564.service - OpenSSH per-connection server daemon (10.0.0.1:44564). May 17 10:34:33.751187 systemd-logind[1585]: Removed session 17. May 17 10:34:33.807787 sshd[5431]: Accepted publickey for core from 10.0.0.1 port 44564 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:33.809333 sshd-session[5431]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:33.813652 systemd-logind[1585]: New session 18 of user core. May 17 10:34:33.824514 systemd[1]: Started session-18.scope - Session 18 of User core. May 17 10:34:35.610185 sshd[5433]: Connection closed by 10.0.0.1 port 44564 May 17 10:34:35.610848 sshd-session[5431]: pam_unix(sshd:session): session closed for user core May 17 10:34:35.626131 systemd[1]: sshd@17-10.0.0.118:22-10.0.0.1:44564.service: Deactivated successfully. May 17 10:34:35.629790 systemd[1]: session-18.scope: Deactivated successfully. May 17 10:34:35.630829 systemd[1]: session-18.scope: Consumed 627ms CPU time, 73M memory peak. 
May 17 10:34:35.632589 systemd-logind[1585]: Session 18 logged out. Waiting for processes to exit. May 17 10:34:35.636870 systemd[1]: Started sshd@18-10.0.0.118:22-10.0.0.1:44574.service - OpenSSH per-connection server daemon (10.0.0.1:44574). May 17 10:34:35.640123 systemd-logind[1585]: Removed session 18. May 17 10:34:35.704726 sshd[5454]: Accepted publickey for core from 10.0.0.1 port 44574 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:35.706402 sshd-session[5454]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:35.711120 systemd-logind[1585]: New session 19 of user core. May 17 10:34:35.722527 systemd[1]: Started session-19.scope - Session 19 of User core. May 17 10:34:36.094335 sshd[5456]: Connection closed by 10.0.0.1 port 44574 May 17 10:34:36.095621 sshd-session[5454]: pam_unix(sshd:session): session closed for user core May 17 10:34:36.108696 systemd[1]: sshd@18-10.0.0.118:22-10.0.0.1:44574.service: Deactivated successfully. May 17 10:34:36.110985 systemd[1]: session-19.scope: Deactivated successfully. May 17 10:34:36.112366 systemd-logind[1585]: Session 19 logged out. Waiting for processes to exit. May 17 10:34:36.116832 systemd[1]: Started sshd@19-10.0.0.118:22-10.0.0.1:44578.service - OpenSSH per-connection server daemon (10.0.0.1:44578). May 17 10:34:36.118237 systemd-logind[1585]: Removed session 19. May 17 10:34:36.168146 sshd[5469]: Accepted publickey for core from 10.0.0.1 port 44578 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:36.170283 sshd-session[5469]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:36.179977 systemd-logind[1585]: New session 20 of user core. May 17 10:34:36.183655 systemd[1]: Started session-20.scope - Session 20 of User core. 
May 17 10:34:36.306349 sshd[5471]: Connection closed by 10.0.0.1 port 44578 May 17 10:34:36.306653 sshd-session[5469]: pam_unix(sshd:session): session closed for user core May 17 10:34:36.311021 systemd[1]: sshd@19-10.0.0.118:22-10.0.0.1:44578.service: Deactivated successfully. May 17 10:34:36.313331 systemd[1]: session-20.scope: Deactivated successfully. May 17 10:34:36.314293 systemd-logind[1585]: Session 20 logged out. Waiting for processes to exit. May 17 10:34:36.315989 systemd-logind[1585]: Removed session 20. May 17 10:34:39.299676 containerd[1593]: time="2025-05-17T10:34:39.299618536Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ade0c9b7468e87fe2ae253d2c5bc98c41ef41381dc5e4572b64defa8e3f8730d\" id:\"142f4fb391f5e43d1d29372497e06c34b194e61b9aa93f16e6e2f2a6aefeab05\" pid:5494 exited_at:{seconds:1747478079 nanos:275047310}" May 17 10:34:39.752188 kubelet[2722]: E0517 10:34:39.752140 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-b6dz4" podUID="065bd371-4a25-4865-bbfe-0d2b88d6ea40" May 17 10:34:41.323207 systemd[1]: Started sshd@20-10.0.0.118:22-10.0.0.1:45652.service - OpenSSH per-connection server daemon (10.0.0.1:45652). May 17 10:34:41.395030 sshd[5508]: Accepted publickey for core from 10.0.0.1 port 45652 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:41.396694 sshd-session[5508]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:41.401162 systemd-logind[1585]: New session 21 of user core. May 17 10:34:41.411532 systemd[1]: Started session-21.scope - Session 21 of User core. 
May 17 10:34:41.535316 sshd[5510]: Connection closed by 10.0.0.1 port 45652 May 17 10:34:41.536008 sshd-session[5508]: pam_unix(sshd:session): session closed for user core May 17 10:34:41.540357 systemd[1]: sshd@20-10.0.0.118:22-10.0.0.1:45652.service: Deactivated successfully. May 17 10:34:41.542449 systemd[1]: session-21.scope: Deactivated successfully. May 17 10:34:41.543250 systemd-logind[1585]: Session 21 logged out. Waiting for processes to exit. May 17 10:34:41.544492 systemd-logind[1585]: Removed session 21. May 17 10:34:41.751806 containerd[1593]: time="2025-05-17T10:34:41.751496794Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 17 10:34:42.022502 containerd[1593]: time="2025-05-17T10:34:42.021606663Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 17 10:34:42.023831 containerd[1593]: time="2025-05-17T10:34:42.023799796Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 17 10:34:42.024003 containerd[1593]: time="2025-05-17T10:34:42.023907172Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 17 10:34:42.025572 kubelet[2722]: E0517 10:34:42.025437 2722 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 17 10:34:42.025572 kubelet[2722]: E0517 10:34:42.025525 2722 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 17 10:34:42.026216 kubelet[2722]: E0517 10:34:42.026154 2722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:265c6d0b859b42aa9e883fb177ea14f8,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7r95h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:n
il,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b77d4f8d-qhcgn_calico-system(6f72511d-6a3c-4ed6-87bd-7f2878038cc5): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 17 10:34:42.028507 containerd[1593]: time="2025-05-17T10:34:42.028472572Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 17 10:34:42.273336 containerd[1593]: time="2025-05-17T10:34:42.273214716Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 17 10:34:42.274513 containerd[1593]: time="2025-05-17T10:34:42.274458630Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 17 10:34:42.274605 containerd[1593]: 
time="2025-05-17T10:34:42.274481333Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 17 10:34:42.274745 kubelet[2722]: E0517 10:34:42.274705 2722 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 17 10:34:42.274806 kubelet[2722]: E0517 10:34:42.274754 2722 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 17 10:34:42.274933 kubelet[2722]: E0517 10:34:42.274862 2722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7r95h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6b77d4f8d-qhcgn_calico-system(6f72511d-6a3c-4ed6-87bd-7f2878038cc5): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 17 10:34:42.276110 kubelet[2722]: E0517 10:34:42.276062 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-6b77d4f8d-qhcgn" podUID="6f72511d-6a3c-4ed6-87bd-7f2878038cc5" May 17 10:34:46.553044 systemd[1]: Started sshd@21-10.0.0.118:22-10.0.0.1:37966.service - OpenSSH per-connection server daemon (10.0.0.1:37966). May 17 10:34:46.596094 sshd[5535]: Accepted publickey for core from 10.0.0.1 port 37966 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:46.597686 sshd-session[5535]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:46.602456 systemd-logind[1585]: New session 22 of user core. May 17 10:34:46.612632 systemd[1]: Started session-22.scope - Session 22 of User core. 
May 17 10:34:46.740861 sshd[5537]: Connection closed by 10.0.0.1 port 37966 May 17 10:34:46.741219 sshd-session[5535]: pam_unix(sshd:session): session closed for user core May 17 10:34:46.745106 systemd[1]: sshd@21-10.0.0.118:22-10.0.0.1:37966.service: Deactivated successfully. May 17 10:34:46.747338 systemd[1]: session-22.scope: Deactivated successfully. May 17 10:34:46.749009 systemd-logind[1585]: Session 22 logged out. Waiting for processes to exit. May 17 10:34:46.751682 systemd-logind[1585]: Removed session 22. May 17 10:34:49.184129 containerd[1593]: time="2025-05-17T10:34:49.184088245Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5daa495876a1152e843814b68915f3cc9bafc5c0c3b022afcadf3b2b51b48c4c\" id:\"c3ae17f3ebe4ea195b2fea2a826a882965c8f4b2d6957190e4efebd96379665c\" pid:5561 exited_at:{seconds:1747478089 nanos:183848187}" May 17 10:34:50.751224 containerd[1593]: time="2025-05-17T10:34:50.751181431Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 17 10:34:51.046342 containerd[1593]: time="2025-05-17T10:34:51.046214957Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 17 10:34:51.047306 containerd[1593]: time="2025-05-17T10:34:51.047273165Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 17 10:34:51.047424 containerd[1593]: time="2025-05-17T10:34:51.047348429Z" level=info msg="stop pulling 
image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 17 10:34:51.047542 kubelet[2722]: E0517 10:34:51.047498 2722 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 17 10:34:51.047880 kubelet[2722]: E0517 10:34:51.047551 2722 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 17 10:34:51.047880 kubelet[2722]: E0517 10:34:51.047693 2722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t8p5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-b6dz4_calico-system(065bd371-4a25-4865-bbfe-0d2b88d6ea40): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 17 10:34:51.048846 kubelet[2722]: E0517 10:34:51.048817 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-b6dz4" podUID="065bd371-4a25-4865-bbfe-0d2b88d6ea40" May 17 10:34:51.758486 systemd[1]: Started 
sshd@22-10.0.0.118:22-10.0.0.1:37970.service - OpenSSH per-connection server daemon (10.0.0.1:37970). May 17 10:34:51.822558 sshd[5572]: Accepted publickey for core from 10.0.0.1 port 37970 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:51.824298 sshd-session[5572]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:51.828954 systemd-logind[1585]: New session 23 of user core. May 17 10:34:51.835608 systemd[1]: Started session-23.scope - Session 23 of User core. May 17 10:34:51.989472 sshd[5574]: Connection closed by 10.0.0.1 port 37970 May 17 10:34:51.990663 sshd-session[5572]: pam_unix(sshd:session): session closed for user core May 17 10:34:51.994371 systemd-logind[1585]: Session 23 logged out. Waiting for processes to exit. May 17 10:34:51.996697 systemd[1]: sshd@22-10.0.0.118:22-10.0.0.1:37970.service: Deactivated successfully. May 17 10:34:51.999336 systemd[1]: session-23.scope: Deactivated successfully. May 17 10:34:52.002403 systemd-logind[1585]: Removed session 23. May 17 10:34:53.751800 kubelet[2722]: E0517 10:34:53.751753 2722 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-6b77d4f8d-qhcgn" podUID="6f72511d-6a3c-4ed6-87bd-7f2878038cc5" May 17 10:34:57.001744 systemd[1]: Started sshd@23-10.0.0.118:22-10.0.0.1:52834.service - OpenSSH per-connection server daemon (10.0.0.1:52834). 
May 17 10:34:57.046862 sshd[5588]: Accepted publickey for core from 10.0.0.1 port 52834 ssh2: RSA SHA256:fqd0Zw1c0TOc8VjEN/TY5HphIWm94006yyZoyFyzIuE May 17 10:34:57.048332 sshd-session[5588]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 17 10:34:57.052651 systemd-logind[1585]: New session 24 of user core. May 17 10:34:57.060533 systemd[1]: Started session-24.scope - Session 24 of User core. May 17 10:34:57.168599 sshd[5590]: Connection closed by 10.0.0.1 port 52834 May 17 10:34:57.168973 sshd-session[5588]: pam_unix(sshd:session): session closed for user core May 17 10:34:57.172693 systemd[1]: sshd@23-10.0.0.118:22-10.0.0.1:52834.service: Deactivated successfully. May 17 10:34:57.174881 systemd[1]: session-24.scope: Deactivated successfully. May 17 10:34:57.175771 systemd-logind[1585]: Session 24 logged out. Waiting for processes to exit. May 17 10:34:57.177882 systemd-logind[1585]: Removed session 24.