Sep 12 23:08:08.847006 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 20:38:35 -00 2025
Sep 12 23:08:08.847030 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=8e60d6befc710e967d67e9a1d87ced7416895090c99a765b3a00e66a62f49e40
Sep 12 23:08:08.847042 kernel: BIOS-provided physical RAM map:
Sep 12 23:08:08.847049 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000002ffff] usable
Sep 12 23:08:08.847055 kernel: BIOS-e820: [mem 0x0000000000030000-0x000000000004ffff] reserved
Sep 12 23:08:08.847062 kernel: BIOS-e820: [mem 0x0000000000050000-0x000000000009efff] usable
Sep 12 23:08:08.847070 kernel: BIOS-e820: [mem 0x000000000009f000-0x000000000009ffff] reserved
Sep 12 23:08:08.847076 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009b8ecfff] usable
Sep 12 23:08:08.847085 kernel: BIOS-e820: [mem 0x000000009b8ed000-0x000000009bb6cfff] reserved
Sep 12 23:08:08.847092 kernel: BIOS-e820: [mem 0x000000009bb6d000-0x000000009bb7efff] ACPI data
Sep 12 23:08:08.847099 kernel: BIOS-e820: [mem 0x000000009bb7f000-0x000000009bbfefff] ACPI NVS
Sep 12 23:08:08.847109 kernel: BIOS-e820: [mem 0x000000009bbff000-0x000000009bfb0fff] usable
Sep 12 23:08:08.847115 kernel: BIOS-e820: [mem 0x000000009bfb1000-0x000000009bfb4fff] reserved
Sep 12 23:08:08.847122 kernel: BIOS-e820: [mem 0x000000009bfb5000-0x000000009bfb6fff] ACPI NVS
Sep 12 23:08:08.847130 kernel: BIOS-e820: [mem 0x000000009bfb7000-0x000000009bffffff] usable
Sep 12 23:08:08.847138 kernel: BIOS-e820: [mem 0x000000009c000000-0x000000009cffffff] reserved
Sep 12 23:08:08.847150 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Sep 12 23:08:08.847157 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 12 23:08:08.847165 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Sep 12 23:08:08.847172 kernel: NX (Execute Disable) protection: active
Sep 12 23:08:08.847179 kernel: APIC: Static calls initialized
Sep 12 23:08:08.847186 kernel: e820: update [mem 0x9a13f018-0x9a148c57] usable ==> usable
Sep 12 23:08:08.847194 kernel: e820: update [mem 0x9a102018-0x9a13ee57] usable ==> usable
Sep 12 23:08:08.847201 kernel: extended physical RAM map:
Sep 12 23:08:08.847208 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000002ffff] usable
Sep 12 23:08:08.847216 kernel: reserve setup_data: [mem 0x0000000000030000-0x000000000004ffff] reserved
Sep 12 23:08:08.847223 kernel: reserve setup_data: [mem 0x0000000000050000-0x000000000009efff] usable
Sep 12 23:08:08.847233 kernel: reserve setup_data: [mem 0x000000000009f000-0x000000000009ffff] reserved
Sep 12 23:08:08.847240 kernel: reserve setup_data: [mem 0x0000000000100000-0x000000009a102017] usable
Sep 12 23:08:08.847247 kernel: reserve setup_data: [mem 0x000000009a102018-0x000000009a13ee57] usable
Sep 12 23:08:08.847254 kernel: reserve setup_data: [mem 0x000000009a13ee58-0x000000009a13f017] usable
Sep 12 23:08:08.847261 kernel: reserve setup_data: [mem 0x000000009a13f018-0x000000009a148c57] usable
Sep 12 23:08:08.847268 kernel: reserve setup_data: [mem 0x000000009a148c58-0x000000009b8ecfff] usable
Sep 12 23:08:08.847276 kernel: reserve setup_data: [mem 0x000000009b8ed000-0x000000009bb6cfff] reserved
Sep 12 23:08:08.847283 kernel: reserve setup_data: [mem 0x000000009bb6d000-0x000000009bb7efff] ACPI data
Sep 12 23:08:08.847290 kernel: reserve setup_data: [mem 0x000000009bb7f000-0x000000009bbfefff] ACPI NVS
Sep 12 23:08:08.847297 kernel: reserve setup_data: [mem 0x000000009bbff000-0x000000009bfb0fff] usable
Sep 12 23:08:08.847304 kernel: reserve setup_data: [mem 0x000000009bfb1000-0x000000009bfb4fff] reserved
Sep 12 23:08:08.847314 kernel: reserve setup_data: [mem 0x000000009bfb5000-0x000000009bfb6fff] ACPI NVS
Sep 12 23:08:08.847321 kernel: reserve setup_data: [mem 0x000000009bfb7000-0x000000009bffffff] usable
Sep 12 23:08:08.847332 kernel: reserve setup_data: [mem 0x000000009c000000-0x000000009cffffff] reserved
Sep 12 23:08:08.847340 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Sep 12 23:08:08.847353 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 12 23:08:08.847360 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Sep 12 23:08:08.847381 kernel: efi: EFI v2.7 by EDK II
Sep 12 23:08:08.847410 kernel: efi: SMBIOS=0x9b9d5000 ACPI=0x9bb7e000 ACPI 2.0=0x9bb7e014 MEMATTR=0x9a1af018 RNG=0x9bb73018
Sep 12 23:08:08.847435 kernel: random: crng init done
Sep 12 23:08:08.847443 kernel: Kernel is locked down from EFI Secure Boot; see man kernel_lockdown.7
Sep 12 23:08:08.847451 kernel: secureboot: Secure boot enabled
Sep 12 23:08:08.847458 kernel: SMBIOS 2.8 present.
Sep 12 23:08:08.847468 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Sep 12 23:08:08.847479 kernel: DMI: Memory slots populated: 1/1
Sep 12 23:08:08.847489 kernel: Hypervisor detected: KVM
Sep 12 23:08:08.847499 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 12 23:08:08.847508 kernel: kvm-clock: using sched offset of 6450555196 cycles
Sep 12 23:08:08.847520 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 12 23:08:08.847528 kernel: tsc: Detected 2794.748 MHz processor
Sep 12 23:08:08.847536 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 12 23:08:08.847543 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 12 23:08:08.847551 kernel: last_pfn = 0x9c000 max_arch_pfn = 0x400000000
Sep 12 23:08:08.847559 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Sep 12 23:08:08.847570 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 12 23:08:08.847580 kernel: Using GB pages for direct mapping
Sep 12 23:08:08.847589 kernel: ACPI: Early table checksum verification disabled
Sep 12 23:08:08.847599 kernel: ACPI: RSDP 0x000000009BB7E014 000024 (v02 BOCHS )
Sep 12 23:08:08.847607 kernel: ACPI: XSDT 0x000000009BB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Sep 12 23:08:08.847615 kernel: ACPI: FACP 0x000000009BB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:08:08.847623 kernel: ACPI: DSDT 0x000000009BB7A000 002237 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:08:08.847631 kernel: ACPI: FACS 0x000000009BBDD000 000040
Sep 12 23:08:08.847638 kernel: ACPI: APIC 0x000000009BB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:08:08.847646 kernel: ACPI: HPET 0x000000009BB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:08:08.847654 kernel: ACPI: MCFG 0x000000009BB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:08:08.847662 kernel: ACPI: WAET 0x000000009BB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:08:08.847672 kernel: ACPI: BGRT 0x000000009BB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Sep 12 23:08:08.847680 kernel: ACPI: Reserving FACP table memory at [mem 0x9bb79000-0x9bb790f3]
Sep 12 23:08:08.847688 kernel: ACPI: Reserving DSDT table memory at [mem 0x9bb7a000-0x9bb7c236]
Sep 12 23:08:08.847695 kernel: ACPI: Reserving FACS table memory at [mem 0x9bbdd000-0x9bbdd03f]
Sep 12 23:08:08.847703 kernel: ACPI: Reserving APIC table memory at [mem 0x9bb78000-0x9bb7808f]
Sep 12 23:08:08.847711 kernel: ACPI: Reserving HPET table memory at [mem 0x9bb77000-0x9bb77037]
Sep 12 23:08:08.847718 kernel: ACPI: Reserving MCFG table memory at [mem 0x9bb76000-0x9bb7603b]
Sep 12 23:08:08.847726 kernel: ACPI: Reserving WAET table memory at [mem 0x9bb75000-0x9bb75027]
Sep 12 23:08:08.847733 kernel: ACPI: Reserving BGRT table memory at [mem 0x9bb74000-0x9bb74037]
Sep 12 23:08:08.847744 kernel: No NUMA configuration found
Sep 12 23:08:08.847751 kernel: Faking a node at [mem 0x0000000000000000-0x000000009bffffff]
Sep 12 23:08:08.847759 kernel: NODE_DATA(0) allocated [mem 0x9bf57dc0-0x9bf5efff]
Sep 12 23:08:08.847767 kernel: Zone ranges:
Sep 12 23:08:08.847774 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 12 23:08:08.847782 kernel: DMA32 [mem 0x0000000001000000-0x000000009bffffff]
Sep 12 23:08:08.847802 kernel: Normal empty
Sep 12 23:08:08.847827 kernel: Device empty
Sep 12 23:08:08.847835 kernel: Movable zone start for each node
Sep 12 23:08:08.847846 kernel: Early memory node ranges
Sep 12 23:08:08.847855 kernel: node 0: [mem 0x0000000000001000-0x000000000002ffff]
Sep 12 23:08:08.847862 kernel: node 0: [mem 0x0000000000050000-0x000000000009efff]
Sep 12 23:08:08.847870 kernel: node 0: [mem 0x0000000000100000-0x000000009b8ecfff]
Sep 12 23:08:08.847878 kernel: node 0: [mem 0x000000009bbff000-0x000000009bfb0fff]
Sep 12 23:08:08.847885 kernel: node 0: [mem 0x000000009bfb7000-0x000000009bffffff]
Sep 12 23:08:08.847893 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009bffffff]
Sep 12 23:08:08.847901 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 12 23:08:08.847908 kernel: On node 0, zone DMA: 32 pages in unavailable ranges
Sep 12 23:08:08.847918 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 12 23:08:08.847926 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Sep 12 23:08:08.847934 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Sep 12 23:08:08.847942 kernel: On node 0, zone DMA32: 16384 pages in unavailable ranges
Sep 12 23:08:08.847950 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 12 23:08:08.847957 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 12 23:08:08.847966 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 12 23:08:08.847975 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 12 23:08:08.847983 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 12 23:08:08.847995 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 12 23:08:08.848003 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 12 23:08:08.848010 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 12 23:08:08.848018 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 12 23:08:08.848025 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 12 23:08:08.848033 kernel: TSC deadline timer available
Sep 12 23:08:08.848041 kernel: CPU topo: Max. logical packages: 1
Sep 12 23:08:08.848048 kernel: CPU topo: Max. logical dies: 1
Sep 12 23:08:08.848056 kernel: CPU topo: Max. dies per package: 1
Sep 12 23:08:08.848073 kernel: CPU topo: Max. threads per core: 1
Sep 12 23:08:08.848081 kernel: CPU topo: Num. cores per package: 4
Sep 12 23:08:08.848089 kernel: CPU topo: Num. threads per package: 4
Sep 12 23:08:08.848099 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Sep 12 23:08:08.848110 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 12 23:08:08.848118 kernel: kvm-guest: KVM setup pv remote TLB flush
Sep 12 23:08:08.848125 kernel: kvm-guest: setup PV sched yield
Sep 12 23:08:08.848133 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices
Sep 12 23:08:08.848143 kernel: Booting paravirtualized kernel on KVM
Sep 12 23:08:08.848152 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 12 23:08:08.848160 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Sep 12 23:08:08.848168 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288
Sep 12 23:08:08.848176 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152
Sep 12 23:08:08.848184 kernel: pcpu-alloc: [0] 0 1 2 3
Sep 12 23:08:08.848191 kernel: kvm-guest: PV spinlocks enabled
Sep 12 23:08:08.848199 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 12 23:08:08.848209 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=8e60d6befc710e967d67e9a1d87ced7416895090c99a765b3a00e66a62f49e40
Sep 12 23:08:08.848219 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 23:08:08.848227 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 12 23:08:08.848235 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 23:08:08.848243 kernel: Fallback order for Node 0: 0
Sep 12 23:08:08.848251 kernel: Built 1 zonelists, mobility grouping on. Total pages: 638054
Sep 12 23:08:08.848259 kernel: Policy zone: DMA32
Sep 12 23:08:08.848267 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 23:08:08.848275 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 12 23:08:08.848285 kernel: ftrace: allocating 40125 entries in 157 pages
Sep 12 23:08:08.848293 kernel: ftrace: allocated 157 pages with 5 groups
Sep 12 23:08:08.848301 kernel: Dynamic Preempt: voluntary
Sep 12 23:08:08.848309 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 23:08:08.848318 kernel: rcu: RCU event tracing is enabled.
Sep 12 23:08:08.848326 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 12 23:08:08.848334 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 23:08:08.848343 kernel: Rude variant of Tasks RCU enabled.
Sep 12 23:08:08.848351 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 23:08:08.848361 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 23:08:08.848369 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 12 23:08:08.848377 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 12 23:08:08.848386 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 12 23:08:08.848396 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 12 23:08:08.848404 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Sep 12 23:08:08.848413 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 23:08:08.848514 kernel: Console: colour dummy device 80x25
Sep 12 23:08:08.848523 kernel: printk: legacy console [ttyS0] enabled
Sep 12 23:08:08.848535 kernel: ACPI: Core revision 20240827
Sep 12 23:08:08.848543 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 12 23:08:08.848551 kernel: APIC: Switch to symmetric I/O mode setup
Sep 12 23:08:08.848559 kernel: x2apic enabled
Sep 12 23:08:08.848567 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 12 23:08:08.848575 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Sep 12 23:08:08.848583 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Sep 12 23:08:08.848591 kernel: kvm-guest: setup PV IPIs
Sep 12 23:08:08.848599 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 12 23:08:08.848609 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns
Sep 12 23:08:08.848617 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794748)
Sep 12 23:08:08.848625 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 12 23:08:08.848633 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 12 23:08:08.848641 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 12 23:08:08.848652 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 12 23:08:08.848660 kernel: Spectre V2 : Mitigation: Retpolines
Sep 12 23:08:08.848668 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 12 23:08:08.848676 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 12 23:08:08.848686 kernel: active return thunk: retbleed_return_thunk
Sep 12 23:08:08.848694 kernel: RETBleed: Mitigation: untrained return thunk
Sep 12 23:08:08.848703 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 12 23:08:08.848711 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 12 23:08:08.848719 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Sep 12 23:08:08.848728 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Sep 12 23:08:08.848736 kernel: active return thunk: srso_return_thunk
Sep 12 23:08:08.848744 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Sep 12 23:08:08.848754 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 12 23:08:08.848762 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 12 23:08:08.848770 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 12 23:08:08.848778 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 12 23:08:08.848786 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 12 23:08:08.848794 kernel: Freeing SMP alternatives memory: 32K
Sep 12 23:08:08.848802 kernel: pid_max: default: 32768 minimum: 301
Sep 12 23:08:08.848817 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 23:08:08.848825 kernel: landlock: Up and running.
Sep 12 23:08:08.848835 kernel: SELinux: Initializing.
Sep 12 23:08:08.848843 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 23:08:08.848851 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 23:08:08.848859 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 12 23:08:08.848867 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 12 23:08:08.848876 kernel: ... version: 0
Sep 12 23:08:08.848886 kernel: ... bit width: 48
Sep 12 23:08:08.848894 kernel: ... generic registers: 6
Sep 12 23:08:08.848901 kernel: ... value mask: 0000ffffffffffff
Sep 12 23:08:08.848912 kernel: ... max period: 00007fffffffffff
Sep 12 23:08:08.848924 kernel: ... fixed-purpose events: 0
Sep 12 23:08:08.848932 kernel: ... event mask: 000000000000003f
Sep 12 23:08:08.848940 kernel: signal: max sigframe size: 1776
Sep 12 23:08:08.848948 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 23:08:08.848957 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 23:08:08.848965 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 12 23:08:08.848973 kernel: smp: Bringing up secondary CPUs ...
Sep 12 23:08:08.848983 kernel: smpboot: x86: Booting SMP configuration:
Sep 12 23:08:08.848999 kernel: .... node #0, CPUs: #1 #2 #3
Sep 12 23:08:08.849015 kernel: smp: Brought up 1 node, 4 CPUs
Sep 12 23:08:08.849027 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS)
Sep 12 23:08:08.849037 kernel: Memory: 2409224K/2552216K available (14336K kernel code, 2432K rwdata, 9992K rodata, 54084K init, 2880K bss, 137064K reserved, 0K cma-reserved)
Sep 12 23:08:08.849050 kernel: devtmpfs: initialized
Sep 12 23:08:08.849061 kernel: x86/mm: Memory block size: 128MB
Sep 12 23:08:08.849073 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9bb7f000-0x9bbfefff] (524288 bytes)
Sep 12 23:08:08.849084 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9bfb5000-0x9bfb6fff] (8192 bytes)
Sep 12 23:08:08.849094 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 23:08:08.849108 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 12 23:08:08.849118 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 23:08:08.849128 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 23:08:08.849139 kernel: audit: initializing netlink subsys (disabled)
Sep 12 23:08:08.849149 kernel: audit: type=2000 audit(1757718485.952:1): state=initialized audit_enabled=0 res=1
Sep 12 23:08:08.849159 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 23:08:08.849170 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 12 23:08:08.849180 kernel: cpuidle: using governor menu
Sep 12 23:08:08.849191 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 23:08:08.849204 kernel: dca service started, version 1.12.1
Sep 12 23:08:08.849214 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Sep 12 23:08:08.849225 kernel: PCI: Using configuration type 1 for base access
Sep 12 23:08:08.849236 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 12 23:08:08.849246 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 23:08:08.849255 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 23:08:08.849263 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 23:08:08.849271 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 23:08:08.849278 kernel: ACPI: Added _OSI(Module Device)
Sep 12 23:08:08.849302 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 23:08:08.849310 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 23:08:08.849326 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 23:08:08.849344 kernel: ACPI: Interpreter enabled
Sep 12 23:08:08.849353 kernel: ACPI: PM: (supports S0 S5)
Sep 12 23:08:08.849361 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 12 23:08:08.849369 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 12 23:08:08.849377 kernel: PCI: Using E820 reservations for host bridge windows
Sep 12 23:08:08.849384 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 12 23:08:08.849396 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 12 23:08:08.849642 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 23:08:08.849775 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 12 23:08:08.849913 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 12 23:08:08.849928 kernel: PCI host bridge to bus 0000:00
Sep 12 23:08:08.850126 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 12 23:08:08.850267 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 12 23:08:08.850442 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 12 23:08:08.850668 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window]
Sep 12 23:08:08.850844 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Sep 12 23:08:08.850990 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window]
Sep 12 23:08:08.851128 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 12 23:08:08.851322 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Sep 12 23:08:08.851534 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
Sep 12 23:08:08.851694 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref]
Sep 12 23:08:08.851854 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff]
Sep 12 23:08:08.852735 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Sep 12 23:08:08.852960 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 12 23:08:08.853199 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 12 23:08:08.853363 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f]
Sep 12 23:08:08.853555 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff]
Sep 12 23:08:08.853718 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref]
Sep 12 23:08:08.853918 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Sep 12 23:08:08.854089 kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f]
Sep 12 23:08:08.854249 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff]
Sep 12 23:08:08.854410 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref]
Sep 12 23:08:08.854761 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 12 23:08:08.854950 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff]
Sep 12 23:08:08.855117 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff]
Sep 12 23:08:08.855276 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref]
Sep 12 23:08:08.855463 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]
Sep 12 23:08:08.855652 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Sep 12 23:08:08.855824 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 12 23:08:08.856016 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Sep 12 23:08:08.856182 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df]
Sep 12 23:08:08.856343 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff]
Sep 12 23:08:08.857614 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Sep 12 23:08:08.857796 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf]
Sep 12 23:08:08.857827 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 12 23:08:08.857840 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 12 23:08:08.857861 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 12 23:08:08.857873 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 12 23:08:08.857884 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 12 23:08:08.857895 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 12 23:08:08.857906 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 12 23:08:08.857918 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 12 23:08:08.857929 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 12 23:08:08.857940 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 12 23:08:08.857951 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 12 23:08:08.857966 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 12 23:08:08.857978 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 12 23:08:08.857992 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 12 23:08:08.858004 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 12 23:08:08.858016 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 12 23:08:08.858027 kernel: iommu: Default domain type: Translated
Sep 12 23:08:08.858039 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 12 23:08:08.858050 kernel: efivars: Registered efivars operations
Sep 12 23:08:08.858061 kernel: PCI: Using ACPI for IRQ routing
Sep 12 23:08:08.858072 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 12 23:08:08.858088 kernel: e820: reserve RAM buffer [mem 0x0009f000-0x0009ffff]
Sep 12 23:08:08.858099 kernel: e820: reserve RAM buffer [mem 0x9a102018-0x9bffffff]
Sep 12 23:08:08.858110 kernel: e820: reserve RAM buffer [mem 0x9a13f018-0x9bffffff]
Sep 12 23:08:08.858121 kernel: e820: reserve RAM buffer [mem 0x9b8ed000-0x9bffffff]
Sep 12 23:08:08.858132 kernel: e820: reserve RAM buffer [mem 0x9bfb1000-0x9bffffff]
Sep 12 23:08:08.858304 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 12 23:08:08.858494 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 12 23:08:08.858654 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 12 23:08:08.858676 kernel: vgaarb: loaded
Sep 12 23:08:08.858688 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 12 23:08:08.858699 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 12 23:08:08.858712 kernel: clocksource: Switched to clocksource kvm-clock
Sep 12 23:08:08.858723 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 23:08:08.858735 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 23:08:08.858746 kernel: pnp: PnP ACPI init
Sep 12 23:08:08.858995 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved
Sep 12 23:08:08.859020 kernel: pnp: PnP ACPI: found 6 devices
Sep 12 23:08:08.859031 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 12 23:08:08.859043 kernel: NET: Registered PF_INET protocol family
Sep 12 23:08:08.859056 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 23:08:08.859068 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 12 23:08:08.859081 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 23:08:08.859095 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 12 23:08:08.859108 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 12 23:08:08.859119 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 12 23:08:08.859134 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 23:08:08.859145 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 23:08:08.859157 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 23:08:08.859168 kernel: NET: Registered PF_XDP protocol family
Sep 12 23:08:08.859346 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window
Sep 12 23:08:08.861215 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned
Sep 12 23:08:08.861362 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 12 23:08:08.861505 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 12 23:08:08.861631 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 12 23:08:08.861749 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window]
Sep 12 23:08:08.861884 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window]
Sep 12 23:08:08.861998 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window]
Sep 12 23:08:08.862010 kernel: PCI: CLS 0 bytes, default 64
Sep 12 23:08:08.862019 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns
Sep 12 23:08:08.862028 kernel: Initialise system trusted keyrings
Sep 12 23:08:08.862038 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 12 23:08:08.862048 kernel: Key type asymmetric registered
Sep 12 23:08:08.862062 kernel: Asymmetric key parser 'x509' registered
Sep 12 23:08:08.862091 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 12 23:08:08.862103 kernel: io scheduler mq-deadline registered
Sep 12 23:08:08.862111 kernel: io scheduler kyber registered
Sep 12 23:08:08.862120 kernel: io scheduler bfq registered
Sep 12 23:08:08.862129 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 12 23:08:08.862138 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 12 23:08:08.862147 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Sep 12 23:08:08.862156 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Sep 12 23:08:08.862167 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 12 23:08:08.862176 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 12 23:08:08.862185 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 12 23:08:08.862193 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 12 23:08:08.862202 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 12 23:08:08.862211 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Sep 12 23:08:08.862396 kernel: rtc_cmos 00:04: RTC can wake from S4
Sep 12 23:08:08.862575 kernel: rtc_cmos 00:04: registered as rtc0
Sep 12 23:08:08.862708 kernel: rtc_cmos 00:04: setting system clock to 2025-09-12T23:08:08 UTC (1757718488)
Sep 12 23:08:08.862846 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Sep 12 23:08:08.862858 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 12 23:08:08.862868 kernel: efifb: probing for efifb
Sep 12 23:08:08.862876 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k
Sep 12 23:08:08.862885 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1
Sep 12 23:08:08.862893 kernel: efifb: scrolling: redraw
Sep 12 23:08:08.862902 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Sep 12 23:08:08.862914 kernel: Console: switching to colour frame buffer device 160x50
Sep 12 23:08:08.862923 kernel: fb0: EFI VGA frame buffer device
Sep 12 23:08:08.862934 kernel: pstore: Using crash dump compression: deflate
Sep 12 23:08:08.862942 kernel: pstore: Registered efi_pstore as persistent store backend
Sep 12 23:08:08.862952 kernel: NET: Registered PF_INET6 protocol family
Sep 12 23:08:08.862960 kernel: Segment Routing with IPv6
Sep 12 23:08:08.862971 kernel: In-situ OAM (IOAM) with IPv6
Sep 12 23:08:08.862980 kernel: NET: Registered PF_PACKET protocol family
Sep 12 23:08:08.862988 kernel: Key type dns_resolver registered
Sep 12 23:08:08.862997 kernel: IPI shorthand broadcast: enabled
Sep 12 23:08:08.863005 kernel: sched_clock: Marking stable (3690005260, 139952704)->(3923873434, -93915470)
Sep 12 23:08:08.863014 kernel: registered taskstats version 1
Sep 12 23:08:08.863022 kernel: Loading compiled-in X.509 certificates
Sep 12 23:08:08.863031 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: c3297a5801573420030c321362a802da1fd49c4e'
Sep 12 23:08:08.863040 kernel: Demotion targets for Node 0: null
Sep 12 23:08:08.863051 kernel: Key type .fscrypt registered
Sep 12 23:08:08.863060 kernel: Key type fscrypt-provisioning registered
Sep 12 23:08:08.863068 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 12 23:08:08.863077 kernel: ima: Allocated hash algorithm: sha1
Sep 12 23:08:08.863086 kernel: ima: No architecture policies found
Sep 12 23:08:08.863094 kernel: clk: Disabling unused clocks
Sep 12 23:08:08.863103 kernel: Warning: unable to open an initial console.
Sep 12 23:08:08.863112 kernel: Freeing unused kernel image (initmem) memory: 54084K Sep 12 23:08:08.863120 kernel: Write protecting the kernel read-only data: 24576k Sep 12 23:08:08.863131 kernel: Freeing unused kernel image (rodata/data gap) memory: 248K Sep 12 23:08:08.863140 kernel: Run /init as init process Sep 12 23:08:08.863148 kernel: with arguments: Sep 12 23:08:08.863157 kernel: /init Sep 12 23:08:08.863166 kernel: with environment: Sep 12 23:08:08.863174 kernel: HOME=/ Sep 12 23:08:08.863182 kernel: TERM=linux Sep 12 23:08:08.863191 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 23:08:08.863205 systemd[1]: Successfully made /usr/ read-only. Sep 12 23:08:08.863222 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 23:08:08.863232 systemd[1]: Detected virtualization kvm. Sep 12 23:08:08.863241 systemd[1]: Detected architecture x86-64. Sep 12 23:08:08.863250 systemd[1]: Running in initrd. Sep 12 23:08:08.863259 systemd[1]: No hostname configured, using default hostname. Sep 12 23:08:08.863268 systemd[1]: Hostname set to . Sep 12 23:08:08.863277 systemd[1]: Initializing machine ID from VM UUID. Sep 12 23:08:08.863288 systemd[1]: Queued start job for default target initrd.target. Sep 12 23:08:08.863298 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 23:08:08.863307 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 23:08:08.863317 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
Sep 12 23:08:08.863326 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 23:08:08.863335 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 23:08:08.863345 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 23:08:08.863358 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 23:08:08.863368 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 23:08:08.863377 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 23:08:08.863386 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 23:08:08.863395 systemd[1]: Reached target paths.target - Path Units. Sep 12 23:08:08.863407 systemd[1]: Reached target slices.target - Slice Units. Sep 12 23:08:08.863430 systemd[1]: Reached target swap.target - Swaps. Sep 12 23:08:08.863440 systemd[1]: Reached target timers.target - Timer Units. Sep 12 23:08:08.863452 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 23:08:08.863461 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 23:08:08.863470 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 23:08:08.863479 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 12 23:08:08.863488 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 23:08:08.863497 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 23:08:08.863506 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 23:08:08.863515 systemd[1]: Reached target sockets.target - Socket Units. 
Sep 12 23:08:08.863527 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 23:08:08.863536 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 23:08:08.863544 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 12 23:08:08.863554 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 23:08:08.863563 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 23:08:08.863572 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 23:08:08.863581 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 23:08:08.863590 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 23:08:08.863599 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 23:08:08.863611 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 23:08:08.863620 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 23:08:08.863630 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 23:08:08.863668 systemd-journald[220]: Collecting audit messages is disabled. Sep 12 23:08:08.863695 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 23:08:08.863705 systemd-journald[220]: Journal started Sep 12 23:08:08.863728 systemd-journald[220]: Runtime Journal (/run/log/journal/72dee40115664f57bb7dbbef2b70c47c) is 6M, max 48.2M, 42.2M free. Sep 12 23:08:08.847862 systemd-modules-load[221]: Inserted module 'overlay' Sep 12 23:08:08.866229 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 23:08:08.869447 systemd[1]: Started systemd-journald.service - Journal Service. 
Sep 12 23:08:08.871127 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 23:08:08.879456 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 12 23:08:08.881664 systemd-modules-load[221]: Inserted module 'br_netfilter' Sep 12 23:08:08.882929 kernel: Bridge firewalling registered Sep 12 23:08:08.882602 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 23:08:08.884970 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 23:08:08.889412 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 23:08:08.894555 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 23:08:08.894891 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 23:08:08.895194 systemd-tmpfiles[244]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 23:08:08.900273 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 23:08:08.912697 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 23:08:08.914696 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 23:08:08.930636 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 23:08:08.934890 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Sep 12 23:08:08.963668 dracut-cmdline[264]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=8e60d6befc710e967d67e9a1d87ced7416895090c99a765b3a00e66a62f49e40 Sep 12 23:08:08.975049 systemd-resolved[260]: Positive Trust Anchors: Sep 12 23:08:08.975068 systemd-resolved[260]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 23:08:08.975106 systemd-resolved[260]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 23:08:08.978166 systemd-resolved[260]: Defaulting to hostname 'linux'. Sep 12 23:08:08.979683 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 23:08:08.988476 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 23:08:09.096486 kernel: SCSI subsystem initialized Sep 12 23:08:09.111469 kernel: Loading iSCSI transport class v2.0-870. Sep 12 23:08:09.126467 kernel: iscsi: registered transport (tcp) Sep 12 23:08:09.149463 kernel: iscsi: registered transport (qla4xxx) Sep 12 23:08:09.149546 kernel: QLogic iSCSI HBA Driver Sep 12 23:08:09.173322 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
Sep 12 23:08:09.202316 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 23:08:09.206194 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 23:08:09.268233 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 23:08:09.270079 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 12 23:08:09.352476 kernel: raid6: avx2x4 gen() 28062 MB/s Sep 12 23:08:09.407475 kernel: raid6: avx2x2 gen() 29964 MB/s Sep 12 23:08:09.429468 kernel: raid6: avx2x1 gen() 25142 MB/s Sep 12 23:08:09.429559 kernel: raid6: using algorithm avx2x2 gen() 29964 MB/s Sep 12 23:08:09.446573 kernel: raid6: .... xor() 19249 MB/s, rmw enabled Sep 12 23:08:09.446683 kernel: raid6: using avx2x2 recovery algorithm Sep 12 23:08:09.470461 kernel: xor: automatically using best checksumming function avx Sep 12 23:08:09.640463 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 23:08:09.648991 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 23:08:09.652075 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 23:08:09.694880 systemd-udevd[472]: Using default interface naming scheme 'v255'. Sep 12 23:08:09.702622 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 23:08:09.703865 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 23:08:09.724736 dracut-pre-trigger[474]: rd.md=0: removing MD RAID activation Sep 12 23:08:09.758519 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 23:08:09.761471 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 23:08:09.849732 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 23:08:09.853588 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Sep 12 23:08:09.888456 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Sep 12 23:08:09.891905 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 12 23:08:09.895912 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 12 23:08:09.895945 kernel: GPT:9289727 != 19775487 Sep 12 23:08:09.895958 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 12 23:08:09.895968 kernel: GPT:9289727 != 19775487 Sep 12 23:08:09.897618 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 12 23:08:09.897647 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 23:08:09.912269 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 12 23:08:09.923439 kernel: libata version 3.00 loaded. Sep 12 23:08:09.925447 kernel: cryptd: max_cpu_qlen set to 1000 Sep 12 23:08:09.930442 kernel: ahci 0000:00:1f.2: version 3.0 Sep 12 23:08:09.934520 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 12 23:08:09.936937 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 12 23:08:09.937126 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 12 23:08:09.937271 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 12 23:08:09.938510 kernel: AES CTR mode by8 optimization enabled Sep 12 23:08:09.940401 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 23:08:09.940615 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 23:08:09.944337 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 23:08:09.956668 kernel: scsi host0: ahci Sep 12 23:08:09.956960 kernel: scsi host1: ahci Sep 12 23:08:09.957173 kernel: scsi host2: ahci Sep 12 23:08:09.957403 kernel: scsi host3: ahci Sep 12 23:08:09.957676 kernel: scsi host4: ahci Sep 12 23:08:09.952507 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 12 23:08:09.954983 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 23:08:09.967165 kernel: scsi host5: ahci Sep 12 23:08:09.967405 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 1 Sep 12 23:08:09.967484 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 1 Sep 12 23:08:09.967501 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 1 Sep 12 23:08:09.967521 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 1 Sep 12 23:08:09.967536 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 1 Sep 12 23:08:09.967549 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 1 Sep 12 23:08:09.999035 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 23:08:10.023487 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 12 23:08:10.046090 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 12 23:08:10.057539 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 12 23:08:10.059257 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 12 23:08:10.074500 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 12 23:08:10.077551 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 23:08:10.119857 disk-uuid[634]: Primary Header is updated. Sep 12 23:08:10.119857 disk-uuid[634]: Secondary Entries is updated. Sep 12 23:08:10.119857 disk-uuid[634]: Secondary Header is updated. 
Sep 12 23:08:10.124460 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 23:08:10.129498 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 23:08:10.278467 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 12 23:08:10.278550 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 12 23:08:10.279483 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 12 23:08:10.280467 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 12 23:08:10.281463 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 12 23:08:10.282468 kernel: ata3.00: LPM support broken, forcing max_power Sep 12 23:08:10.282510 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 12 23:08:10.282985 kernel: ata3.00: applying bridge limits Sep 12 23:08:10.284468 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 12 23:08:10.285700 kernel: ata3.00: LPM support broken, forcing max_power Sep 12 23:08:10.285730 kernel: ata3.00: configured for UDMA/100 Sep 12 23:08:10.286461 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 12 23:08:10.353485 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 12 23:08:10.353868 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 12 23:08:10.379482 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 12 23:08:10.699275 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 23:08:10.701235 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 23:08:10.707998 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 23:08:10.708729 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 23:08:10.710406 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 23:08:10.751023 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Sep 12 23:08:11.132467 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 23:08:11.132872 disk-uuid[635]: The operation has completed successfully. Sep 12 23:08:11.171236 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 23:08:11.171368 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 23:08:11.208027 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 23:08:11.235644 sh[663]: Success Sep 12 23:08:11.255673 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 23:08:11.255764 kernel: device-mapper: uevent: version 1.0.3 Sep 12 23:08:11.257469 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 23:08:11.267452 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 12 23:08:11.302940 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 23:08:11.306900 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 23:08:11.323284 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 23:08:11.328527 kernel: BTRFS: device fsid 5d2ab445-1154-4e47-9d7e-ff4b81d84474 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (675) Sep 12 23:08:11.330654 kernel: BTRFS info (device dm-0): first mount of filesystem 5d2ab445-1154-4e47-9d7e-ff4b81d84474 Sep 12 23:08:11.330730 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 12 23:08:11.335500 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 23:08:11.335529 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 23:08:11.336996 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 23:08:11.339412 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. 
Sep 12 23:08:11.341904 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 23:08:11.345127 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 12 23:08:11.347923 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 23:08:11.372455 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (708) Sep 12 23:08:11.374848 kernel: BTRFS info (device vda6): first mount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 23:08:11.374881 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 23:08:11.379442 kernel: BTRFS info (device vda6): turning on async discard Sep 12 23:08:11.379484 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 23:08:11.385451 kernel: BTRFS info (device vda6): last unmount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 23:08:11.385974 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 23:08:11.390463 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Sep 12 23:08:11.489534 ignition[751]: Ignition 2.22.0 Sep 12 23:08:11.490703 ignition[751]: Stage: fetch-offline Sep 12 23:08:11.490771 ignition[751]: no configs at "/usr/lib/ignition/base.d" Sep 12 23:08:11.490783 ignition[751]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 23:08:11.490883 ignition[751]: parsed url from cmdline: "" Sep 12 23:08:11.490890 ignition[751]: no config URL provided Sep 12 23:08:11.490896 ignition[751]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 23:08:11.490905 ignition[751]: no config at "/usr/lib/ignition/user.ign" Sep 12 23:08:11.490932 ignition[751]: op(1): [started] loading QEMU firmware config module Sep 12 23:08:11.490939 ignition[751]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 12 23:08:11.501986 ignition[751]: op(1): [finished] loading QEMU firmware config module Sep 12 23:08:11.502032 ignition[751]: QEMU firmware config was not found. Ignoring... Sep 12 23:08:11.515865 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 23:08:11.520203 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 23:08:11.548192 ignition[751]: parsing config with SHA512: 2013a91b0df880b52b13188bd343dfbe205d8f8825e9a7ec4c2e536c7d88b5b1fb55965fc92189020aafa82032ff24b6ed47b3a60bdee56c3ead339cbec7d998 Sep 12 23:08:11.554457 unknown[751]: fetched base config from "system" Sep 12 23:08:11.554470 unknown[751]: fetched user config from "qemu" Sep 12 23:08:11.554840 ignition[751]: fetch-offline: fetch-offline passed Sep 12 23:08:11.554898 ignition[751]: Ignition finished successfully Sep 12 23:08:11.561166 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Sep 12 23:08:11.583617 systemd-networkd[853]: lo: Link UP Sep 12 23:08:11.583628 systemd-networkd[853]: lo: Gained carrier Sep 12 23:08:11.585478 systemd-networkd[853]: Enumeration completed Sep 12 23:08:11.585947 systemd-networkd[853]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:08:11.585951 systemd-networkd[853]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 23:08:11.586275 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 23:08:11.587239 systemd-networkd[853]: eth0: Link UP Sep 12 23:08:11.587497 systemd-networkd[853]: eth0: Gained carrier Sep 12 23:08:11.587508 systemd-networkd[853]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:08:11.595252 systemd[1]: Reached target network.target - Network. Sep 12 23:08:11.596228 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 12 23:08:11.599835 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 23:08:11.617551 systemd-networkd[853]: eth0: DHCPv4 address 10.0.0.150/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 12 23:08:11.643358 ignition[857]: Ignition 2.22.0 Sep 12 23:08:11.643378 ignition[857]: Stage: kargs Sep 12 23:08:11.643543 ignition[857]: no configs at "/usr/lib/ignition/base.d" Sep 12 23:08:11.643554 ignition[857]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 23:08:11.644446 ignition[857]: kargs: kargs passed Sep 12 23:08:11.644510 ignition[857]: Ignition finished successfully Sep 12 23:08:11.650620 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 23:08:11.653491 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Sep 12 23:08:11.727646 ignition[866]: Ignition 2.22.0 Sep 12 23:08:11.727666 ignition[866]: Stage: disks Sep 12 23:08:11.727822 ignition[866]: no configs at "/usr/lib/ignition/base.d" Sep 12 23:08:11.727833 ignition[866]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 23:08:11.737055 ignition[866]: disks: disks passed Sep 12 23:08:11.737132 ignition[866]: Ignition finished successfully Sep 12 23:08:11.741740 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 23:08:11.744550 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 23:08:11.747062 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 23:08:11.748538 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 23:08:11.748615 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 23:08:11.752100 systemd[1]: Reached target basic.target - Basic System. Sep 12 23:08:11.753545 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 23:08:11.783441 systemd-fsck[876]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 12 23:08:12.261223 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 23:08:12.264683 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 23:08:12.425481 kernel: EXT4-fs (vda9): mounted filesystem d027afc5-396a-49bf-a5be-60ddd42cb089 r/w with ordered data mode. Quota mode: none. Sep 12 23:08:12.426879 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 23:08:12.427732 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 23:08:12.429373 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 23:08:12.431928 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 23:08:12.433454 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. 
Sep 12 23:08:12.433511 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 23:08:12.433543 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 23:08:12.450235 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 23:08:12.452131 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 23:08:12.456246 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (884) Sep 12 23:08:12.456279 kernel: BTRFS info (device vda6): first mount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 23:08:12.456291 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 23:08:12.460737 kernel: BTRFS info (device vda6): turning on async discard Sep 12 23:08:12.460807 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 23:08:12.462554 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 23:08:12.501196 initrd-setup-root[908]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 23:08:12.505868 initrd-setup-root[915]: cut: /sysroot/etc/group: No such file or directory Sep 12 23:08:12.512969 initrd-setup-root[922]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 23:08:12.518973 initrd-setup-root[929]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 23:08:12.623633 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 23:08:12.625506 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 23:08:12.627684 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 23:08:12.652255 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 23:08:12.653409 kernel: BTRFS info (device vda6): last unmount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 23:08:12.667857 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Sep 12 23:08:12.704228 ignition[998]: INFO : Ignition 2.22.0 Sep 12 23:08:12.704228 ignition[998]: INFO : Stage: mount Sep 12 23:08:12.706320 ignition[998]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 23:08:12.706320 ignition[998]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 23:08:12.706320 ignition[998]: INFO : mount: mount passed Sep 12 23:08:12.706320 ignition[998]: INFO : Ignition finished successfully Sep 12 23:08:12.708290 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 23:08:12.710753 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 23:08:13.429042 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 23:08:13.449169 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1010) Sep 12 23:08:13.449244 kernel: BTRFS info (device vda6): first mount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 23:08:13.449261 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 23:08:13.453517 kernel: BTRFS info (device vda6): turning on async discard Sep 12 23:08:13.453589 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 23:08:13.455599 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 12 23:08:13.492703 ignition[1027]: INFO : Ignition 2.22.0
Sep 12 23:08:13.492703 ignition[1027]: INFO : Stage: files
Sep 12 23:08:13.494787 ignition[1027]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 23:08:13.494787 ignition[1027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 12 23:08:13.494787 ignition[1027]: DEBUG : files: compiled without relabeling support, skipping
Sep 12 23:08:13.498709 ignition[1027]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 12 23:08:13.498709 ignition[1027]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 12 23:08:13.502056 ignition[1027]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 12 23:08:13.502056 ignition[1027]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 12 23:08:13.502056 ignition[1027]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 12 23:08:13.500242 unknown[1027]: wrote ssh authorized keys file for user: core
Sep 12 23:08:13.507854 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Sep 12 23:08:13.507854 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Sep 12 23:08:13.574204 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 12 23:08:13.607822 systemd-networkd[853]: eth0: Gained IPv6LL
Sep 12 23:08:13.830450 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Sep 12 23:08:13.833180 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 12 23:08:13.833180 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 12 23:08:13.833180 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 23:08:13.833180 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 23:08:13.833180 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 23:08:13.833180 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 23:08:13.833180 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 23:08:13.833180 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 23:08:13.853947 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 23:08:13.853947 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 23:08:13.853947 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 12 23:08:13.853947 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 12 23:08:13.853947 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 12 23:08:13.853947 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1
Sep 12 23:08:14.291253 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 12 23:08:15.148455 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Sep 12 23:08:15.148455 ignition[1027]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 12 23:08:15.152670 ignition[1027]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 23:08:15.156846 ignition[1027]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 23:08:15.156846 ignition[1027]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 12 23:08:15.156846 ignition[1027]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 12 23:08:15.161812 ignition[1027]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 12 23:08:15.161812 ignition[1027]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 12 23:08:15.161812 ignition[1027]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 12 23:08:15.161812 ignition[1027]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Sep 12 23:08:15.189072 ignition[1027]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 12 23:08:15.199613 ignition[1027]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 12 23:08:15.201650 ignition[1027]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 12 23:08:15.201650 ignition[1027]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Sep 12 23:08:15.201650 ignition[1027]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Sep 12 23:08:15.201650 ignition[1027]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 23:08:15.201650 ignition[1027]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 23:08:15.201650 ignition[1027]: INFO : files: files passed
Sep 12 23:08:15.201650 ignition[1027]: INFO : Ignition finished successfully
Sep 12 23:08:15.209999 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 12 23:08:15.213020 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 12 23:08:15.215454 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 12 23:08:15.233930 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 12 23:08:15.234120 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 12 23:08:15.237976 initrd-setup-root-after-ignition[1056]: grep: /sysroot/oem/oem-release: No such file or directory
Sep 12 23:08:15.239894 initrd-setup-root-after-ignition[1058]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 23:08:15.239894 initrd-setup-root-after-ignition[1058]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 23:08:15.246043 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 23:08:15.241053 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 23:08:15.243560 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 12 23:08:15.247199 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 12 23:08:15.317198 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 12 23:08:15.318528 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 12 23:08:15.322514 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 12 23:08:15.325060 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 12 23:08:15.325198 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 12 23:08:15.327715 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 12 23:08:15.368769 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 23:08:15.370814 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 12 23:08:15.445471 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 12 23:08:15.447863 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 23:08:15.448093 systemd[1]: Stopped target timers.target - Timer Units.
Sep 12 23:08:15.451092 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 12 23:08:15.451289 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 23:08:15.455168 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 12 23:08:15.455348 systemd[1]: Stopped target basic.target - Basic System.
Sep 12 23:08:15.455894 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 12 23:08:15.456219 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 23:08:15.456716 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 12 23:08:15.464439 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 12 23:08:15.466704 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 12 23:08:15.467911 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 23:08:15.468282 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 12 23:08:15.468830 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 12 23:08:15.469174 systemd[1]: Stopped target swap.target - Swaps.
Sep 12 23:08:15.469514 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 12 23:08:15.469727 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 23:08:15.470456 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 12 23:08:15.471011 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 23:08:15.471316 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 12 23:08:15.471487 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 23:08:15.486593 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 12 23:08:15.486804 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 12 23:08:15.489011 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 12 23:08:15.489166 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 23:08:15.492748 systemd[1]: Stopped target paths.target - Path Units.
Sep 12 23:08:15.493363 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 12 23:08:15.497583 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 23:08:15.499067 systemd[1]: Stopped target slices.target - Slice Units.
Sep 12 23:08:15.499401 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 12 23:08:15.500076 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 12 23:08:15.500181 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 23:08:15.507330 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 12 23:08:15.507470 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 23:08:15.509551 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 12 23:08:15.509687 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 23:08:15.519115 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 12 23:08:15.520458 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 12 23:08:15.524316 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 12 23:08:15.527865 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 12 23:08:15.530200 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 12 23:08:15.530458 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 23:08:15.534489 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 12 23:08:15.534783 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 23:08:15.544521 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 12 23:08:15.544717 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 12 23:08:15.564177 ignition[1082]: INFO : Ignition 2.22.0
Sep 12 23:08:15.564177 ignition[1082]: INFO : Stage: umount
Sep 12 23:08:15.566570 ignition[1082]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 23:08:15.566570 ignition[1082]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 12 23:08:15.569125 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 12 23:08:15.594442 ignition[1082]: INFO : umount: umount passed
Sep 12 23:08:15.595637 ignition[1082]: INFO : Ignition finished successfully
Sep 12 23:08:15.598682 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 12 23:08:15.598896 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 12 23:08:15.600519 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 12 23:08:15.600676 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 12 23:08:15.606099 systemd[1]: Stopped target network.target - Network.
Sep 12 23:08:15.606235 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 12 23:08:15.606324 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 12 23:08:15.608377 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 12 23:08:15.608466 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 12 23:08:15.612003 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 12 23:08:15.612061 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 12 23:08:15.615790 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 12 23:08:15.615886 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 12 23:08:15.616928 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 12 23:08:15.616986 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 12 23:08:15.617499 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 12 23:08:15.618040 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 12 23:08:15.634414 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 12 23:08:15.634623 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 12 23:08:15.639444 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 12 23:08:15.639720 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 12 23:08:15.639844 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 12 23:08:15.644719 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 12 23:08:15.645363 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 12 23:08:15.648977 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 12 23:08:15.649051 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 23:08:15.652957 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 12 23:08:15.653025 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 12 23:08:15.653076 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 23:08:15.655330 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 12 23:08:15.655379 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 12 23:08:15.661250 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 12 23:08:15.662294 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 12 23:08:15.663312 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 12 23:08:15.663361 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 23:08:15.667352 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 23:08:15.669795 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 12 23:08:15.669878 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 12 23:08:15.681269 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 12 23:08:15.681517 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 23:08:15.682785 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 12 23:08:15.682846 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 12 23:08:15.684907 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 12 23:08:15.684960 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 23:08:15.685243 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 12 23:08:15.685307 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 23:08:15.686202 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 12 23:08:15.686263 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 12 23:08:15.687088 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 12 23:08:15.687151 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 23:08:15.707229 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 12 23:08:15.708333 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 12 23:08:15.708401 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 23:08:15.713381 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 12 23:08:15.713502 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 23:08:15.716950 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 12 23:08:15.717008 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 23:08:15.720938 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 12 23:08:15.720999 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 23:08:15.723325 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 23:08:15.723389 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 23:08:15.727267 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 12 23:08:15.727344 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Sep 12 23:08:15.727392 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 12 23:08:15.727488 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 12 23:08:15.727923 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 12 23:08:15.734484 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 12 23:08:15.741111 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 12 23:08:15.741238 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 12 23:08:15.742531 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 12 23:08:15.746338 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 12 23:08:15.771060 systemd[1]: Switching root.
Sep 12 23:08:15.819938 systemd-journald[220]: Journal stopped
Sep 12 23:08:17.272593 systemd-journald[220]: Received SIGTERM from PID 1 (systemd).
Sep 12 23:08:17.272679 kernel: SELinux: policy capability network_peer_controls=1
Sep 12 23:08:17.272703 kernel: SELinux: policy capability open_perms=1
Sep 12 23:08:17.272743 kernel: SELinux: policy capability extended_socket_class=1
Sep 12 23:08:17.272765 kernel: SELinux: policy capability always_check_network=0
Sep 12 23:08:17.272780 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 12 23:08:17.272796 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 12 23:08:17.272811 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 12 23:08:17.272827 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 12 23:08:17.272842 kernel: SELinux: policy capability userspace_initial_context=0
Sep 12 23:08:17.272858 kernel: audit: type=1403 audit(1757718496.393:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 12 23:08:17.272885 systemd[1]: Successfully loaded SELinux policy in 65.195ms.
Sep 12 23:08:17.272916 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.065ms.
Sep 12 23:08:17.272934 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 23:08:17.272957 systemd[1]: Detected virtualization kvm.
Sep 12 23:08:17.272973 systemd[1]: Detected architecture x86-64.
Sep 12 23:08:17.272990 systemd[1]: Detected first boot.
Sep 12 23:08:17.273006 systemd[1]: Initializing machine ID from VM UUID.
Sep 12 23:08:17.273023 zram_generator::config[1128]: No configuration found.
Sep 12 23:08:17.273040 kernel: Guest personality initialized and is inactive
Sep 12 23:08:17.273067 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Sep 12 23:08:17.273083 kernel: Initialized host personality
Sep 12 23:08:17.273098 kernel: NET: Registered PF_VSOCK protocol family
Sep 12 23:08:17.273115 systemd[1]: Populated /etc with preset unit settings.
Sep 12 23:08:17.273132 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 12 23:08:17.273149 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 12 23:08:17.273166 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 12 23:08:17.273183 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 12 23:08:17.273199 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 12 23:08:17.273227 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 12 23:08:17.273245 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 12 23:08:17.273262 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 12 23:08:17.273279 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 12 23:08:17.273295 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 12 23:08:17.273312 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 12 23:08:17.273328 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 12 23:08:17.273345 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 23:08:17.273372 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 23:08:17.273389 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 12 23:08:17.273405 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 12 23:08:17.275158 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 12 23:08:17.275185 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 23:08:17.275202 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 12 23:08:17.275219 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 23:08:17.275235 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 23:08:17.275269 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 12 23:08:17.275286 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 12 23:08:17.275303 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 12 23:08:17.275320 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 12 23:08:17.275705 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 23:08:17.275726 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 23:08:17.275743 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 23:08:17.275759 systemd[1]: Reached target swap.target - Swaps.
Sep 12 23:08:17.275775 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 12 23:08:17.275801 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 12 23:08:17.275818 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 12 23:08:17.275834 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 23:08:17.275850 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 23:08:17.275866 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 23:08:17.276093 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 12 23:08:17.276115 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 12 23:08:17.276133 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 12 23:08:17.276150 systemd[1]: Mounting media.mount - External Media Directory...
Sep 12 23:08:17.276177 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 23:08:17.276195 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 12 23:08:17.276211 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 12 23:08:17.276228 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 12 23:08:17.276246 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 12 23:08:17.276263 systemd[1]: Reached target machines.target - Containers.
Sep 12 23:08:17.278914 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 12 23:08:17.278941 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 23:08:17.278986 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 23:08:17.279000 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 12 23:08:17.279013 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 23:08:17.279027 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 23:08:17.279040 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 23:08:17.279053 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 12 23:08:17.279065 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 23:08:17.279079 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 12 23:08:17.279092 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 12 23:08:17.279113 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 12 23:08:17.279126 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 12 23:08:17.279138 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 12 23:08:17.279153 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 23:08:17.279165 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 23:08:17.279178 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 23:08:17.279191 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 23:08:17.279218 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 12 23:08:17.279247 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 12 23:08:17.280525 kernel: loop: module loaded
Sep 12 23:08:17.280553 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 23:08:17.280583 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 12 23:08:17.280599 systemd[1]: Stopped verity-setup.service.
Sep 12 23:08:17.280646 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 23:08:17.280670 kernel: fuse: init (API version 7.41)
Sep 12 23:08:17.280686 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 12 23:08:17.280702 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 12 23:08:17.280718 systemd[1]: Mounted media.mount - External Media Directory.
Sep 12 23:08:17.280734 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 12 23:08:17.280757 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 12 23:08:17.280802 systemd-journald[1192]: Collecting audit messages is disabled.
Sep 12 23:08:17.280830 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 12 23:08:17.280843 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 23:08:17.280856 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 12 23:08:17.280869 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 12 23:08:17.280882 systemd-journald[1192]: Journal started
Sep 12 23:08:17.280915 systemd-journald[1192]: Runtime Journal (/run/log/journal/72dee40115664f57bb7dbbef2b70c47c) is 6M, max 48.2M, 42.2M free.
Sep 12 23:08:16.997989 systemd[1]: Queued start job for default target multi-user.target.
Sep 12 23:08:17.017963 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 12 23:08:17.018554 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 12 23:08:17.283636 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 23:08:17.284871 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 23:08:17.285151 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 23:08:17.286814 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 23:08:17.287041 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 23:08:17.288575 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 12 23:08:17.288800 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 12 23:08:17.290206 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 23:08:17.290441 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 23:08:17.291873 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 23:08:17.293306 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 23:08:17.294930 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 12 23:08:17.296521 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 12 23:08:17.300748 kernel: ACPI: bus type drm_connector registered
Sep 12 23:08:17.301813 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 23:08:17.302064 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 23:08:17.314259 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 23:08:17.316945 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 12 23:08:17.319146 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 12 23:08:17.320454 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 12 23:08:17.320583 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 23:08:17.322634 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 12 23:08:17.330539 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 12 23:08:17.331797 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 23:08:17.334581 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 23:08:17.336876 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 12 23:08:17.338080 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 23:08:17.339692 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 23:08:17.340813 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 23:08:17.341834 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 23:08:17.345645 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 23:08:17.381115 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 23:08:17.384750 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 23:08:17.386300 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 23:08:17.387648 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 12 23:08:17.397042 systemd-journald[1192]: Time spent on flushing to /var/log/journal/72dee40115664f57bb7dbbef2b70c47c is 26.763ms for 1046 entries. Sep 12 23:08:17.397042 systemd-journald[1192]: System Journal (/var/log/journal/72dee40115664f57bb7dbbef2b70c47c) is 8M, max 195.6M, 187.6M free. Sep 12 23:08:18.011894 systemd-journald[1192]: Received client request to flush runtime journal. 
Sep 12 23:08:18.012724 kernel: loop0: detected capacity change from 0 to 229808 Sep 12 23:08:18.012772 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 23:08:18.012795 kernel: loop1: detected capacity change from 0 to 128016 Sep 12 23:08:18.012820 kernel: loop2: detected capacity change from 0 to 110984 Sep 12 23:08:18.012848 kernel: loop3: detected capacity change from 0 to 229808 Sep 12 23:08:18.012892 kernel: loop4: detected capacity change from 0 to 128016 Sep 12 23:08:18.012921 kernel: loop5: detected capacity change from 0 to 110984 Sep 12 23:08:17.499691 systemd-tmpfiles[1233]: ACLs are not supported, ignoring. Sep 12 23:08:17.499705 systemd-tmpfiles[1233]: ACLs are not supported, ignoring. Sep 12 23:08:17.501901 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 23:08:17.505729 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 23:08:17.798006 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 23:08:17.808708 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 23:08:17.813648 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 23:08:17.964236 (sd-merge)[1254]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Sep 12 23:08:17.964861 (sd-merge)[1254]: Merged extensions into '/usr'. Sep 12 23:08:17.969767 systemd[1]: Reload requested from client PID 1229 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 23:08:17.969780 systemd[1]: Reloading... Sep 12 23:08:18.051559 zram_generator::config[1295]: No configuration found. Sep 12 23:08:18.151740 ldconfig[1220]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 23:08:18.251075 systemd[1]: Reloading finished in 280 ms. Sep 12 23:08:18.282029 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. 
Sep 12 23:08:18.302180 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 23:08:18.304081 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 23:08:18.305835 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 23:08:18.307525 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 23:08:18.309370 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 12 23:08:18.318796 systemd[1]: Starting ensure-sysext.service... Sep 12 23:08:18.321197 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 23:08:18.344150 systemd[1]: Reload requested from client PID 1332 ('systemctl') (unit ensure-sysext.service)... Sep 12 23:08:18.344168 systemd[1]: Reloading... Sep 12 23:08:18.394454 zram_generator::config[1358]: No configuration found. Sep 12 23:08:18.607964 systemd[1]: Reloading finished in 263 ms. Sep 12 23:08:18.629939 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 23:08:18.658295 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 23:08:18.669287 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 23:08:18.671818 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 23:08:18.675387 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 23:08:18.675727 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 23:08:18.694777 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 23:08:18.697596 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... 
Sep 12 23:08:18.701717 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 23:08:18.703040 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 23:08:18.703163 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 23:08:18.703266 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 23:08:18.709806 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 23:08:18.710392 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 23:08:18.711997 systemd-tmpfiles[1400]: ACLs are not supported, ignoring. Sep 12 23:08:18.712015 systemd-tmpfiles[1400]: ACLs are not supported, ignoring. Sep 12 23:08:18.713650 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 23:08:18.713970 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 23:08:18.714005 systemd-tmpfiles[1401]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 23:08:18.714284 systemd-tmpfiles[1401]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 23:08:18.714594 systemd-tmpfiles[1401]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 23:08:18.714873 systemd-tmpfiles[1401]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 23:08:18.716006 systemd-tmpfiles[1401]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. 
Sep 12 23:08:18.716300 systemd-tmpfiles[1401]: ACLs are not supported, ignoring. Sep 12 23:08:18.716378 systemd-tmpfiles[1401]: ACLs are not supported, ignoring. Sep 12 23:08:18.717354 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 23:08:18.719328 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 23:08:18.719648 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 23:08:18.721236 systemd-tmpfiles[1401]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 23:08:18.721250 systemd-tmpfiles[1401]: Skipping /boot Sep 12 23:08:18.728072 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 23:08:18.728353 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 23:08:18.729833 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 23:08:18.731683 systemd-tmpfiles[1401]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 23:08:18.731699 systemd-tmpfiles[1401]: Skipping /boot Sep 12 23:08:18.732303 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 23:08:18.735686 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 23:08:18.741559 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 23:08:18.742967 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 23:08:18.743133 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
Sep 12 23:08:18.745046 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 23:08:18.746498 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 23:08:18.749238 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 23:08:18.749711 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 23:08:18.752900 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 23:08:18.753153 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 23:08:18.754936 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 23:08:18.756928 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 23:08:18.757155 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 23:08:18.759018 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 23:08:18.759269 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 23:08:18.765166 systemd[1]: Finished ensure-sysext.service. Sep 12 23:08:18.772548 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 23:08:18.775310 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 23:08:18.778105 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 23:08:18.779609 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 23:08:18.779711 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 23:08:18.782382 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Sep 12 23:08:18.784793 systemd-udevd[1415]: Using default interface naming scheme 'v255'. Sep 12 23:08:18.794580 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 12 23:08:18.798611 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 23:08:18.805755 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 23:08:18.823703 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 23:08:18.825733 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 23:08:18.827805 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 23:08:18.836575 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 23:08:18.843556 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 23:08:18.877646 augenrules[1473]: No rules Sep 12 23:08:18.878878 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 23:08:18.880255 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 23:08:18.881644 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 23:08:18.881910 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 23:08:18.892657 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 23:08:18.895152 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 23:08:18.974072 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 12 23:08:19.032558 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
Sep 12 23:08:19.036921 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 23:08:19.044389 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 12 23:08:19.044463 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 23:08:19.043308 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 12 23:08:19.044614 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 23:08:19.050455 kernel: ACPI: button: Power Button [PWRF] Sep 12 23:08:19.065157 systemd-resolved[1425]: Positive Trust Anchors: Sep 12 23:08:19.065185 systemd-resolved[1425]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 23:08:19.065216 systemd-resolved[1425]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 23:08:19.069066 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 23:08:19.070825 systemd-resolved[1425]: Defaulting to hostname 'linux'. Sep 12 23:08:19.071210 systemd-networkd[1450]: lo: Link UP Sep 12 23:08:19.071503 systemd-networkd[1450]: lo: Gained carrier Sep 12 23:08:19.073614 systemd-networkd[1450]: Enumeration completed Sep 12 23:08:19.074125 systemd-networkd[1450]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Sep 12 23:08:19.074213 systemd-networkd[1450]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 23:08:19.075001 systemd-networkd[1450]: eth0: Link UP Sep 12 23:08:19.075257 systemd-networkd[1450]: eth0: Gained carrier Sep 12 23:08:19.075353 systemd-networkd[1450]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:08:19.076458 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 23:08:19.077735 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 23:08:19.079058 systemd[1]: Reached target network.target - Network. Sep 12 23:08:19.080014 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 23:08:19.081548 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 23:08:19.084595 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Sep 12 23:08:19.084904 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 12 23:08:19.085093 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 12 23:08:19.085786 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 23:08:19.087103 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 23:08:19.088355 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 12 23:08:19.089730 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 23:08:19.090905 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 23:08:19.092177 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 23:08:19.093411 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). 
Sep 12 23:08:19.093463 systemd[1]: Reached target paths.target - Path Units. Sep 12 23:08:19.094368 systemd[1]: Reached target timers.target - Timer Units. Sep 12 23:08:19.096494 systemd-networkd[1450]: eth0: DHCPv4 address 10.0.0.150/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 12 23:08:19.096597 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 23:08:19.100116 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 23:08:19.103371 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 12 23:08:19.103489 systemd-timesyncd[1426]: Network configuration changed, trying to establish connection. Sep 12 23:08:20.120507 systemd-timesyncd[1426]: Contacted time server 10.0.0.1:123 (10.0.0.1). Sep 12 23:08:20.120553 systemd-timesyncd[1426]: Initial clock synchronization to Fri 2025-09-12 23:08:20.120424 UTC. Sep 12 23:08:20.121606 systemd-resolved[1425]: Clock change detected. Flushing caches. Sep 12 23:08:20.122626 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 12 23:08:20.123928 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 12 23:08:20.138639 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 23:08:20.140211 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 23:08:20.143133 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 12 23:08:20.145716 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 23:08:20.147676 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 23:08:20.150522 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 23:08:20.151541 systemd[1]: Reached target basic.target - Basic System. 
Sep 12 23:08:20.152551 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 23:08:20.152577 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 23:08:20.153708 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 23:08:20.162330 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 23:08:20.169833 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 23:08:20.173854 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 23:08:20.178880 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 23:08:20.181057 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 23:08:20.182603 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 12 23:08:20.184179 jq[1529]: false Sep 12 23:08:20.186965 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 23:08:20.194815 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 23:08:20.275102 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 23:08:20.277912 extend-filesystems[1531]: Found /dev/vda6 Sep 12 23:08:20.278614 google_oslogin_nss_cache[1533]: oslogin_cache_refresh[1533]: Refreshing passwd entry cache Sep 12 23:08:20.278877 oslogin_cache_refresh[1533]: Refreshing passwd entry cache Sep 12 23:08:20.283215 extend-filesystems[1531]: Found /dev/vda9 Sep 12 23:08:20.284894 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
Sep 12 23:08:20.290841 extend-filesystems[1531]: Checking size of /dev/vda9 Sep 12 23:08:20.290835 oslogin_cache_refresh[1533]: Failure getting users, quitting Sep 12 23:08:20.292022 google_oslogin_nss_cache[1533]: oslogin_cache_refresh[1533]: Failure getting users, quitting Sep 12 23:08:20.292022 google_oslogin_nss_cache[1533]: oslogin_cache_refresh[1533]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 23:08:20.292022 google_oslogin_nss_cache[1533]: oslogin_cache_refresh[1533]: Refreshing group entry cache Sep 12 23:08:20.290858 oslogin_cache_refresh[1533]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 23:08:20.290910 oslogin_cache_refresh[1533]: Refreshing group entry cache Sep 12 23:08:20.293090 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 23:08:20.295597 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 23:08:20.296381 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 23:08:20.300390 google_oslogin_nss_cache[1533]: oslogin_cache_refresh[1533]: Failure getting groups, quitting Sep 12 23:08:20.300390 google_oslogin_nss_cache[1533]: oslogin_cache_refresh[1533]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 23:08:20.300380 oslogin_cache_refresh[1533]: Failure getting groups, quitting Sep 12 23:08:20.300396 oslogin_cache_refresh[1533]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 23:08:20.319293 extend-filesystems[1531]: Resized partition /dev/vda9 Sep 12 23:08:20.328881 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 23:08:20.363572 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... 
Sep 12 23:08:20.372487 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 23:08:20.374407 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 23:08:20.374690 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 23:08:20.375109 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 12 23:08:20.375374 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 12 23:08:20.377034 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 23:08:20.377324 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 23:08:20.381081 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 23:08:20.381769 jq[1561]: true Sep 12 23:08:20.381963 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 23:08:20.397063 (ntainerd)[1565]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 23:08:20.401258 jq[1564]: true Sep 12 23:08:20.417802 extend-filesystems[1578]: resize2fs 1.47.3 (8-Jul-2025) Sep 12 23:08:20.421932 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 12 23:08:20.437558 kernel: kvm_amd: TSC scaling supported Sep 12 23:08:20.437635 kernel: kvm_amd: Nested Virtualization enabled Sep 12 23:08:20.437704 kernel: kvm_amd: Nested Paging enabled Sep 12 23:08:20.437730 kernel: kvm_amd: LBR virtualization supported Sep 12 23:08:20.437827 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported Sep 12 23:08:20.437862 kernel: kvm_amd: Virtual GIF supported Sep 12 23:08:20.500876 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 12 23:08:20.507303 update_engine[1550]: I20250912 23:08:20.506929 1550 main.cc:92] Flatcar Update Engine starting Sep 12 23:08:20.507019 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 23:08:20.512774 systemd-logind[1546]: Watching system buttons on /dev/input/event2 (Power Button) Sep 12 23:08:20.512806 systemd-logind[1546]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 12 23:08:20.514160 systemd-logind[1546]: New seat seat0. Sep 12 23:08:20.515230 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 23:08:20.575936 tar[1563]: linux-amd64/LICENSE Sep 12 23:08:20.588770 tar[1563]: linux-amd64/helm Sep 12 23:08:20.577815 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 23:08:20.577517 dbus-daemon[1526]: [system] SELinux support is enabled Sep 12 23:08:20.580562 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 23:08:20.580597 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 23:08:20.581172 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). 
Sep 12 23:08:20.581198 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 23:08:20.591763 update_engine[1550]: I20250912 23:08:20.591688 1550 update_check_scheduler.cc:74] Next update check in 9m20s Sep 12 23:08:20.592196 dbus-daemon[1526]: [system] Successfully activated service 'org.freedesktop.systemd1' Sep 12 23:08:20.592354 systemd[1]: Started update-engine.service - Update Engine. Sep 12 23:08:20.596970 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 23:08:20.599682 kernel: EDAC MC: Ver: 3.0.0 Sep 12 23:08:20.611396 sshd_keygen[1556]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 23:08:20.637748 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 23:08:20.644342 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 23:08:20.663541 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 23:08:20.663966 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 23:08:20.668177 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 23:08:20.673712 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 12 23:08:20.703090 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 23:08:20.705168 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 23:08:20.708798 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 12 23:08:20.711157 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 23:08:20.713002 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 12 23:08:20.714359 locksmithd[1599]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 23:08:20.781451 extend-filesystems[1578]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 12 23:08:20.781451 extend-filesystems[1578]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 12 23:08:20.781451 extend-filesystems[1578]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 12 23:08:20.786982 extend-filesystems[1531]: Resized filesystem in /dev/vda9 Sep 12 23:08:20.783133 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 23:08:20.783513 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 23:08:20.842499 bash[1594]: Updated "/home/core/.ssh/authorized_keys" Sep 12 23:08:20.843924 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 23:08:20.848621 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
Sep 12 23:08:20.874850 containerd[1565]: time="2025-09-12T23:08:20Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 23:08:20.875700 containerd[1565]: time="2025-09-12T23:08:20.875668740Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 23:08:20.889012 containerd[1565]: time="2025-09-12T23:08:20.888944096Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.229µs" Sep 12 23:08:20.889012 containerd[1565]: time="2025-09-12T23:08:20.888996484Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 23:08:20.889012 containerd[1565]: time="2025-09-12T23:08:20.889022082Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 23:08:20.889340 containerd[1565]: time="2025-09-12T23:08:20.889311385Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 23:08:20.889364 containerd[1565]: time="2025-09-12T23:08:20.889338917Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 23:08:20.889396 containerd[1565]: time="2025-09-12T23:08:20.889375154Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 23:08:20.889507 containerd[1565]: time="2025-09-12T23:08:20.889478398Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 23:08:20.889507 containerd[1565]: time="2025-09-12T23:08:20.889502143Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 
23:08:20.889975 containerd[1565]: time="2025-09-12T23:08:20.889933912Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 23:08:20.889975 containerd[1565]: time="2025-09-12T23:08:20.889960703Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 23:08:20.890020 containerd[1565]: time="2025-09-12T23:08:20.889977514Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 23:08:20.890020 containerd[1565]: time="2025-09-12T23:08:20.889991180Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 23:08:20.890158 containerd[1565]: time="2025-09-12T23:08:20.890124259Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 23:08:20.890526 containerd[1565]: time="2025-09-12T23:08:20.890485086Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 23:08:20.890557 containerd[1565]: time="2025-09-12T23:08:20.890535711Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 23:08:20.890557 containerd[1565]: time="2025-09-12T23:08:20.890551310Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 23:08:20.890617 containerd[1565]: time="2025-09-12T23:08:20.890598970Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 23:08:20.890921 
containerd[1565]: time="2025-09-12T23:08:20.890889665Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 23:08:20.891013 containerd[1565]: time="2025-09-12T23:08:20.890987638Z" level=info msg="metadata content store policy set" policy=shared Sep 12 23:08:20.896910 containerd[1565]: time="2025-09-12T23:08:20.896872164Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 23:08:20.896969 containerd[1565]: time="2025-09-12T23:08:20.896920284Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 23:08:20.896969 containerd[1565]: time="2025-09-12T23:08:20.896937186Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 23:08:20.897008 containerd[1565]: time="2025-09-12T23:08:20.896971580Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 23:08:20.897008 containerd[1565]: time="2025-09-12T23:08:20.896987150Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 23:08:20.897008 containerd[1565]: time="2025-09-12T23:08:20.896999663Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 12 23:08:20.897060 containerd[1565]: time="2025-09-12T23:08:20.897015894Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 23:08:20.897060 containerd[1565]: time="2025-09-12T23:08:20.897029800Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 23:08:20.897060 containerd[1565]: time="2025-09-12T23:08:20.897042063Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 23:08:20.897060 containerd[1565]: 
time="2025-09-12T23:08:20.897053234Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 23:08:20.897133 containerd[1565]: time="2025-09-12T23:08:20.897063823Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 23:08:20.897133 containerd[1565]: time="2025-09-12T23:08:20.897084763Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 23:08:20.897245 containerd[1565]: time="2025-09-12T23:08:20.897221309Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 23:08:20.897267 containerd[1565]: time="2025-09-12T23:08:20.897246406Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 23:08:20.897296 containerd[1565]: time="2025-09-12T23:08:20.897263458Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 23:08:20.897296 containerd[1565]: time="2025-09-12T23:08:20.897276082Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 23:08:20.897296 containerd[1565]: time="2025-09-12T23:08:20.897288775Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 23:08:20.897348 containerd[1565]: time="2025-09-12T23:08:20.897301720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 23:08:20.897348 containerd[1565]: time="2025-09-12T23:08:20.897315075Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 23:08:20.897348 containerd[1565]: time="2025-09-12T23:08:20.897326977Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 23:08:20.897348 containerd[1565]: time="2025-09-12T23:08:20.897339230Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 23:08:20.897426 containerd[1565]: time="2025-09-12T23:08:20.897351864Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 23:08:20.897426 containerd[1565]: time="2025-09-12T23:08:20.897363255Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 23:08:20.897462 containerd[1565]: time="2025-09-12T23:08:20.897438055Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 23:08:20.897462 containerd[1565]: time="2025-09-12T23:08:20.897452552Z" level=info msg="Start snapshots syncer" Sep 12 23:08:20.897506 containerd[1565]: time="2025-09-12T23:08:20.897479834Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 23:08:20.897840 containerd[1565]: time="2025-09-12T23:08:20.897785807Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 23:08:20.897946 containerd[1565]: time="2025-09-12T23:08:20.897856720Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 23:08:20.899862 containerd[1565]: time="2025-09-12T23:08:20.899832426Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 23:08:20.900006 containerd[1565]: time="2025-09-12T23:08:20.899971807Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 23:08:20.900006 containerd[1565]: time="2025-09-12T23:08:20.900001783Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 23:08:20.900115 containerd[1565]: time="2025-09-12T23:08:20.900014477Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 23:08:20.900115 containerd[1565]: time="2025-09-12T23:08:20.900027201Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 23:08:20.900115 containerd[1565]: time="2025-09-12T23:08:20.900040285Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 23:08:20.900115 containerd[1565]: time="2025-09-12T23:08:20.900056005Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 23:08:20.900115 containerd[1565]: time="2025-09-12T23:08:20.900069210Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 23:08:20.900115 containerd[1565]: time="2025-09-12T23:08:20.900109325Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 23:08:20.900285 containerd[1565]: time="2025-09-12T23:08:20.900123662Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 23:08:20.900285 containerd[1565]: time="2025-09-12T23:08:20.900136867Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 23:08:20.900285 containerd[1565]: time="2025-09-12T23:08:20.900166803Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 23:08:20.900285 containerd[1565]: time="2025-09-12T23:08:20.900183524Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 23:08:20.900285 containerd[1565]: time="2025-09-12T23:08:20.900194885Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 23:08:20.900285 containerd[1565]: time="2025-09-12T23:08:20.900206106Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 23:08:20.900285 containerd[1565]: time="2025-09-12T23:08:20.900224391Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 23:08:20.900285 containerd[1565]: time="2025-09-12T23:08:20.900241883Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 23:08:20.900285 containerd[1565]: time="2025-09-12T23:08:20.900255860Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 23:08:20.900285 containerd[1565]: time="2025-09-12T23:08:20.900275987Z" level=info msg="runtime interface created" Sep 12 23:08:20.900285 containerd[1565]: time="2025-09-12T23:08:20.900282520Z" level=info msg="created NRI interface" Sep 12 23:08:20.900574 containerd[1565]: time="2025-09-12T23:08:20.900293210Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 23:08:20.900574 containerd[1565]: time="2025-09-12T23:08:20.900307326Z" level=info msg="Connect containerd service" Sep 12 23:08:20.900574 containerd[1565]: time="2025-09-12T23:08:20.900339156Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 23:08:20.901325 
containerd[1565]: time="2025-09-12T23:08:20.901283727Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 23:08:20.969025 tar[1563]: linux-amd64/README.md Sep 12 23:08:20.983777 containerd[1565]: time="2025-09-12T23:08:20.983724243Z" level=info msg="Start subscribing containerd event" Sep 12 23:08:20.983881 containerd[1565]: time="2025-09-12T23:08:20.983796418Z" level=info msg="Start recovering state" Sep 12 23:08:20.984032 containerd[1565]: time="2025-09-12T23:08:20.983984821Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 23:08:20.984075 containerd[1565]: time="2025-09-12T23:08:20.983988348Z" level=info msg="Start event monitor" Sep 12 23:08:20.984075 containerd[1565]: time="2025-09-12T23:08:20.984069671Z" level=info msg="Start cni network conf syncer for default" Sep 12 23:08:20.984128 containerd[1565]: time="2025-09-12T23:08:20.984084268Z" level=info msg="Start streaming server" Sep 12 23:08:20.984128 containerd[1565]: time="2025-09-12T23:08:20.984098975Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 23:08:20.984128 containerd[1565]: time="2025-09-12T23:08:20.984108083Z" level=info msg="runtime interface starting up..." Sep 12 23:08:20.984128 containerd[1565]: time="2025-09-12T23:08:20.984116398Z" level=info msg="starting plugins..." Sep 12 23:08:20.984240 containerd[1565]: time="2025-09-12T23:08:20.984144190Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 23:08:20.984240 containerd[1565]: time="2025-09-12T23:08:20.984086963Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Sep 12 23:08:20.985166 containerd[1565]: time="2025-09-12T23:08:20.984736130Z" level=info msg="containerd successfully booted in 0.110633s" Sep 12 23:08:20.986480 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 23:08:20.991358 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 23:08:21.215892 systemd-networkd[1450]: eth0: Gained IPv6LL Sep 12 23:08:21.219514 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 23:08:21.221347 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 23:08:21.223970 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 12 23:08:21.226451 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:08:21.228864 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 23:08:21.263765 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 23:08:21.285761 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 12 23:08:21.286068 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 12 23:08:21.287802 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 23:08:22.032601 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:08:22.034674 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 23:08:22.036362 systemd[1]: Startup finished in 3.757s (kernel) + 7.737s (initrd) + 4.691s (userspace) = 16.186s. 
Sep 12 23:08:22.043212 (kubelet)[1676]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:08:22.494118 kubelet[1676]: E0912 23:08:22.494044 1676 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:08:22.498165 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:08:22.498367 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:08:22.498846 systemd[1]: kubelet.service: Consumed 1.053s CPU time, 268.6M memory peak. Sep 12 23:08:23.510069 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 23:08:23.511350 systemd[1]: Started sshd@0-10.0.0.150:22-10.0.0.1:38486.service - OpenSSH per-connection server daemon (10.0.0.1:38486). Sep 12 23:08:23.590192 sshd[1689]: Accepted publickey for core from 10.0.0.1 port 38486 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:08:23.591913 sshd-session[1689]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:08:23.598225 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 23:08:23.599298 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 23:08:23.606308 systemd-logind[1546]: New session 1 of user core. Sep 12 23:08:23.622492 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 23:08:23.625490 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Sep 12 23:08:23.649238 (systemd)[1694]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 23:08:23.651846 systemd-logind[1546]: New session c1 of user core. Sep 12 23:08:23.804984 systemd[1694]: Queued start job for default target default.target. Sep 12 23:08:23.821910 systemd[1694]: Created slice app.slice - User Application Slice. Sep 12 23:08:23.821934 systemd[1694]: Reached target paths.target - Paths. Sep 12 23:08:23.821976 systemd[1694]: Reached target timers.target - Timers. Sep 12 23:08:23.823580 systemd[1694]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 23:08:23.834879 systemd[1694]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 23:08:23.835004 systemd[1694]: Reached target sockets.target - Sockets. Sep 12 23:08:23.835042 systemd[1694]: Reached target basic.target - Basic System. Sep 12 23:08:23.835090 systemd[1694]: Reached target default.target - Main User Target. Sep 12 23:08:23.835127 systemd[1694]: Startup finished in 176ms. Sep 12 23:08:23.835410 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 23:08:23.837109 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 23:08:23.904800 systemd[1]: Started sshd@1-10.0.0.150:22-10.0.0.1:38488.service - OpenSSH per-connection server daemon (10.0.0.1:38488). Sep 12 23:08:23.960163 sshd[1705]: Accepted publickey for core from 10.0.0.1 port 38488 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:08:23.962040 sshd-session[1705]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:08:23.966395 systemd-logind[1546]: New session 2 of user core. Sep 12 23:08:23.983784 systemd[1]: Started session-2.scope - Session 2 of User core. 
Sep 12 23:08:24.037096 sshd[1708]: Connection closed by 10.0.0.1 port 38488 Sep 12 23:08:24.037428 sshd-session[1705]: pam_unix(sshd:session): session closed for user core Sep 12 23:08:24.051412 systemd[1]: sshd@1-10.0.0.150:22-10.0.0.1:38488.service: Deactivated successfully. Sep 12 23:08:24.053590 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 23:08:24.054556 systemd-logind[1546]: Session 2 logged out. Waiting for processes to exit. Sep 12 23:08:24.057503 systemd[1]: Started sshd@2-10.0.0.150:22-10.0.0.1:38500.service - OpenSSH per-connection server daemon (10.0.0.1:38500). Sep 12 23:08:24.058563 systemd-logind[1546]: Removed session 2. Sep 12 23:08:24.114242 sshd[1714]: Accepted publickey for core from 10.0.0.1 port 38500 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:08:24.115794 sshd-session[1714]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:08:24.121072 systemd-logind[1546]: New session 3 of user core. Sep 12 23:08:24.130923 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 23:08:24.182043 sshd[1717]: Connection closed by 10.0.0.1 port 38500 Sep 12 23:08:24.182515 sshd-session[1714]: pam_unix(sshd:session): session closed for user core Sep 12 23:08:24.193771 systemd[1]: sshd@2-10.0.0.150:22-10.0.0.1:38500.service: Deactivated successfully. Sep 12 23:08:24.196423 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 23:08:24.197399 systemd-logind[1546]: Session 3 logged out. Waiting for processes to exit. Sep 12 23:08:24.201344 systemd[1]: Started sshd@3-10.0.0.150:22-10.0.0.1:38502.service - OpenSSH per-connection server daemon (10.0.0.1:38502). Sep 12 23:08:24.202033 systemd-logind[1546]: Removed session 3. 
Sep 12 23:08:24.263991 sshd[1723]: Accepted publickey for core from 10.0.0.1 port 38502 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:08:24.265852 sshd-session[1723]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:08:24.270857 systemd-logind[1546]: New session 4 of user core. Sep 12 23:08:24.280845 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 23:08:24.336038 sshd[1726]: Connection closed by 10.0.0.1 port 38502 Sep 12 23:08:24.336590 sshd-session[1723]: pam_unix(sshd:session): session closed for user core Sep 12 23:08:24.355801 systemd[1]: sshd@3-10.0.0.150:22-10.0.0.1:38502.service: Deactivated successfully. Sep 12 23:08:24.357544 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 23:08:24.358492 systemd-logind[1546]: Session 4 logged out. Waiting for processes to exit. Sep 12 23:08:24.361313 systemd[1]: Started sshd@4-10.0.0.150:22-10.0.0.1:38506.service - OpenSSH per-connection server daemon (10.0.0.1:38506). Sep 12 23:08:24.362283 systemd-logind[1546]: Removed session 4. Sep 12 23:08:24.420229 sshd[1732]: Accepted publickey for core from 10.0.0.1 port 38506 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:08:24.421860 sshd-session[1732]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:08:24.426769 systemd-logind[1546]: New session 5 of user core. Sep 12 23:08:24.445816 systemd[1]: Started session-5.scope - Session 5 of User core. 
Sep 12 23:08:24.504062 sudo[1736]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 23:08:24.504384 sudo[1736]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 23:08:24.525481 sudo[1736]: pam_unix(sudo:session): session closed for user root Sep 12 23:08:24.527375 sshd[1735]: Connection closed by 10.0.0.1 port 38506 Sep 12 23:08:24.527828 sshd-session[1732]: pam_unix(sshd:session): session closed for user core Sep 12 23:08:24.541608 systemd[1]: sshd@4-10.0.0.150:22-10.0.0.1:38506.service: Deactivated successfully. Sep 12 23:08:24.543719 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 23:08:24.544667 systemd-logind[1546]: Session 5 logged out. Waiting for processes to exit. Sep 12 23:08:24.548033 systemd[1]: Started sshd@5-10.0.0.150:22-10.0.0.1:38512.service - OpenSSH per-connection server daemon (10.0.0.1:38512). Sep 12 23:08:24.548590 systemd-logind[1546]: Removed session 5. Sep 12 23:08:24.603511 sshd[1742]: Accepted publickey for core from 10.0.0.1 port 38512 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:08:24.605186 sshd-session[1742]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:08:24.609798 systemd-logind[1546]: New session 6 of user core. Sep 12 23:08:24.619786 systemd[1]: Started session-6.scope - Session 6 of User core. 
Sep 12 23:08:24.675406 sudo[1747]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 23:08:24.675784 sudo[1747]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 23:08:24.687578 sudo[1747]: pam_unix(sudo:session): session closed for user root Sep 12 23:08:24.694912 sudo[1746]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 12 23:08:24.695314 sudo[1746]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 23:08:24.705675 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 23:08:24.755782 augenrules[1769]: No rules Sep 12 23:08:24.756769 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 23:08:24.757056 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 23:08:24.758420 sudo[1746]: pam_unix(sudo:session): session closed for user root Sep 12 23:08:24.760206 sshd[1745]: Connection closed by 10.0.0.1 port 38512 Sep 12 23:08:24.760559 sshd-session[1742]: pam_unix(sshd:session): session closed for user core Sep 12 23:08:24.768040 systemd[1]: sshd@5-10.0.0.150:22-10.0.0.1:38512.service: Deactivated successfully. Sep 12 23:08:24.769888 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 23:08:24.770750 systemd-logind[1546]: Session 6 logged out. Waiting for processes to exit. Sep 12 23:08:24.773341 systemd[1]: Started sshd@6-10.0.0.150:22-10.0.0.1:38526.service - OpenSSH per-connection server daemon (10.0.0.1:38526). Sep 12 23:08:24.774026 systemd-logind[1546]: Removed session 6. Sep 12 23:08:24.827517 sshd[1778]: Accepted publickey for core from 10.0.0.1 port 38526 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:08:24.829359 sshd-session[1778]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:08:24.834634 systemd-logind[1546]: New session 7 of user core. 
Sep 12 23:08:24.845840 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 23:08:24.899551 sudo[1782]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 23:08:24.899915 sudo[1782]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 23:08:25.203673 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 23:08:25.219004 (dockerd)[1802]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 23:08:25.457799 dockerd[1802]: time="2025-09-12T23:08:25.457603093Z" level=info msg="Starting up" Sep 12 23:08:25.458597 dockerd[1802]: time="2025-09-12T23:08:25.458564385Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 12 23:08:25.477746 dockerd[1802]: time="2025-09-12T23:08:25.477633637Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 12 23:08:26.328822 dockerd[1802]: time="2025-09-12T23:08:26.328734575Z" level=info msg="Loading containers: start." Sep 12 23:08:26.670708 kernel: Initializing XFRM netlink socket Sep 12 23:08:26.983478 systemd-networkd[1450]: docker0: Link UP Sep 12 23:08:26.988210 dockerd[1802]: time="2025-09-12T23:08:26.988152679Z" level=info msg="Loading containers: done." 
Sep 12 23:08:27.016374 dockerd[1802]: time="2025-09-12T23:08:27.016300126Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 23:08:27.016605 dockerd[1802]: time="2025-09-12T23:08:27.016410052Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 12 23:08:27.016605 dockerd[1802]: time="2025-09-12T23:08:27.016552549Z" level=info msg="Initializing buildkit" Sep 12 23:08:27.054194 dockerd[1802]: time="2025-09-12T23:08:27.054126524Z" level=info msg="Completed buildkit initialization" Sep 12 23:08:27.064927 dockerd[1802]: time="2025-09-12T23:08:27.064869541Z" level=info msg="Daemon has completed initialization" Sep 12 23:08:27.065031 dockerd[1802]: time="2025-09-12T23:08:27.064977413Z" level=info msg="API listen on /run/docker.sock" Sep 12 23:08:27.065208 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 23:08:28.164085 containerd[1565]: time="2025-09-12T23:08:28.164017766Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\"" Sep 12 23:08:28.858716 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1488586164.mount: Deactivated successfully. 
Sep 12 23:08:29.897980 containerd[1565]: time="2025-09-12T23:08:29.897888232Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:29.898925 containerd[1565]: time="2025-09-12T23:08:29.898845848Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.5: active requests=0, bytes read=30114893" Sep 12 23:08:29.901776 containerd[1565]: time="2025-09-12T23:08:29.901728965Z" level=info msg="ImageCreate event name:\"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:29.904593 containerd[1565]: time="2025-09-12T23:08:29.904553372Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:29.905467 containerd[1565]: time="2025-09-12T23:08:29.905421870Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.5\" with image id \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\", size \"30111492\" in 1.741355724s" Sep 12 23:08:29.905535 containerd[1565]: time="2025-09-12T23:08:29.905475100Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\" returns image reference \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\"" Sep 12 23:08:29.906109 containerd[1565]: time="2025-09-12T23:08:29.906085024Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\"" Sep 12 23:08:31.189851 containerd[1565]: time="2025-09-12T23:08:31.189769501Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.5\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:31.190681 containerd[1565]: time="2025-09-12T23:08:31.190640584Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.5: active requests=0, bytes read=26020844" Sep 12 23:08:31.191895 containerd[1565]: time="2025-09-12T23:08:31.191821639Z" level=info msg="ImageCreate event name:\"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:31.194535 containerd[1565]: time="2025-09-12T23:08:31.194497517Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:31.195445 containerd[1565]: time="2025-09-12T23:08:31.195399439Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.5\" with image id \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\", size \"27681301\" in 1.28928504s" Sep 12 23:08:31.195445 containerd[1565]: time="2025-09-12T23:08:31.195438682Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\" returns image reference \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\"" Sep 12 23:08:31.195960 containerd[1565]: time="2025-09-12T23:08:31.195909906Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\"" Sep 12 23:08:32.591457 containerd[1565]: time="2025-09-12T23:08:32.591346137Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:32.592684 containerd[1565]: time="2025-09-12T23:08:32.592626528Z" 
level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.5: active requests=0, bytes read=20155568" Sep 12 23:08:32.594151 containerd[1565]: time="2025-09-12T23:08:32.594119769Z" level=info msg="ImageCreate event name:\"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:32.597270 containerd[1565]: time="2025-09-12T23:08:32.597238377Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:32.598687 containerd[1565]: time="2025-09-12T23:08:32.598617403Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.5\" with image id \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\", size \"21816043\" in 1.402669546s" Sep 12 23:08:32.598765 containerd[1565]: time="2025-09-12T23:08:32.598692945Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\" returns image reference \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\"" Sep 12 23:08:32.599376 containerd[1565]: time="2025-09-12T23:08:32.599336833Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\"" Sep 12 23:08:32.702699 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 23:08:32.707919 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:08:33.525794 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 23:08:33.543056 (kubelet)[2096]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:08:33.637938 kubelet[2096]: E0912 23:08:33.637847 2096 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:08:33.645307 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:08:33.645534 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:08:33.646052 systemd[1]: kubelet.service: Consumed 886ms CPU time, 111.1M memory peak. Sep 12 23:08:35.200726 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1093865922.mount: Deactivated successfully. Sep 12 23:08:35.960104 containerd[1565]: time="2025-09-12T23:08:35.960022768Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:35.960722 containerd[1565]: time="2025-09-12T23:08:35.960684269Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.5: active requests=0, bytes read=31929469" Sep 12 23:08:35.961880 containerd[1565]: time="2025-09-12T23:08:35.961836029Z" level=info msg="ImageCreate event name:\"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:35.964665 containerd[1565]: time="2025-09-12T23:08:35.964603779Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:35.965585 containerd[1565]: time="2025-09-12T23:08:35.965540837Z" level=info 
msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.5\" with image id \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\", repo tag \"registry.k8s.io/kube-proxy:v1.33.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\", size \"31928488\" in 3.366173427s" Sep 12 23:08:35.965585 containerd[1565]: time="2025-09-12T23:08:35.965577175Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\" returns image reference \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\"" Sep 12 23:08:35.966457 containerd[1565]: time="2025-09-12T23:08:35.966412832Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Sep 12 23:08:36.505753 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2662323365.mount: Deactivated successfully. Sep 12 23:08:38.221357 containerd[1565]: time="2025-09-12T23:08:38.221291494Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:38.222552 containerd[1565]: time="2025-09-12T23:08:38.222510450Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238" Sep 12 23:08:38.224041 containerd[1565]: time="2025-09-12T23:08:38.223978894Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:38.228103 containerd[1565]: time="2025-09-12T23:08:38.228045210Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:38.229124 containerd[1565]: time="2025-09-12T23:08:38.229093616Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id 
\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 2.262639757s" Sep 12 23:08:38.229124 containerd[1565]: time="2025-09-12T23:08:38.229122180Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Sep 12 23:08:38.229682 containerd[1565]: time="2025-09-12T23:08:38.229624612Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 23:08:38.737568 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2737894594.mount: Deactivated successfully. Sep 12 23:08:38.745788 containerd[1565]: time="2025-09-12T23:08:38.745729713Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 23:08:38.746619 containerd[1565]: time="2025-09-12T23:08:38.746564709Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 12 23:08:38.748057 containerd[1565]: time="2025-09-12T23:08:38.748009148Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 23:08:38.750399 containerd[1565]: time="2025-09-12T23:08:38.750371097Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 23:08:38.751067 containerd[1565]: time="2025-09-12T23:08:38.751026667Z" level=info msg="Pulled 
image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 521.373431ms" Sep 12 23:08:38.751113 containerd[1565]: time="2025-09-12T23:08:38.751068335Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 12 23:08:38.751701 containerd[1565]: time="2025-09-12T23:08:38.751631511Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Sep 12 23:08:39.276771 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1628091541.mount: Deactivated successfully. Sep 12 23:08:41.607296 containerd[1565]: time="2025-09-12T23:08:41.607194491Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:41.608001 containerd[1565]: time="2025-09-12T23:08:41.607940300Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58378433" Sep 12 23:08:41.609179 containerd[1565]: time="2025-09-12T23:08:41.609111927Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:41.612563 containerd[1565]: time="2025-09-12T23:08:41.612493328Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:08:41.613750 containerd[1565]: time="2025-09-12T23:08:41.613722123Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag 
\"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 2.862031531s" Sep 12 23:08:41.613797 containerd[1565]: time="2025-09-12T23:08:41.613750175Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Sep 12 23:08:43.702852 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 23:08:43.705001 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:08:43.929185 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:08:43.936995 (kubelet)[2257]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:08:43.974949 kubelet[2257]: E0912 23:08:43.974805 2257 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:08:43.979493 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:08:43.979752 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:08:43.980225 systemd[1]: kubelet.service: Consumed 219ms CPU time, 110.5M memory peak. Sep 12 23:08:43.992344 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:08:43.992572 systemd[1]: kubelet.service: Consumed 219ms CPU time, 110.5M memory peak. Sep 12 23:08:43.995061 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:08:44.021664 systemd[1]: Reload requested from client PID 2272 ('systemctl') (unit session-7.scope)... 
Sep 12 23:08:44.021691 systemd[1]: Reloading... Sep 12 23:08:44.131761 zram_generator::config[2315]: No configuration found. Sep 12 23:08:44.980866 systemd[1]: Reloading finished in 958 ms. Sep 12 23:08:45.048121 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 12 23:08:45.048295 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 12 23:08:45.048733 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:08:45.048812 systemd[1]: kubelet.service: Consumed 159ms CPU time, 98.2M memory peak. Sep 12 23:08:45.050950 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:08:45.241805 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:08:45.251035 (kubelet)[2363]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 23:08:45.288680 kubelet[2363]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 23:08:45.288680 kubelet[2363]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 23:08:45.288680 kubelet[2363]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 12 23:08:45.288680 kubelet[2363]: I0912 23:08:45.288539 2363 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 23:08:45.867678 kubelet[2363]: I0912 23:08:45.866752 2363 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 12 23:08:45.867678 kubelet[2363]: I0912 23:08:45.866795 2363 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 23:08:45.867678 kubelet[2363]: I0912 23:08:45.867219 2363 server.go:956] "Client rotation is on, will bootstrap in background" Sep 12 23:08:45.896554 kubelet[2363]: I0912 23:08:45.896487 2363 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 23:08:45.896897 kubelet[2363]: E0912 23:08:45.896847 2363 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.150:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 12 23:08:45.906482 kubelet[2363]: I0912 23:08:45.906416 2363 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 23:08:45.913884 kubelet[2363]: I0912 23:08:45.913823 2363 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 23:08:45.914350 kubelet[2363]: I0912 23:08:45.914292 2363 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 23:08:45.914582 kubelet[2363]: I0912 23:08:45.914330 2363 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 23:08:45.914582 kubelet[2363]: I0912 23:08:45.914582 2363 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 23:08:45.914780 
kubelet[2363]: I0912 23:08:45.914596 2363 container_manager_linux.go:303] "Creating device plugin manager" Sep 12 23:08:45.915623 kubelet[2363]: I0912 23:08:45.915575 2363 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:08:45.918333 kubelet[2363]: I0912 23:08:45.918290 2363 kubelet.go:480] "Attempting to sync node with API server" Sep 12 23:08:45.918333 kubelet[2363]: I0912 23:08:45.918325 2363 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 23:08:45.918424 kubelet[2363]: I0912 23:08:45.918372 2363 kubelet.go:386] "Adding apiserver pod source" Sep 12 23:08:45.920066 kubelet[2363]: I0912 23:08:45.920038 2363 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 23:08:45.928678 kubelet[2363]: E0912 23:08:45.928226 2363 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.150:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 12 23:08:45.928678 kubelet[2363]: I0912 23:08:45.928345 2363 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 23:08:45.928980 kubelet[2363]: I0912 23:08:45.928955 2363 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 12 23:08:45.930209 kubelet[2363]: E0912 23:08:45.930173 2363 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.150:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 12 23:08:45.930427 kubelet[2363]: W0912 
23:08:45.930391 2363 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 12 23:08:45.934485 kubelet[2363]: I0912 23:08:45.934441 2363 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 23:08:45.935084 kubelet[2363]: I0912 23:08:45.934557 2363 server.go:1289] "Started kubelet" Sep 12 23:08:45.936011 kubelet[2363]: I0912 23:08:45.935932 2363 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 23:08:45.937537 kubelet[2363]: I0912 23:08:45.937515 2363 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 23:08:45.937646 kubelet[2363]: I0912 23:08:45.937544 2363 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 23:08:45.937861 kubelet[2363]: I0912 23:08:45.937828 2363 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 23:08:45.940478 kubelet[2363]: E0912 23:08:45.939265 2363 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.150:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.150:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1864abafc172a371 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-12 23:08:45.934478193 +0000 UTC m=+0.678522974,LastTimestamp:2025-09-12 23:08:45.934478193 +0000 UTC m=+0.678522974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 12 23:08:45.940619 kubelet[2363]: I0912 23:08:45.940588 
2363 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 23:08:45.941103 kubelet[2363]: E0912 23:08:45.941037 2363 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 23:08:45.941150 kubelet[2363]: I0912 23:08:45.941105 2363 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 23:08:45.941216 kubelet[2363]: E0912 23:08:45.941179 2363 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.150:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.150:6443: connect: connection refused" interval="200ms" Sep 12 23:08:45.941245 kubelet[2363]: I0912 23:08:45.941225 2363 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 23:08:45.941334 kubelet[2363]: I0912 23:08:45.941306 2363 reconciler.go:26] "Reconciler: start to sync state" Sep 12 23:08:45.941586 kubelet[2363]: I0912 23:08:45.941560 2363 server.go:317] "Adding debug handlers to kubelet server" Sep 12 23:08:45.941950 kubelet[2363]: E0912 23:08:45.941918 2363 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.150:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 12 23:08:45.942087 kubelet[2363]: E0912 23:08:45.942041 2363 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 23:08:45.942308 kubelet[2363]: I0912 23:08:45.941536 2363 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 23:08:45.943843 kubelet[2363]: I0912 23:08:45.943813 2363 factory.go:223] Registration of the containerd container factory successfully Sep 12 23:08:45.943843 kubelet[2363]: I0912 23:08:45.943829 2363 factory.go:223] Registration of the systemd container factory successfully Sep 12 23:08:45.960130 kubelet[2363]: I0912 23:08:45.960090 2363 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 23:08:45.960130 kubelet[2363]: I0912 23:08:45.960115 2363 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 23:08:45.960130 kubelet[2363]: I0912 23:08:45.960131 2363 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:08:45.962275 kubelet[2363]: I0912 23:08:45.962216 2363 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 12 23:08:45.964584 kubelet[2363]: I0912 23:08:45.963862 2363 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 12 23:08:45.964584 kubelet[2363]: I0912 23:08:45.963897 2363 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 12 23:08:45.964584 kubelet[2363]: I0912 23:08:45.963927 2363 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 12 23:08:45.964584 kubelet[2363]: I0912 23:08:45.963935 2363 kubelet.go:2436] "Starting kubelet main sync loop" Sep 12 23:08:45.964584 kubelet[2363]: E0912 23:08:45.963988 2363 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 23:08:45.964584 kubelet[2363]: E0912 23:08:45.964470 2363 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.150:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 12 23:08:46.042110 kubelet[2363]: E0912 23:08:46.042055 2363 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 23:08:46.064389 kubelet[2363]: E0912 23:08:46.064317 2363 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 12 23:08:46.142358 kubelet[2363]: E0912 23:08:46.142184 2363 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 23:08:46.142358 kubelet[2363]: E0912 23:08:46.142199 2363 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.150:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.150:6443: connect: connection refused" interval="400ms" Sep 12 23:08:46.242915 kubelet[2363]: E0912 23:08:46.242835 2363 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 23:08:46.265086 kubelet[2363]: E0912 23:08:46.265045 2363 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 12 23:08:46.343961 kubelet[2363]: E0912 23:08:46.343862 2363 
kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 23:08:46.444284 kubelet[2363]: E0912 23:08:46.444115 2363 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 23:08:46.543130 kubelet[2363]: E0912 23:08:46.543038 2363 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.150:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.150:6443: connect: connection refused" interval="800ms" Sep 12 23:08:46.545134 kubelet[2363]: E0912 23:08:46.545072 2363 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 23:08:46.645753 kubelet[2363]: E0912 23:08:46.645669 2363 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 23:08:46.665961 kubelet[2363]: E0912 23:08:46.665890 2363 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 12 23:08:46.746682 kubelet[2363]: E0912 23:08:46.746608 2363 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 23:08:46.792565 kubelet[2363]: E0912 23:08:46.792496 2363 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.150:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 12 23:08:46.793217 kubelet[2363]: I0912 23:08:46.793128 2363 policy_none.go:49] "None policy: Start" Sep 12 23:08:46.793217 kubelet[2363]: I0912 23:08:46.793193 2363 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 23:08:46.793217 kubelet[2363]: I0912 23:08:46.793214 2363 
state_mem.go:35] "Initializing new in-memory state store" Sep 12 23:08:46.827472 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 23:08:46.845262 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 23:08:46.846753 kubelet[2363]: E0912 23:08:46.846727 2363 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 23:08:46.849069 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 23:08:46.870237 kubelet[2363]: E0912 23:08:46.870197 2363 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 12 23:08:46.870556 kubelet[2363]: I0912 23:08:46.870515 2363 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 23:08:46.870556 kubelet[2363]: I0912 23:08:46.870542 2363 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 23:08:46.870880 kubelet[2363]: I0912 23:08:46.870851 2363 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 23:08:46.871943 kubelet[2363]: E0912 23:08:46.871846 2363 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 12 23:08:46.871943 kubelet[2363]: E0912 23:08:46.871890 2363 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 12 23:08:46.972183 kubelet[2363]: I0912 23:08:46.972128 2363 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 23:08:46.972623 kubelet[2363]: E0912 23:08:46.972580 2363 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.150:6443/api/v1/nodes\": dial tcp 10.0.0.150:6443: connect: connection refused" node="localhost" Sep 12 23:08:46.978494 kubelet[2363]: E0912 23:08:46.978208 2363 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.150:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 12 23:08:47.017644 kubelet[2363]: E0912 23:08:47.017476 2363 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.150:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 12 23:08:47.174984 kubelet[2363]: I0912 23:08:47.174927 2363 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 23:08:47.175444 kubelet[2363]: E0912 23:08:47.175404 2363 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.150:6443/api/v1/nodes\": dial tcp 10.0.0.150:6443: connect: connection refused" node="localhost" Sep 12 23:08:47.327879 kubelet[2363]: E0912 23:08:47.327412 2363 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get 
\"https://10.0.0.150:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.150:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 12 23:08:47.344008 kubelet[2363]: E0912 23:08:47.343937 2363 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.150:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.150:6443: connect: connection refused" interval="1.6s" Sep 12 23:08:47.492789 systemd[1]: Created slice kubepods-burstable-pod44915cd3782837a3d6bf7426bd3757d3.slice - libcontainer container kubepods-burstable-pod44915cd3782837a3d6bf7426bd3757d3.slice. Sep 12 23:08:47.537807 kubelet[2363]: E0912 23:08:47.537750 2363 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 23:08:47.552236 kubelet[2363]: I0912 23:08:47.552165 2363 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7b968cf906b2d9d713a362c43868bef2-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"7b968cf906b2d9d713a362c43868bef2\") " pod="kube-system/kube-scheduler-localhost" Sep 12 23:08:47.552671 kubelet[2363]: I0912 23:08:47.552271 2363 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/44915cd3782837a3d6bf7426bd3757d3-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"44915cd3782837a3d6bf7426bd3757d3\") " pod="kube-system/kube-apiserver-localhost" Sep 12 23:08:47.552671 kubelet[2363]: I0912 23:08:47.552311 2363 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/44915cd3782837a3d6bf7426bd3757d3-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"44915cd3782837a3d6bf7426bd3757d3\") " pod="kube-system/kube-apiserver-localhost" Sep 12 23:08:47.552671 kubelet[2363]: I0912 23:08:47.552376 2363 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/44915cd3782837a3d6bf7426bd3757d3-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"44915cd3782837a3d6bf7426bd3757d3\") " pod="kube-system/kube-apiserver-localhost" Sep 12 23:08:47.552671 kubelet[2363]: I0912 23:08:47.552418 2363 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:47.552671 kubelet[2363]: I0912 23:08:47.552443 2363 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:47.552875 kubelet[2363]: I0912 23:08:47.552483 2363 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:47.552875 kubelet[2363]: I0912 23:08:47.552516 2363 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:47.552875 kubelet[2363]: I0912 23:08:47.552549 2363 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:47.553707 systemd[1]: Created slice kubepods-burstable-podb678d5c6713e936e66aa5bb73166297e.slice - libcontainer container kubepods-burstable-podb678d5c6713e936e66aa5bb73166297e.slice. Sep 12 23:08:47.570793 kubelet[2363]: E0912 23:08:47.570716 2363 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 23:08:47.575496 systemd[1]: Created slice kubepods-burstable-pod7b968cf906b2d9d713a362c43868bef2.slice - libcontainer container kubepods-burstable-pod7b968cf906b2d9d713a362c43868bef2.slice. 
Sep 12 23:08:47.577106 kubelet[2363]: I0912 23:08:47.576933 2363 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 23:08:47.577549 kubelet[2363]: E0912 23:08:47.577506 2363 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.150:6443/api/v1/nodes\": dial tcp 10.0.0.150:6443: connect: connection refused" node="localhost" Sep 12 23:08:47.579717 kubelet[2363]: E0912 23:08:47.579604 2363 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 23:08:47.840153 containerd[1565]: time="2025-09-12T23:08:47.839975861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:44915cd3782837a3d6bf7426bd3757d3,Namespace:kube-system,Attempt:0,}" Sep 12 23:08:47.867704 containerd[1565]: time="2025-09-12T23:08:47.867633940Z" level=info msg="connecting to shim 58b9866a7136b05a067347ceaebf99d80c03c0d44d84079d3a66317e1517fb08" address="unix:///run/containerd/s/4e6f46b0db9d65da52dd5f9dd81e48df5be838f19c1b9ffa56f3540d8abc1b0d" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:08:47.872295 containerd[1565]: time="2025-09-12T23:08:47.872245217Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:b678d5c6713e936e66aa5bb73166297e,Namespace:kube-system,Attempt:0,}" Sep 12 23:08:47.882111 containerd[1565]: time="2025-09-12T23:08:47.881686122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:7b968cf906b2d9d713a362c43868bef2,Namespace:kube-system,Attempt:0,}" Sep 12 23:08:47.933850 kubelet[2363]: E0912 23:08:47.933786 2363 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.150:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.150:6443: connect: connection 
refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 12 23:08:47.953925 systemd[1]: Started cri-containerd-58b9866a7136b05a067347ceaebf99d80c03c0d44d84079d3a66317e1517fb08.scope - libcontainer container 58b9866a7136b05a067347ceaebf99d80c03c0d44d84079d3a66317e1517fb08. Sep 12 23:08:48.031290 containerd[1565]: time="2025-09-12T23:08:48.030632180Z" level=info msg="connecting to shim 5afc0dbdf9a8e3346036d97c6e996a1a1e3be25f9134b046ae251eed3a17d322" address="unix:///run/containerd/s/ba2a4e3f0bee41ef6b72a37fb12c22b52e2acfbd368127b339e1b9eab767802a" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:08:48.033732 containerd[1565]: time="2025-09-12T23:08:48.033694733Z" level=info msg="connecting to shim 0f9f948af73e38810d2b41f4dd76a70e8e83aa0d80e768f591daa6c3f58e25b8" address="unix:///run/containerd/s/2d2c828c3206b9b74fed79dbaad464ba24944af45dd3629cf3c5c63963530b09" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:08:48.102258 containerd[1565]: time="2025-09-12T23:08:48.102100979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:44915cd3782837a3d6bf7426bd3757d3,Namespace:kube-system,Attempt:0,} returns sandbox id \"58b9866a7136b05a067347ceaebf99d80c03c0d44d84079d3a66317e1517fb08\"" Sep 12 23:08:48.111918 systemd[1]: Started cri-containerd-5afc0dbdf9a8e3346036d97c6e996a1a1e3be25f9134b046ae251eed3a17d322.scope - libcontainer container 5afc0dbdf9a8e3346036d97c6e996a1a1e3be25f9134b046ae251eed3a17d322. Sep 12 23:08:48.116565 containerd[1565]: time="2025-09-12T23:08:48.116504411Z" level=info msg="CreateContainer within sandbox \"58b9866a7136b05a067347ceaebf99d80c03c0d44d84079d3a66317e1517fb08\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 23:08:48.121845 systemd[1]: Started cri-containerd-0f9f948af73e38810d2b41f4dd76a70e8e83aa0d80e768f591daa6c3f58e25b8.scope - libcontainer container 0f9f948af73e38810d2b41f4dd76a70e8e83aa0d80e768f591daa6c3f58e25b8. 
Sep 12 23:08:48.129371 containerd[1565]: time="2025-09-12T23:08:48.129041632Z" level=info msg="Container 88c645a29c322877f769a0cc65af093ce8d37e9db3b09edb53389e8a8315cf7e: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:08:48.138544 containerd[1565]: time="2025-09-12T23:08:48.138489050Z" level=info msg="CreateContainer within sandbox \"58b9866a7136b05a067347ceaebf99d80c03c0d44d84079d3a66317e1517fb08\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"88c645a29c322877f769a0cc65af093ce8d37e9db3b09edb53389e8a8315cf7e\"" Sep 12 23:08:48.139155 containerd[1565]: time="2025-09-12T23:08:48.139126124Z" level=info msg="StartContainer for \"88c645a29c322877f769a0cc65af093ce8d37e9db3b09edb53389e8a8315cf7e\"" Sep 12 23:08:48.141456 containerd[1565]: time="2025-09-12T23:08:48.141422461Z" level=info msg="connecting to shim 88c645a29c322877f769a0cc65af093ce8d37e9db3b09edb53389e8a8315cf7e" address="unix:///run/containerd/s/4e6f46b0db9d65da52dd5f9dd81e48df5be838f19c1b9ffa56f3540d8abc1b0d" protocol=ttrpc version=3 Sep 12 23:08:48.218599 containerd[1565]: time="2025-09-12T23:08:48.218538060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:7b968cf906b2d9d713a362c43868bef2,Namespace:kube-system,Attempt:0,} returns sandbox id \"5afc0dbdf9a8e3346036d97c6e996a1a1e3be25f9134b046ae251eed3a17d322\"" Sep 12 23:08:48.221870 systemd[1]: Started cri-containerd-88c645a29c322877f769a0cc65af093ce8d37e9db3b09edb53389e8a8315cf7e.scope - libcontainer container 88c645a29c322877f769a0cc65af093ce8d37e9db3b09edb53389e8a8315cf7e. 
Sep 12 23:08:48.225347 containerd[1565]: time="2025-09-12T23:08:48.225313677Z" level=info msg="CreateContainer within sandbox \"5afc0dbdf9a8e3346036d97c6e996a1a1e3be25f9134b046ae251eed3a17d322\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 23:08:48.237692 containerd[1565]: time="2025-09-12T23:08:48.237626287Z" level=info msg="Container f3ae7daa8d98fda4fdb0200da75225668763e9bf7d1a7c00cbbaf8a89c325b7a: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:08:48.252675 containerd[1565]: time="2025-09-12T23:08:48.251341358Z" level=info msg="CreateContainer within sandbox \"5afc0dbdf9a8e3346036d97c6e996a1a1e3be25f9134b046ae251eed3a17d322\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"f3ae7daa8d98fda4fdb0200da75225668763e9bf7d1a7c00cbbaf8a89c325b7a\"" Sep 12 23:08:48.252675 containerd[1565]: time="2025-09-12T23:08:48.252085714Z" level=info msg="StartContainer for \"f3ae7daa8d98fda4fdb0200da75225668763e9bf7d1a7c00cbbaf8a89c325b7a\"" Sep 12 23:08:48.257691 containerd[1565]: time="2025-09-12T23:08:48.253542296Z" level=info msg="connecting to shim f3ae7daa8d98fda4fdb0200da75225668763e9bf7d1a7c00cbbaf8a89c325b7a" address="unix:///run/containerd/s/ba2a4e3f0bee41ef6b72a37fb12c22b52e2acfbd368127b339e1b9eab767802a" protocol=ttrpc version=3 Sep 12 23:08:48.287070 systemd[1]: Started cri-containerd-f3ae7daa8d98fda4fdb0200da75225668763e9bf7d1a7c00cbbaf8a89c325b7a.scope - libcontainer container f3ae7daa8d98fda4fdb0200da75225668763e9bf7d1a7c00cbbaf8a89c325b7a. 
Sep 12 23:08:48.298639 containerd[1565]: time="2025-09-12T23:08:48.298576711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:b678d5c6713e936e66aa5bb73166297e,Namespace:kube-system,Attempt:0,} returns sandbox id \"0f9f948af73e38810d2b41f4dd76a70e8e83aa0d80e768f591daa6c3f58e25b8\"" Sep 12 23:08:48.306329 containerd[1565]: time="2025-09-12T23:08:48.306294004Z" level=info msg="CreateContainer within sandbox \"0f9f948af73e38810d2b41f4dd76a70e8e83aa0d80e768f591daa6c3f58e25b8\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 23:08:48.315548 containerd[1565]: time="2025-09-12T23:08:48.315492945Z" level=info msg="StartContainer for \"88c645a29c322877f769a0cc65af093ce8d37e9db3b09edb53389e8a8315cf7e\" returns successfully" Sep 12 23:08:48.319684 containerd[1565]: time="2025-09-12T23:08:48.319103206Z" level=info msg="Container 1fa95609993a71ba7ffc28b11f63dafbe945b15c64814d1a52792bfb2f7a32ac: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:08:48.328062 containerd[1565]: time="2025-09-12T23:08:48.328026320Z" level=info msg="CreateContainer within sandbox \"0f9f948af73e38810d2b41f4dd76a70e8e83aa0d80e768f591daa6c3f58e25b8\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"1fa95609993a71ba7ffc28b11f63dafbe945b15c64814d1a52792bfb2f7a32ac\"" Sep 12 23:08:48.328685 containerd[1565]: time="2025-09-12T23:08:48.328632046Z" level=info msg="StartContainer for \"1fa95609993a71ba7ffc28b11f63dafbe945b15c64814d1a52792bfb2f7a32ac\"" Sep 12 23:08:48.329660 containerd[1565]: time="2025-09-12T23:08:48.329616622Z" level=info msg="connecting to shim 1fa95609993a71ba7ffc28b11f63dafbe945b15c64814d1a52792bfb2f7a32ac" address="unix:///run/containerd/s/2d2c828c3206b9b74fed79dbaad464ba24944af45dd3629cf3c5c63963530b09" protocol=ttrpc version=3 Sep 12 23:08:48.365990 systemd[1]: Started cri-containerd-1fa95609993a71ba7ffc28b11f63dafbe945b15c64814d1a52792bfb2f7a32ac.scope - 
libcontainer container 1fa95609993a71ba7ffc28b11f63dafbe945b15c64814d1a52792bfb2f7a32ac. Sep 12 23:08:48.373234 containerd[1565]: time="2025-09-12T23:08:48.373080683Z" level=info msg="StartContainer for \"f3ae7daa8d98fda4fdb0200da75225668763e9bf7d1a7c00cbbaf8a89c325b7a\" returns successfully" Sep 12 23:08:48.380725 kubelet[2363]: I0912 23:08:48.380691 2363 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 23:08:48.485798 containerd[1565]: time="2025-09-12T23:08:48.485733607Z" level=info msg="StartContainer for \"1fa95609993a71ba7ffc28b11f63dafbe945b15c64814d1a52792bfb2f7a32ac\" returns successfully" Sep 12 23:08:49.011444 kubelet[2363]: E0912 23:08:49.011082 2363 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 23:08:49.014867 kubelet[2363]: E0912 23:08:49.014449 2363 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 23:08:49.017829 kubelet[2363]: E0912 23:08:49.017813 2363 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 23:08:50.019272 kubelet[2363]: E0912 23:08:50.019071 2363 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 23:08:50.022226 kubelet[2363]: E0912 23:08:50.022105 2363 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 23:08:50.331846 kubelet[2363]: E0912 23:08:50.331437 2363 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 12 23:08:50.521866 kubelet[2363]: I0912 23:08:50.521806 2363 
kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 23:08:50.542697 kubelet[2363]: I0912 23:08:50.542625 2363 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 23:08:50.548803 kubelet[2363]: E0912 23:08:50.548753 2363 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 12 23:08:50.548803 kubelet[2363]: I0912 23:08:50.548791 2363 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:50.550450 kubelet[2363]: E0912 23:08:50.550414 2363 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:50.550450 kubelet[2363]: I0912 23:08:50.550440 2363 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 23:08:50.551761 kubelet[2363]: E0912 23:08:50.551731 2363 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 12 23:08:50.927456 kubelet[2363]: I0912 23:08:50.927375 2363 apiserver.go:52] "Watching apiserver" Sep 12 23:08:50.941842 kubelet[2363]: I0912 23:08:50.941772 2363 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 23:08:52.707476 systemd[1]: Reload requested from client PID 2647 ('systemctl') (unit session-7.scope)... Sep 12 23:08:52.707500 systemd[1]: Reloading... Sep 12 23:08:52.895714 zram_generator::config[2690]: No configuration found. 
Sep 12 23:08:52.972982 kubelet[2363]: I0912 23:08:52.972851 2363 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:53.383641 systemd[1]: Reloading finished in 675 ms. Sep 12 23:08:53.418582 kubelet[2363]: I0912 23:08:53.418502 2363 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 23:08:53.418766 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:08:53.442960 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 23:08:53.443298 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:08:53.443362 systemd[1]: kubelet.service: Consumed 1.370s CPU time, 130.7M memory peak. Sep 12 23:08:53.446479 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:08:53.682307 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:08:53.693260 (kubelet)[2735]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 23:08:53.820587 kubelet[2735]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 23:08:53.820587 kubelet[2735]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 23:08:53.820587 kubelet[2735]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 12 23:08:53.821226 kubelet[2735]: I0912 23:08:53.820599 2735 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 23:08:53.827631 kubelet[2735]: I0912 23:08:53.827569 2735 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 12 23:08:53.827631 kubelet[2735]: I0912 23:08:53.827607 2735 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 23:08:53.827920 kubelet[2735]: I0912 23:08:53.827899 2735 server.go:956] "Client rotation is on, will bootstrap in background" Sep 12 23:08:53.829368 kubelet[2735]: I0912 23:08:53.829091 2735 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 12 23:08:53.831615 kubelet[2735]: I0912 23:08:53.831538 2735 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 23:08:53.838880 kubelet[2735]: I0912 23:08:53.838746 2735 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 23:08:53.844409 kubelet[2735]: I0912 23:08:53.844371 2735 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 23:08:53.844758 kubelet[2735]: I0912 23:08:53.844715 2735 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 23:08:53.844946 kubelet[2735]: I0912 23:08:53.844755 2735 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 23:08:53.845069 kubelet[2735]: I0912 23:08:53.844954 2735 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 23:08:53.845069 
kubelet[2735]: I0912 23:08:53.844966 2735 container_manager_linux.go:303] "Creating device plugin manager" Sep 12 23:08:53.845069 kubelet[2735]: I0912 23:08:53.845032 2735 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:08:53.845317 kubelet[2735]: I0912 23:08:53.845296 2735 kubelet.go:480] "Attempting to sync node with API server" Sep 12 23:08:53.845317 kubelet[2735]: I0912 23:08:53.845314 2735 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 23:08:53.845394 kubelet[2735]: I0912 23:08:53.845342 2735 kubelet.go:386] "Adding apiserver pod source" Sep 12 23:08:53.845394 kubelet[2735]: I0912 23:08:53.845369 2735 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 23:08:53.847674 kubelet[2735]: I0912 23:08:53.847120 2735 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 23:08:53.847937 kubelet[2735]: I0912 23:08:53.847903 2735 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 12 23:08:53.851960 kubelet[2735]: I0912 23:08:53.851918 2735 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 23:08:53.852030 kubelet[2735]: I0912 23:08:53.851994 2735 server.go:1289] "Started kubelet" Sep 12 23:08:53.852160 kubelet[2735]: I0912 23:08:53.852127 2735 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 23:08:53.852614 kubelet[2735]: I0912 23:08:53.852554 2735 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 23:08:53.852956 kubelet[2735]: I0912 23:08:53.852928 2735 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 23:08:53.853513 kubelet[2735]: I0912 23:08:53.853493 2735 server.go:317] "Adding debug handlers to kubelet server" Sep 12 23:08:53.862582 
kubelet[2735]: I0912 23:08:53.862555 2735 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 23:08:53.864064 kubelet[2735]: I0912 23:08:53.864021 2735 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 23:08:53.866042 kubelet[2735]: I0912 23:08:53.865529 2735 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 23:08:53.866729 kubelet[2735]: E0912 23:08:53.866707 2735 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 23:08:53.867061 kubelet[2735]: I0912 23:08:53.867041 2735 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 23:08:53.867303 kubelet[2735]: E0912 23:08:53.867273 2735 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 23:08:53.868726 kubelet[2735]: I0912 23:08:53.868684 2735 reconciler.go:26] "Reconciler: start to sync state" Sep 12 23:08:53.870437 kubelet[2735]: I0912 23:08:53.870052 2735 factory.go:223] Registration of the systemd container factory successfully Sep 12 23:08:53.872222 kubelet[2735]: I0912 23:08:53.871055 2735 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 23:08:53.878673 kubelet[2735]: I0912 23:08:53.876735 2735 factory.go:223] Registration of the containerd container factory successfully Sep 12 23:08:53.895095 kubelet[2735]: I0912 23:08:53.895043 2735 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 12 23:08:53.898047 kubelet[2735]: I0912 23:08:53.897852 2735 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Sep 12 23:08:53.898047 kubelet[2735]: I0912 23:08:53.897873 2735 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 12 23:08:53.898047 kubelet[2735]: I0912 23:08:53.897893 2735 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 12 23:08:53.898047 kubelet[2735]: I0912 23:08:53.897900 2735 kubelet.go:2436] "Starting kubelet main sync loop" Sep 12 23:08:53.898047 kubelet[2735]: E0912 23:08:53.897940 2735 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 23:08:53.916232 kubelet[2735]: I0912 23:08:53.916189 2735 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 23:08:53.916232 kubelet[2735]: I0912 23:08:53.916231 2735 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 23:08:53.916411 kubelet[2735]: I0912 23:08:53.916252 2735 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:08:53.916411 kubelet[2735]: I0912 23:08:53.916397 2735 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 23:08:53.916466 kubelet[2735]: I0912 23:08:53.916409 2735 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 23:08:53.916466 kubelet[2735]: I0912 23:08:53.916434 2735 policy_none.go:49] "None policy: Start" Sep 12 23:08:53.916466 kubelet[2735]: I0912 23:08:53.916444 2735 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 23:08:53.916466 kubelet[2735]: I0912 23:08:53.916455 2735 state_mem.go:35] "Initializing new in-memory state store" Sep 12 23:08:53.916574 kubelet[2735]: I0912 23:08:53.916562 2735 state_mem.go:75] "Updated machine memory state" Sep 12 23:08:53.922057 kubelet[2735]: E0912 23:08:53.921765 2735 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 12 23:08:53.922178 kubelet[2735]: I0912 
23:08:53.922153 2735 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 23:08:53.922210 kubelet[2735]: I0912 23:08:53.922170 2735 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 23:08:53.923165 kubelet[2735]: I0912 23:08:53.922786 2735 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 23:08:53.924781 kubelet[2735]: E0912 23:08:53.923433 2735 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 12 23:08:53.999312 kubelet[2735]: I0912 23:08:53.999252 2735 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:53.999473 kubelet[2735]: I0912 23:08:53.999399 2735 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 23:08:53.999559 kubelet[2735]: I0912 23:08:53.999258 2735 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 23:08:54.012691 kubelet[2735]: E0912 23:08:54.011706 2735 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:54.027676 kubelet[2735]: I0912 23:08:54.027603 2735 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 23:08:54.040286 kubelet[2735]: I0912 23:08:54.040239 2735 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 12 23:08:54.040455 kubelet[2735]: I0912 23:08:54.040325 2735 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 23:08:54.070019 kubelet[2735]: I0912 23:08:54.069955 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/44915cd3782837a3d6bf7426bd3757d3-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"44915cd3782837a3d6bf7426bd3757d3\") " pod="kube-system/kube-apiserver-localhost" Sep 12 23:08:54.070019 kubelet[2735]: I0912 23:08:54.070004 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:54.070019 kubelet[2735]: I0912 23:08:54.070026 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:54.070266 kubelet[2735]: I0912 23:08:54.070039 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/44915cd3782837a3d6bf7426bd3757d3-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"44915cd3782837a3d6bf7426bd3757d3\") " pod="kube-system/kube-apiserver-localhost" Sep 12 23:08:54.070266 kubelet[2735]: I0912 23:08:54.070057 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/44915cd3782837a3d6bf7426bd3757d3-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"44915cd3782837a3d6bf7426bd3757d3\") " pod="kube-system/kube-apiserver-localhost" Sep 12 23:08:54.070266 kubelet[2735]: I0912 23:08:54.070078 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:54.070266 kubelet[2735]: I0912 23:08:54.070094 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:54.070266 kubelet[2735]: I0912 23:08:54.070111 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b678d5c6713e936e66aa5bb73166297e-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"b678d5c6713e936e66aa5bb73166297e\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:54.070427 kubelet[2735]: I0912 23:08:54.070129 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7b968cf906b2d9d713a362c43868bef2-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"7b968cf906b2d9d713a362c43868bef2\") " pod="kube-system/kube-scheduler-localhost" Sep 12 23:08:54.847432 kubelet[2735]: I0912 23:08:54.846301 2735 apiserver.go:52] "Watching apiserver" Sep 12 23:08:54.867518 kubelet[2735]: I0912 23:08:54.867418 2735 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 23:08:54.912738 kubelet[2735]: I0912 23:08:54.912454 2735 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:54.913726 kubelet[2735]: I0912 23:08:54.913709 2735 kubelet.go:3309] "Creating a mirror pod for static pod" 
pod="kube-system/kube-scheduler-localhost" Sep 12 23:08:54.933803 kubelet[2735]: E0912 23:08:54.933723 2735 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 12 23:08:54.935275 kubelet[2735]: E0912 23:08:54.935243 2735 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Sep 12 23:08:54.980373 kubelet[2735]: I0912 23:08:54.980266 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=2.98024501 podStartE2EDuration="2.98024501s" podCreationTimestamp="2025-09-12 23:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:08:54.962879069 +0000 UTC m=+1.258916810" watchObservedRunningTime="2025-09-12 23:08:54.98024501 +0000 UTC m=+1.276282751" Sep 12 23:08:55.000260 kubelet[2735]: I0912 23:08:55.000089 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.000067484 podStartE2EDuration="1.000067484s" podCreationTimestamp="2025-09-12 23:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:08:54.980204923 +0000 UTC m=+1.276242664" watchObservedRunningTime="2025-09-12 23:08:55.000067484 +0000 UTC m=+1.296105235" Sep 12 23:08:58.990076 kubelet[2735]: I0912 23:08:58.990018 2735 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 23:08:58.990587 containerd[1565]: time="2025-09-12T23:08:58.990336454Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Sep 12 23:08:58.990919 kubelet[2735]: I0912 23:08:58.990695 2735 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 23:08:59.034564 kubelet[2735]: I0912 23:08:59.034500 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=5.034462703 podStartE2EDuration="5.034462703s" podCreationTimestamp="2025-09-12 23:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:08:55.000420944 +0000 UTC m=+1.296458695" watchObservedRunningTime="2025-09-12 23:08:59.034462703 +0000 UTC m=+5.330500434" Sep 12 23:08:59.766082 systemd[1]: Created slice kubepods-besteffort-podd8a7b144_bbfe_408c_8394_a95d8563b290.slice - libcontainer container kubepods-besteffort-podd8a7b144_bbfe_408c_8394_a95d8563b290.slice. Sep 12 23:08:59.804925 kubelet[2735]: I0912 23:08:59.804868 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg52x\" (UniqueName: \"kubernetes.io/projected/d8a7b144-bbfe-408c-8394-a95d8563b290-kube-api-access-rg52x\") pod \"kube-proxy-5n5mf\" (UID: \"d8a7b144-bbfe-408c-8394-a95d8563b290\") " pod="kube-system/kube-proxy-5n5mf" Sep 12 23:08:59.804925 kubelet[2735]: I0912 23:08:59.804912 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d8a7b144-bbfe-408c-8394-a95d8563b290-kube-proxy\") pod \"kube-proxy-5n5mf\" (UID: \"d8a7b144-bbfe-408c-8394-a95d8563b290\") " pod="kube-system/kube-proxy-5n5mf" Sep 12 23:08:59.804925 kubelet[2735]: I0912 23:08:59.804936 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d8a7b144-bbfe-408c-8394-a95d8563b290-xtables-lock\") pod \"kube-proxy-5n5mf\" (UID: 
\"d8a7b144-bbfe-408c-8394-a95d8563b290\") " pod="kube-system/kube-proxy-5n5mf" Sep 12 23:08:59.804925 kubelet[2735]: I0912 23:08:59.804950 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d8a7b144-bbfe-408c-8394-a95d8563b290-lib-modules\") pod \"kube-proxy-5n5mf\" (UID: \"d8a7b144-bbfe-408c-8394-a95d8563b290\") " pod="kube-system/kube-proxy-5n5mf" Sep 12 23:09:00.079451 containerd[1565]: time="2025-09-12T23:09:00.079310059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5n5mf,Uid:d8a7b144-bbfe-408c-8394-a95d8563b290,Namespace:kube-system,Attempt:0,}" Sep 12 23:09:00.103530 containerd[1565]: time="2025-09-12T23:09:00.103457942Z" level=info msg="connecting to shim 79ff1935820330fc55ba57f6c10747fafdf1a4c745512d0df6fd53b3a2f9375c" address="unix:///run/containerd/s/205d72fc98619e5030aa8cee2da7f69e31b19d35afc589066a7cc021544f2f54" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:09:00.141841 systemd[1]: Started cri-containerd-79ff1935820330fc55ba57f6c10747fafdf1a4c745512d0df6fd53b3a2f9375c.scope - libcontainer container 79ff1935820330fc55ba57f6c10747fafdf1a4c745512d0df6fd53b3a2f9375c. 
Sep 12 23:09:00.168069 containerd[1565]: time="2025-09-12T23:09:00.168016314Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5n5mf,Uid:d8a7b144-bbfe-408c-8394-a95d8563b290,Namespace:kube-system,Attempt:0,} returns sandbox id \"79ff1935820330fc55ba57f6c10747fafdf1a4c745512d0df6fd53b3a2f9375c\"" Sep 12 23:09:00.175371 containerd[1565]: time="2025-09-12T23:09:00.175313817Z" level=info msg="CreateContainer within sandbox \"79ff1935820330fc55ba57f6c10747fafdf1a4c745512d0df6fd53b3a2f9375c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 23:09:00.279541 containerd[1565]: time="2025-09-12T23:09:00.278994402Z" level=info msg="Container adc078e0c3554eee6fad74a5171c7a4b44d83ba130aff3cf89f773f77f99826e: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:00.296077 containerd[1565]: time="2025-09-12T23:09:00.296009455Z" level=info msg="CreateContainer within sandbox \"79ff1935820330fc55ba57f6c10747fafdf1a4c745512d0df6fd53b3a2f9375c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"adc078e0c3554eee6fad74a5171c7a4b44d83ba130aff3cf89f773f77f99826e\"" Sep 12 23:09:00.297394 containerd[1565]: time="2025-09-12T23:09:00.297304075Z" level=info msg="StartContainer for \"adc078e0c3554eee6fad74a5171c7a4b44d83ba130aff3cf89f773f77f99826e\"" Sep 12 23:09:00.299099 containerd[1565]: time="2025-09-12T23:09:00.299061107Z" level=info msg="connecting to shim adc078e0c3554eee6fad74a5171c7a4b44d83ba130aff3cf89f773f77f99826e" address="unix:///run/containerd/s/205d72fc98619e5030aa8cee2da7f69e31b19d35afc589066a7cc021544f2f54" protocol=ttrpc version=3 Sep 12 23:09:00.330894 systemd[1]: Started cri-containerd-adc078e0c3554eee6fad74a5171c7a4b44d83ba130aff3cf89f773f77f99826e.scope - libcontainer container adc078e0c3554eee6fad74a5171c7a4b44d83ba130aff3cf89f773f77f99826e. 
Sep 12 23:09:00.331951 systemd[1]: Created slice kubepods-besteffort-poda116dc92_7859_46a3_8040_2712231f2ea8.slice - libcontainer container kubepods-besteffort-poda116dc92_7859_46a3_8040_2712231f2ea8.slice. Sep 12 23:09:00.381797 containerd[1565]: time="2025-09-12T23:09:00.381748298Z" level=info msg="StartContainer for \"adc078e0c3554eee6fad74a5171c7a4b44d83ba130aff3cf89f773f77f99826e\" returns successfully" Sep 12 23:09:00.410782 kubelet[2735]: I0912 23:09:00.410455 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a116dc92-7859-46a3-8040-2712231f2ea8-var-lib-calico\") pod \"tigera-operator-755d956888-dm7z7\" (UID: \"a116dc92-7859-46a3-8040-2712231f2ea8\") " pod="tigera-operator/tigera-operator-755d956888-dm7z7" Sep 12 23:09:00.410782 kubelet[2735]: I0912 23:09:00.410523 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfjqw\" (UniqueName: \"kubernetes.io/projected/a116dc92-7859-46a3-8040-2712231f2ea8-kube-api-access-zfjqw\") pod \"tigera-operator-755d956888-dm7z7\" (UID: \"a116dc92-7859-46a3-8040-2712231f2ea8\") " pod="tigera-operator/tigera-operator-755d956888-dm7z7" Sep 12 23:09:00.636332 containerd[1565]: time="2025-09-12T23:09:00.636199269Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-dm7z7,Uid:a116dc92-7859-46a3-8040-2712231f2ea8,Namespace:tigera-operator,Attempt:0,}" Sep 12 23:09:00.658461 containerd[1565]: time="2025-09-12T23:09:00.658372585Z" level=info msg="connecting to shim ed44d8af4d4fdfb3c7c9a51ee93fcf6ab1d9a755a5b3654a65ac27ac8296a063" address="unix:///run/containerd/s/5a94f9f0ae11bf82481bed0a78d4cb3e11b5b4bf34092aeace0d11f004305239" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:09:00.687168 systemd[1]: Started cri-containerd-ed44d8af4d4fdfb3c7c9a51ee93fcf6ab1d9a755a5b3654a65ac27ac8296a063.scope - libcontainer container 
ed44d8af4d4fdfb3c7c9a51ee93fcf6ab1d9a755a5b3654a65ac27ac8296a063. Sep 12 23:09:00.739422 containerd[1565]: time="2025-09-12T23:09:00.739381904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-dm7z7,Uid:a116dc92-7859-46a3-8040-2712231f2ea8,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"ed44d8af4d4fdfb3c7c9a51ee93fcf6ab1d9a755a5b3654a65ac27ac8296a063\"" Sep 12 23:09:00.740827 containerd[1565]: time="2025-09-12T23:09:00.740801063Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 23:09:00.940067 kubelet[2735]: I0912 23:09:00.939781 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-5n5mf" podStartSLOduration=1.939763898 podStartE2EDuration="1.939763898s" podCreationTimestamp="2025-09-12 23:08:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:09:00.939631926 +0000 UTC m=+7.235669667" watchObservedRunningTime="2025-09-12 23:09:00.939763898 +0000 UTC m=+7.235801639" Sep 12 23:09:02.414223 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount576637576.mount: Deactivated successfully. 
Sep 12 23:09:03.689130 containerd[1565]: time="2025-09-12T23:09:03.689050463Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:03.689787 containerd[1565]: time="2025-09-12T23:09:03.689716700Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 12 23:09:03.690866 containerd[1565]: time="2025-09-12T23:09:03.690831521Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:03.693073 containerd[1565]: time="2025-09-12T23:09:03.693039701Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:03.693598 containerd[1565]: time="2025-09-12T23:09:03.693560682Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 2.952594174s" Sep 12 23:09:03.693598 containerd[1565]: time="2025-09-12T23:09:03.693591381Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 12 23:09:03.698236 containerd[1565]: time="2025-09-12T23:09:03.698194707Z" level=info msg="CreateContainer within sandbox \"ed44d8af4d4fdfb3c7c9a51ee93fcf6ab1d9a755a5b3654a65ac27ac8296a063\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 23:09:03.708820 containerd[1565]: time="2025-09-12T23:09:03.708572718Z" level=info msg="Container 
162a4d4c19db5a346e87327bc16e848ede97e123d95dbc29c03899fb09684514: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:03.716161 containerd[1565]: time="2025-09-12T23:09:03.716110026Z" level=info msg="CreateContainer within sandbox \"ed44d8af4d4fdfb3c7c9a51ee93fcf6ab1d9a755a5b3654a65ac27ac8296a063\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"162a4d4c19db5a346e87327bc16e848ede97e123d95dbc29c03899fb09684514\"" Sep 12 23:09:03.716685 containerd[1565]: time="2025-09-12T23:09:03.716637108Z" level=info msg="StartContainer for \"162a4d4c19db5a346e87327bc16e848ede97e123d95dbc29c03899fb09684514\"" Sep 12 23:09:03.717392 containerd[1565]: time="2025-09-12T23:09:03.717366094Z" level=info msg="connecting to shim 162a4d4c19db5a346e87327bc16e848ede97e123d95dbc29c03899fb09684514" address="unix:///run/containerd/s/5a94f9f0ae11bf82481bed0a78d4cb3e11b5b4bf34092aeace0d11f004305239" protocol=ttrpc version=3 Sep 12 23:09:03.780792 systemd[1]: Started cri-containerd-162a4d4c19db5a346e87327bc16e848ede97e123d95dbc29c03899fb09684514.scope - libcontainer container 162a4d4c19db5a346e87327bc16e848ede97e123d95dbc29c03899fb09684514. 
Sep 12 23:09:03.812407 containerd[1565]: time="2025-09-12T23:09:03.812354862Z" level=info msg="StartContainer for \"162a4d4c19db5a346e87327bc16e848ede97e123d95dbc29c03899fb09684514\" returns successfully" Sep 12 23:09:03.944550 kubelet[2735]: I0912 23:09:03.943706 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-dm7z7" podStartSLOduration=0.989570857 podStartE2EDuration="3.943690898s" podCreationTimestamp="2025-09-12 23:09:00 +0000 UTC" firstStartedPulling="2025-09-12 23:09:00.740279177 +0000 UTC m=+7.036316908" lastFinishedPulling="2025-09-12 23:09:03.694399208 +0000 UTC m=+9.990436949" observedRunningTime="2025-09-12 23:09:03.943627717 +0000 UTC m=+10.239665458" watchObservedRunningTime="2025-09-12 23:09:03.943690898 +0000 UTC m=+10.239728639" Sep 12 23:09:05.852823 update_engine[1550]: I20250912 23:09:05.852711 1550 update_attempter.cc:509] Updating boot flags... Sep 12 23:09:09.065644 sudo[1782]: pam_unix(sudo:session): session closed for user root Sep 12 23:09:09.068057 sshd[1781]: Connection closed by 10.0.0.1 port 38526 Sep 12 23:09:09.069063 sshd-session[1778]: pam_unix(sshd:session): session closed for user core Sep 12 23:09:09.074583 systemd[1]: sshd@6-10.0.0.150:22-10.0.0.1:38526.service: Deactivated successfully. Sep 12 23:09:09.077999 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 23:09:09.078310 systemd[1]: session-7.scope: Consumed 4.856s CPU time, 227.1M memory peak. Sep 12 23:09:09.080250 systemd-logind[1546]: Session 7 logged out. Waiting for processes to exit. Sep 12 23:09:09.083049 systemd-logind[1546]: Removed session 7. Sep 12 23:09:12.687725 systemd[1]: Created slice kubepods-besteffort-podb5caa9b3_df8d_4362_8fbf_3bace9c30f67.slice - libcontainer container kubepods-besteffort-podb5caa9b3_df8d_4362_8fbf_3bace9c30f67.slice. 
Sep 12 23:09:12.699511 kubelet[2735]: I0912 23:09:12.699019 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2pbw\" (UniqueName: \"kubernetes.io/projected/b5caa9b3-df8d-4362-8fbf-3bace9c30f67-kube-api-access-b2pbw\") pod \"calico-typha-86b4b94f56-s8nq4\" (UID: \"b5caa9b3-df8d-4362-8fbf-3bace9c30f67\") " pod="calico-system/calico-typha-86b4b94f56-s8nq4" Sep 12 23:09:12.699511 kubelet[2735]: I0912 23:09:12.699076 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5caa9b3-df8d-4362-8fbf-3bace9c30f67-tigera-ca-bundle\") pod \"calico-typha-86b4b94f56-s8nq4\" (UID: \"b5caa9b3-df8d-4362-8fbf-3bace9c30f67\") " pod="calico-system/calico-typha-86b4b94f56-s8nq4" Sep 12 23:09:12.699511 kubelet[2735]: I0912 23:09:12.699126 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b5caa9b3-df8d-4362-8fbf-3bace9c30f67-typha-certs\") pod \"calico-typha-86b4b94f56-s8nq4\" (UID: \"b5caa9b3-df8d-4362-8fbf-3bace9c30f67\") " pod="calico-system/calico-typha-86b4b94f56-s8nq4" Sep 12 23:09:13.000504 containerd[1565]: time="2025-09-12T23:09:13.000423721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-86b4b94f56-s8nq4,Uid:b5caa9b3-df8d-4362-8fbf-3bace9c30f67,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:13.154783 systemd[1]: Created slice kubepods-besteffort-pod31972b5d_50ad_429a_a897_1ede3300f239.slice - libcontainer container kubepods-besteffort-pod31972b5d_50ad_429a_a897_1ede3300f239.slice. 
Sep 12 23:09:13.202918 kubelet[2735]: I0912 23:09:13.202855 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/31972b5d-50ad-429a-a897-1ede3300f239-lib-modules\") pod \"calico-node-kbs4n\" (UID: \"31972b5d-50ad-429a-a897-1ede3300f239\") " pod="calico-system/calico-node-kbs4n" Sep 12 23:09:13.202918 kubelet[2735]: I0912 23:09:13.202920 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31972b5d-50ad-429a-a897-1ede3300f239-tigera-ca-bundle\") pod \"calico-node-kbs4n\" (UID: \"31972b5d-50ad-429a-a897-1ede3300f239\") " pod="calico-system/calico-node-kbs4n" Sep 12 23:09:13.203185 kubelet[2735]: I0912 23:09:13.202965 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/31972b5d-50ad-429a-a897-1ede3300f239-var-run-calico\") pod \"calico-node-kbs4n\" (UID: \"31972b5d-50ad-429a-a897-1ede3300f239\") " pod="calico-system/calico-node-kbs4n" Sep 12 23:09:13.203185 kubelet[2735]: I0912 23:09:13.202984 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/31972b5d-50ad-429a-a897-1ede3300f239-var-lib-calico\") pod \"calico-node-kbs4n\" (UID: \"31972b5d-50ad-429a-a897-1ede3300f239\") " pod="calico-system/calico-node-kbs4n" Sep 12 23:09:13.203185 kubelet[2735]: I0912 23:09:13.203001 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/31972b5d-50ad-429a-a897-1ede3300f239-cni-bin-dir\") pod \"calico-node-kbs4n\" (UID: \"31972b5d-50ad-429a-a897-1ede3300f239\") " pod="calico-system/calico-node-kbs4n" Sep 12 23:09:13.203185 kubelet[2735]: I0912 23:09:13.203016 2735 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/31972b5d-50ad-429a-a897-1ede3300f239-cni-log-dir\") pod \"calico-node-kbs4n\" (UID: \"31972b5d-50ad-429a-a897-1ede3300f239\") " pod="calico-system/calico-node-kbs4n" Sep 12 23:09:13.203185 kubelet[2735]: I0912 23:09:13.203029 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/31972b5d-50ad-429a-a897-1ede3300f239-cni-net-dir\") pod \"calico-node-kbs4n\" (UID: \"31972b5d-50ad-429a-a897-1ede3300f239\") " pod="calico-system/calico-node-kbs4n" Sep 12 23:09:13.203422 kubelet[2735]: I0912 23:09:13.203051 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/31972b5d-50ad-429a-a897-1ede3300f239-policysync\") pod \"calico-node-kbs4n\" (UID: \"31972b5d-50ad-429a-a897-1ede3300f239\") " pod="calico-system/calico-node-kbs4n" Sep 12 23:09:13.203422 kubelet[2735]: I0912 23:09:13.203084 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/31972b5d-50ad-429a-a897-1ede3300f239-flexvol-driver-host\") pod \"calico-node-kbs4n\" (UID: \"31972b5d-50ad-429a-a897-1ede3300f239\") " pod="calico-system/calico-node-kbs4n" Sep 12 23:09:13.203422 kubelet[2735]: I0912 23:09:13.203110 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gm5z\" (UniqueName: \"kubernetes.io/projected/31972b5d-50ad-429a-a897-1ede3300f239-kube-api-access-6gm5z\") pod \"calico-node-kbs4n\" (UID: \"31972b5d-50ad-429a-a897-1ede3300f239\") " pod="calico-system/calico-node-kbs4n" Sep 12 23:09:13.203422 kubelet[2735]: I0912 23:09:13.203126 2735 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/31972b5d-50ad-429a-a897-1ede3300f239-node-certs\") pod \"calico-node-kbs4n\" (UID: \"31972b5d-50ad-429a-a897-1ede3300f239\") " pod="calico-system/calico-node-kbs4n" Sep 12 23:09:13.203422 kubelet[2735]: I0912 23:09:13.203139 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/31972b5d-50ad-429a-a897-1ede3300f239-xtables-lock\") pod \"calico-node-kbs4n\" (UID: \"31972b5d-50ad-429a-a897-1ede3300f239\") " pod="calico-system/calico-node-kbs4n" Sep 12 23:09:13.247632 containerd[1565]: time="2025-09-12T23:09:13.247544899Z" level=info msg="connecting to shim 94054cf3cd737aae34d4e6418ffd13bcd3c554ca6ec9be25fdf0b65db7a0f6b6" address="unix:///run/containerd/s/aff3330f8460bd32ea12183ac204f0d5508ee2c15af0f51cec919fa3e90b21fb" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:09:13.299113 systemd[1]: Started cri-containerd-94054cf3cd737aae34d4e6418ffd13bcd3c554ca6ec9be25fdf0b65db7a0f6b6.scope - libcontainer container 94054cf3cd737aae34d4e6418ffd13bcd3c554ca6ec9be25fdf0b65db7a0f6b6. Sep 12 23:09:13.306587 kubelet[2735]: E0912 23:09:13.306537 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.306712 kubelet[2735]: W0912 23:09:13.306594 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.306712 kubelet[2735]: E0912 23:09:13.306632 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.308033 kubelet[2735]: E0912 23:09:13.306961 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.308033 kubelet[2735]: W0912 23:09:13.306991 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.308033 kubelet[2735]: E0912 23:09:13.307013 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.308033 kubelet[2735]: E0912 23:09:13.307223 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.308033 kubelet[2735]: W0912 23:09:13.307233 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.308033 kubelet[2735]: E0912 23:09:13.307250 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.311228 kubelet[2735]: E0912 23:09:13.311200 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.311228 kubelet[2735]: W0912 23:09:13.311225 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.311372 kubelet[2735]: E0912 23:09:13.311244 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.311727 kubelet[2735]: E0912 23:09:13.311706 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.311727 kubelet[2735]: W0912 23:09:13.311724 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.312353 kubelet[2735]: E0912 23:09:13.311742 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.314020 kubelet[2735]: E0912 23:09:13.313970 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.314020 kubelet[2735]: W0912 23:09:13.313994 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.314141 kubelet[2735]: E0912 23:09:13.314051 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.318296 kubelet[2735]: E0912 23:09:13.318069 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.318296 kubelet[2735]: W0912 23:09:13.318090 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.318296 kubelet[2735]: E0912 23:09:13.318108 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.321734 kubelet[2735]: E0912 23:09:13.321555 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.322151 kubelet[2735]: W0912 23:09:13.321641 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.322151 kubelet[2735]: E0912 23:09:13.322056 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.323838 kubelet[2735]: E0912 23:09:13.322958 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.324115 kubelet[2735]: W0912 23:09:13.323990 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.324115 kubelet[2735]: E0912 23:09:13.324017 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.326426 kubelet[2735]: E0912 23:09:13.325610 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.327844 kubelet[2735]: W0912 23:09:13.325639 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.328675 kubelet[2735]: E0912 23:09:13.326578 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.332348 kubelet[2735]: E0912 23:09:13.332226 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.332348 kubelet[2735]: W0912 23:09:13.332278 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.332792 kubelet[2735]: E0912 23:09:13.332430 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.340618 kubelet[2735]: E0912 23:09:13.340581 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.340618 kubelet[2735]: W0912 23:09:13.340604 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.340618 kubelet[2735]: E0912 23:09:13.340622 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.396367 containerd[1565]: time="2025-09-12T23:09:13.396305592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-86b4b94f56-s8nq4,Uid:b5caa9b3-df8d-4362-8fbf-3bace9c30f67,Namespace:calico-system,Attempt:0,} returns sandbox id \"94054cf3cd737aae34d4e6418ffd13bcd3c554ca6ec9be25fdf0b65db7a0f6b6\"" Sep 12 23:09:13.400301 containerd[1565]: time="2025-09-12T23:09:13.399870728Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 23:09:13.464208 containerd[1565]: time="2025-09-12T23:09:13.464135271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kbs4n,Uid:31972b5d-50ad-429a-a897-1ede3300f239,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:13.509848 kubelet[2735]: E0912 23:09:13.509728 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bm7r8" podUID="0e4de4ee-2e8e-4032-b7ab-9b77d4141fea" Sep 12 23:09:13.584508 kubelet[2735]: E0912 23:09:13.584323 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 
23:09:13.584508 kubelet[2735]: W0912 23:09:13.584349 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.584508 kubelet[2735]: E0912 23:09:13.584370 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.584787 kubelet[2735]: E0912 23:09:13.584578 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.584787 kubelet[2735]: W0912 23:09:13.584590 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.584787 kubelet[2735]: E0912 23:09:13.584603 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.584895 kubelet[2735]: E0912 23:09:13.584875 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.584895 kubelet[2735]: W0912 23:09:13.584886 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.584969 kubelet[2735]: E0912 23:09:13.584898 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.585219 kubelet[2735]: E0912 23:09:13.585186 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.585219 kubelet[2735]: W0912 23:09:13.585223 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.585378 kubelet[2735]: E0912 23:09:13.585236 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.585728 kubelet[2735]: E0912 23:09:13.585693 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.585863 kubelet[2735]: W0912 23:09:13.585831 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.585951 kubelet[2735]: E0912 23:09:13.585872 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.586219 kubelet[2735]: E0912 23:09:13.586200 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.586219 kubelet[2735]: W0912 23:09:13.586212 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.586219 kubelet[2735]: E0912 23:09:13.586222 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.586495 kubelet[2735]: E0912 23:09:13.586461 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.586495 kubelet[2735]: W0912 23:09:13.586478 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.586495 kubelet[2735]: E0912 23:09:13.586489 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.587763 kubelet[2735]: E0912 23:09:13.587563 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.587763 kubelet[2735]: W0912 23:09:13.587584 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.587763 kubelet[2735]: E0912 23:09:13.587597 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.588118 kubelet[2735]: E0912 23:09:13.588091 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.588118 kubelet[2735]: W0912 23:09:13.588109 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.588118 kubelet[2735]: E0912 23:09:13.588122 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.588602 kubelet[2735]: E0912 23:09:13.588516 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.588602 kubelet[2735]: W0912 23:09:13.588532 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.588602 kubelet[2735]: E0912 23:09:13.588544 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.588771 kubelet[2735]: E0912 23:09:13.588761 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.588817 kubelet[2735]: W0912 23:09:13.588772 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.588817 kubelet[2735]: E0912 23:09:13.588784 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.589005 kubelet[2735]: E0912 23:09:13.588983 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.589209 kubelet[2735]: W0912 23:09:13.589082 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.589209 kubelet[2735]: E0912 23:09:13.589101 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.589533 kubelet[2735]: E0912 23:09:13.589510 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.589533 kubelet[2735]: W0912 23:09:13.589524 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.589533 kubelet[2735]: E0912 23:09:13.589537 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.589814 kubelet[2735]: E0912 23:09:13.589784 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.589814 kubelet[2735]: W0912 23:09:13.589807 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.589814 kubelet[2735]: E0912 23:09:13.589817 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.590039 kubelet[2735]: E0912 23:09:13.590015 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.590039 kubelet[2735]: W0912 23:09:13.590031 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.590195 kubelet[2735]: E0912 23:09:13.590042 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.590279 kubelet[2735]: E0912 23:09:13.590246 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.590279 kubelet[2735]: W0912 23:09:13.590256 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.590279 kubelet[2735]: E0912 23:09:13.590267 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.590492 kubelet[2735]: E0912 23:09:13.590472 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.590492 kubelet[2735]: W0912 23:09:13.590486 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.590492 kubelet[2735]: E0912 23:09:13.590497 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.590780 kubelet[2735]: E0912 23:09:13.590727 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.590780 kubelet[2735]: W0912 23:09:13.590737 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.590780 kubelet[2735]: E0912 23:09:13.590746 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.591044 kubelet[2735]: E0912 23:09:13.590937 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.591044 kubelet[2735]: W0912 23:09:13.590948 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.591044 kubelet[2735]: E0912 23:09:13.590958 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.591261 kubelet[2735]: E0912 23:09:13.591160 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.591261 kubelet[2735]: W0912 23:09:13.591171 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.591261 kubelet[2735]: E0912 23:09:13.591180 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.606856 kubelet[2735]: E0912 23:09:13.606781 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.606856 kubelet[2735]: W0912 23:09:13.606822 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.606856 kubelet[2735]: E0912 23:09:13.606847 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.607108 kubelet[2735]: I0912 23:09:13.606885 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e4de4ee-2e8e-4032-b7ab-9b77d4141fea-kubelet-dir\") pod \"csi-node-driver-bm7r8\" (UID: \"0e4de4ee-2e8e-4032-b7ab-9b77d4141fea\") " pod="calico-system/csi-node-driver-bm7r8" Sep 12 23:09:13.607198 kubelet[2735]: E0912 23:09:13.607160 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.607198 kubelet[2735]: W0912 23:09:13.607184 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.607198 kubelet[2735]: E0912 23:09:13.607197 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.607313 kubelet[2735]: I0912 23:09:13.607233 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0e4de4ee-2e8e-4032-b7ab-9b77d4141fea-socket-dir\") pod \"csi-node-driver-bm7r8\" (UID: \"0e4de4ee-2e8e-4032-b7ab-9b77d4141fea\") " pod="calico-system/csi-node-driver-bm7r8" Sep 12 23:09:13.607604 kubelet[2735]: E0912 23:09:13.607561 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.607604 kubelet[2735]: W0912 23:09:13.607589 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.607731 kubelet[2735]: E0912 23:09:13.607614 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.607886 kubelet[2735]: E0912 23:09:13.607853 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.607886 kubelet[2735]: W0912 23:09:13.607866 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.607886 kubelet[2735]: E0912 23:09:13.607876 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.608111 kubelet[2735]: E0912 23:09:13.608080 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.608111 kubelet[2735]: W0912 23:09:13.608091 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.608111 kubelet[2735]: E0912 23:09:13.608099 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.608214 kubelet[2735]: I0912 23:09:13.608130 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/0e4de4ee-2e8e-4032-b7ab-9b77d4141fea-varrun\") pod \"csi-node-driver-bm7r8\" (UID: \"0e4de4ee-2e8e-4032-b7ab-9b77d4141fea\") " pod="calico-system/csi-node-driver-bm7r8" Sep 12 23:09:13.608380 kubelet[2735]: E0912 23:09:13.608344 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.608380 kubelet[2735]: W0912 23:09:13.608361 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.608380 kubelet[2735]: E0912 23:09:13.608373 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.608589 kubelet[2735]: E0912 23:09:13.608565 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.608589 kubelet[2735]: W0912 23:09:13.608577 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.608589 kubelet[2735]: E0912 23:09:13.608588 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.608857 kubelet[2735]: E0912 23:09:13.608836 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.608857 kubelet[2735]: W0912 23:09:13.608850 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.608953 kubelet[2735]: E0912 23:09:13.608862 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.608953 kubelet[2735]: I0912 23:09:13.608889 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjcvd\" (UniqueName: \"kubernetes.io/projected/0e4de4ee-2e8e-4032-b7ab-9b77d4141fea-kube-api-access-bjcvd\") pod \"csi-node-driver-bm7r8\" (UID: \"0e4de4ee-2e8e-4032-b7ab-9b77d4141fea\") " pod="calico-system/csi-node-driver-bm7r8" Sep 12 23:09:13.609111 kubelet[2735]: E0912 23:09:13.609084 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.609111 kubelet[2735]: W0912 23:09:13.609099 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.609111 kubelet[2735]: E0912 23:09:13.609109 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.609337 kubelet[2735]: E0912 23:09:13.609311 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.609337 kubelet[2735]: W0912 23:09:13.609322 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.609337 kubelet[2735]: E0912 23:09:13.609331 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.609598 kubelet[2735]: E0912 23:09:13.609571 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.610341 kubelet[2735]: W0912 23:09:13.609583 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.610341 kubelet[2735]: E0912 23:09:13.610336 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.610601 kubelet[2735]: E0912 23:09:13.610582 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.610601 kubelet[2735]: W0912 23:09:13.610596 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.610693 kubelet[2735]: E0912 23:09:13.610607 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.610859 kubelet[2735]: E0912 23:09:13.610831 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.610859 kubelet[2735]: W0912 23:09:13.610843 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.610859 kubelet[2735]: E0912 23:09:13.610852 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.610963 kubelet[2735]: I0912 23:09:13.610872 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0e4de4ee-2e8e-4032-b7ab-9b77d4141fea-registration-dir\") pod \"csi-node-driver-bm7r8\" (UID: \"0e4de4ee-2e8e-4032-b7ab-9b77d4141fea\") " pod="calico-system/csi-node-driver-bm7r8" Sep 12 23:09:13.611109 kubelet[2735]: E0912 23:09:13.611090 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.611109 kubelet[2735]: W0912 23:09:13.611101 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.611169 kubelet[2735]: E0912 23:09:13.611110 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.611305 kubelet[2735]: E0912 23:09:13.611287 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.611305 kubelet[2735]: W0912 23:09:13.611298 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.611305 kubelet[2735]: E0912 23:09:13.611306 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.712029 kubelet[2735]: E0912 23:09:13.711971 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.712029 kubelet[2735]: W0912 23:09:13.712003 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.712029 kubelet[2735]: E0912 23:09:13.712033 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.712710 kubelet[2735]: E0912 23:09:13.712245 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.712710 kubelet[2735]: W0912 23:09:13.712253 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.712710 kubelet[2735]: E0912 23:09:13.712261 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.712710 kubelet[2735]: E0912 23:09:13.712493 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.712710 kubelet[2735]: W0912 23:09:13.712501 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.712710 kubelet[2735]: E0912 23:09:13.712509 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.712710 kubelet[2735]: E0912 23:09:13.712712 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.712873 kubelet[2735]: W0912 23:09:13.712721 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.712873 kubelet[2735]: E0912 23:09:13.712730 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.713137 kubelet[2735]: E0912 23:09:13.713092 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.713137 kubelet[2735]: W0912 23:09:13.713125 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.713192 kubelet[2735]: E0912 23:09:13.713148 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.713400 kubelet[2735]: E0912 23:09:13.713382 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.713400 kubelet[2735]: W0912 23:09:13.713393 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.713451 kubelet[2735]: E0912 23:09:13.713402 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.713623 kubelet[2735]: E0912 23:09:13.713600 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.713623 kubelet[2735]: W0912 23:09:13.713612 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.713623 kubelet[2735]: E0912 23:09:13.713621 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.713856 kubelet[2735]: E0912 23:09:13.713833 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.713856 kubelet[2735]: W0912 23:09:13.713845 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.713856 kubelet[2735]: E0912 23:09:13.713853 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.714045 kubelet[2735]: E0912 23:09:13.714029 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.714045 kubelet[2735]: W0912 23:09:13.714040 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.714095 kubelet[2735]: E0912 23:09:13.714048 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.714231 kubelet[2735]: E0912 23:09:13.714218 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.714231 kubelet[2735]: W0912 23:09:13.714227 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.714276 kubelet[2735]: E0912 23:09:13.714236 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.714430 kubelet[2735]: E0912 23:09:13.714415 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.714430 kubelet[2735]: W0912 23:09:13.714425 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.714479 kubelet[2735]: E0912 23:09:13.714434 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.714666 kubelet[2735]: E0912 23:09:13.714630 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.714666 kubelet[2735]: W0912 23:09:13.714642 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.714726 kubelet[2735]: E0912 23:09:13.714676 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:13.714994 kubelet[2735]: E0912 23:09:13.714962 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.714994 kubelet[2735]: W0912 23:09:13.714980 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.714994 kubelet[2735]: E0912 23:09:13.714992 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:13.790783 kubelet[2735]: E0912 23:09:13.790721 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:13.790783 kubelet[2735]: W0912 23:09:13.790754 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:13.790783 kubelet[2735]: E0912 23:09:13.790799 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:14.007353 containerd[1565]: time="2025-09-12T23:09:14.007272585Z" level=info msg="connecting to shim 8157f18a602af92a7ef5ecc4dcff84f8f00aa80a70cd80bafc2a0f72c3b40081" address="unix:///run/containerd/s/1a455a0ac31b7703e71ca6838255c5bd4e498d51f7c46cde789294f1cc478ddc" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:09:14.040852 systemd[1]: Started cri-containerd-8157f18a602af92a7ef5ecc4dcff84f8f00aa80a70cd80bafc2a0f72c3b40081.scope - libcontainer container 8157f18a602af92a7ef5ecc4dcff84f8f00aa80a70cd80bafc2a0f72c3b40081. 
Sep 12 23:09:14.195629 containerd[1565]: time="2025-09-12T23:09:14.195546629Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kbs4n,Uid:31972b5d-50ad-429a-a897-1ede3300f239,Namespace:calico-system,Attempt:0,} returns sandbox id \"8157f18a602af92a7ef5ecc4dcff84f8f00aa80a70cd80bafc2a0f72c3b40081\"" Sep 12 23:09:14.898979 kubelet[2735]: E0912 23:09:14.898854 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bm7r8" podUID="0e4de4ee-2e8e-4032-b7ab-9b77d4141fea" Sep 12 23:09:15.220049 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount654682215.mount: Deactivated successfully. Sep 12 23:09:15.966194 containerd[1565]: time="2025-09-12T23:09:15.966126644Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:15.987407 containerd[1565]: time="2025-09-12T23:09:15.987375826Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 12 23:09:16.042048 containerd[1565]: time="2025-09-12T23:09:16.041970083Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:16.053521 containerd[1565]: time="2025-09-12T23:09:16.053417013Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:16.054094 containerd[1565]: time="2025-09-12T23:09:16.053988632Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id 
\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.654076234s" Sep 12 23:09:16.054094 containerd[1565]: time="2025-09-12T23:09:16.054053434Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 12 23:09:16.055403 containerd[1565]: time="2025-09-12T23:09:16.055375719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 23:09:16.071355 containerd[1565]: time="2025-09-12T23:09:16.071299777Z" level=info msg="CreateContainer within sandbox \"94054cf3cd737aae34d4e6418ffd13bcd3c554ca6ec9be25fdf0b65db7a0f6b6\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 23:09:16.082062 containerd[1565]: time="2025-09-12T23:09:16.081994037Z" level=info msg="Container d09e75a06ce1465e1d66c7db166bee7e744fb39e32bf7844f25a8ad00a793f90: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:16.090179 containerd[1565]: time="2025-09-12T23:09:16.090121202Z" level=info msg="CreateContainer within sandbox \"94054cf3cd737aae34d4e6418ffd13bcd3c554ca6ec9be25fdf0b65db7a0f6b6\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d09e75a06ce1465e1d66c7db166bee7e744fb39e32bf7844f25a8ad00a793f90\"" Sep 12 23:09:16.090738 containerd[1565]: time="2025-09-12T23:09:16.090707558Z" level=info msg="StartContainer for \"d09e75a06ce1465e1d66c7db166bee7e744fb39e32bf7844f25a8ad00a793f90\"" Sep 12 23:09:16.091730 containerd[1565]: time="2025-09-12T23:09:16.091687347Z" level=info msg="connecting to shim d09e75a06ce1465e1d66c7db166bee7e744fb39e32bf7844f25a8ad00a793f90" address="unix:///run/containerd/s/aff3330f8460bd32ea12183ac204f0d5508ee2c15af0f51cec919fa3e90b21fb" protocol=ttrpc version=3 Sep 12 
23:09:16.121855 systemd[1]: Started cri-containerd-d09e75a06ce1465e1d66c7db166bee7e744fb39e32bf7844f25a8ad00a793f90.scope - libcontainer container d09e75a06ce1465e1d66c7db166bee7e744fb39e32bf7844f25a8ad00a793f90. Sep 12 23:09:16.200678 containerd[1565]: time="2025-09-12T23:09:16.200314601Z" level=info msg="StartContainer for \"d09e75a06ce1465e1d66c7db166bee7e744fb39e32bf7844f25a8ad00a793f90\" returns successfully" Sep 12 23:09:16.898919 kubelet[2735]: E0912 23:09:16.898866 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bm7r8" podUID="0e4de4ee-2e8e-4032-b7ab-9b77d4141fea" Sep 12 23:09:16.990798 kubelet[2735]: I0912 23:09:16.990724 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-86b4b94f56-s8nq4" podStartSLOduration=2.333557116 podStartE2EDuration="4.990692508s" podCreationTimestamp="2025-09-12 23:09:12 +0000 UTC" firstStartedPulling="2025-09-12 23:09:13.398008029 +0000 UTC m=+19.694045770" lastFinishedPulling="2025-09-12 23:09:16.055143421 +0000 UTC m=+22.351181162" observedRunningTime="2025-09-12 23:09:16.990510925 +0000 UTC m=+23.286548666" watchObservedRunningTime="2025-09-12 23:09:16.990692508 +0000 UTC m=+23.286730249" Sep 12 23:09:17.016006 kubelet[2735]: E0912 23:09:17.015975 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:17.016006 kubelet[2735]: W0912 23:09:17.016001 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:17.016116 kubelet[2735]: E0912 23:09:17.016023 2735 plugins.go:703] "Error dynamically probing plugins" err="error 
creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:17.016187 kubelet[2735]: E0912 23:09:17.016172 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:17.016187 kubelet[2735]: W0912 23:09:17.016183 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:17.016239 kubelet[2735]: E0912 23:09:17.016191 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:17.016345 kubelet[2735]: E0912 23:09:17.016330 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:17.016345 kubelet[2735]: W0912 23:09:17.016340 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:17.016399 kubelet[2735]: E0912 23:09:17.016348 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:17.043309 kubelet[2735]: E0912 23:09:17.043293 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:17.043309 kubelet[2735]: W0912 23:09:17.043307 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:17.043371 kubelet[2735]: E0912 23:09:17.043318 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:17.043713 kubelet[2735]: E0912 23:09:17.043644 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:17.043713 kubelet[2735]: W0912 23:09:17.043680 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:17.043713 kubelet[2735]: E0912 23:09:17.043692 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:17.982767 kubelet[2735]: I0912 23:09:17.982690 2735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:09:18.025842 kubelet[2735]: E0912 23:09:18.025781 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.025842 kubelet[2735]: W0912 23:09:18.025815 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.025842 kubelet[2735]: E0912 23:09:18.025842 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.026199 kubelet[2735]: E0912 23:09:18.026168 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.026199 kubelet[2735]: W0912 23:09:18.026184 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.026288 kubelet[2735]: E0912 23:09:18.026207 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.026440 kubelet[2735]: E0912 23:09:18.026417 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.026440 kubelet[2735]: W0912 23:09:18.026428 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.026440 kubelet[2735]: E0912 23:09:18.026437 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.026639 kubelet[2735]: E0912 23:09:18.026624 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.026639 kubelet[2735]: W0912 23:09:18.026634 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.026764 kubelet[2735]: E0912 23:09:18.026643 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.026917 kubelet[2735]: E0912 23:09:18.026893 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.026917 kubelet[2735]: W0912 23:09:18.026903 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.026917 kubelet[2735]: E0912 23:09:18.026912 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.027103 kubelet[2735]: E0912 23:09:18.027088 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.027103 kubelet[2735]: W0912 23:09:18.027098 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.027170 kubelet[2735]: E0912 23:09:18.027106 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.027290 kubelet[2735]: E0912 23:09:18.027276 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.027290 kubelet[2735]: W0912 23:09:18.027285 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.027290 kubelet[2735]: E0912 23:09:18.027293 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.027739 kubelet[2735]: E0912 23:09:18.027705 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.027739 kubelet[2735]: W0912 23:09:18.027726 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.027739 kubelet[2735]: E0912 23:09:18.027736 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.027974 kubelet[2735]: E0912 23:09:18.027960 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.027974 kubelet[2735]: W0912 23:09:18.027969 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.027974 kubelet[2735]: E0912 23:09:18.027978 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.028177 kubelet[2735]: E0912 23:09:18.028150 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.028177 kubelet[2735]: W0912 23:09:18.028159 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.028177 kubelet[2735]: E0912 23:09:18.028167 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.028376 kubelet[2735]: E0912 23:09:18.028365 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.028376 kubelet[2735]: W0912 23:09:18.028373 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.028463 kubelet[2735]: E0912 23:09:18.028383 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.028693 kubelet[2735]: E0912 23:09:18.028638 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.028785 kubelet[2735]: W0912 23:09:18.028689 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.028785 kubelet[2735]: E0912 23:09:18.028734 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.029057 kubelet[2735]: E0912 23:09:18.029030 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.029057 kubelet[2735]: W0912 23:09:18.029045 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.029057 kubelet[2735]: E0912 23:09:18.029055 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.029352 kubelet[2735]: E0912 23:09:18.029288 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.029352 kubelet[2735]: W0912 23:09:18.029317 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.029352 kubelet[2735]: E0912 23:09:18.029331 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.029610 kubelet[2735]: E0912 23:09:18.029579 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.029610 kubelet[2735]: W0912 23:09:18.029603 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.029695 kubelet[2735]: E0912 23:09:18.029629 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.046219 kubelet[2735]: E0912 23:09:18.046175 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.046219 kubelet[2735]: W0912 23:09:18.046200 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.046219 kubelet[2735]: E0912 23:09:18.046219 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.046512 kubelet[2735]: E0912 23:09:18.046491 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.046512 kubelet[2735]: W0912 23:09:18.046505 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.046588 kubelet[2735]: E0912 23:09:18.046519 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.046848 kubelet[2735]: E0912 23:09:18.046829 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.046848 kubelet[2735]: W0912 23:09:18.046843 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.046951 kubelet[2735]: E0912 23:09:18.046854 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.047566 kubelet[2735]: E0912 23:09:18.047208 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.047566 kubelet[2735]: W0912 23:09:18.047248 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.047566 kubelet[2735]: E0912 23:09:18.047277 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.047705 kubelet[2735]: E0912 23:09:18.047677 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.047705 kubelet[2735]: W0912 23:09:18.047691 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.047705 kubelet[2735]: E0912 23:09:18.047704 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.047988 kubelet[2735]: E0912 23:09:18.047969 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.048100 kubelet[2735]: W0912 23:09:18.048067 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.048100 kubelet[2735]: E0912 23:09:18.048085 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.048414 kubelet[2735]: E0912 23:09:18.048381 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.048414 kubelet[2735]: W0912 23:09:18.048397 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.048414 kubelet[2735]: E0912 23:09:18.048409 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.048662 kubelet[2735]: E0912 23:09:18.048632 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.048689 kubelet[2735]: W0912 23:09:18.048669 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.048689 kubelet[2735]: E0912 23:09:18.048682 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.048954 kubelet[2735]: E0912 23:09:18.048928 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.048954 kubelet[2735]: W0912 23:09:18.048942 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.048954 kubelet[2735]: E0912 23:09:18.048952 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.049299 kubelet[2735]: E0912 23:09:18.049272 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.049299 kubelet[2735]: W0912 23:09:18.049291 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.049361 kubelet[2735]: E0912 23:09:18.049306 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.049529 kubelet[2735]: E0912 23:09:18.049505 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.049529 kubelet[2735]: W0912 23:09:18.049520 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.049581 kubelet[2735]: E0912 23:09:18.049531 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.049806 kubelet[2735]: E0912 23:09:18.049787 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.049806 kubelet[2735]: W0912 23:09:18.049801 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.049857 kubelet[2735]: E0912 23:09:18.049813 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.050045 kubelet[2735]: E0912 23:09:18.050027 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.050045 kubelet[2735]: W0912 23:09:18.050041 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.050097 kubelet[2735]: E0912 23:09:18.050053 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.050284 kubelet[2735]: E0912 23:09:18.050267 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.050284 kubelet[2735]: W0912 23:09:18.050281 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.050327 kubelet[2735]: E0912 23:09:18.050292 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.050542 kubelet[2735]: E0912 23:09:18.050524 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.050542 kubelet[2735]: W0912 23:09:18.050537 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.050599 kubelet[2735]: E0912 23:09:18.050548 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.050797 kubelet[2735]: E0912 23:09:18.050779 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.050797 kubelet[2735]: W0912 23:09:18.050793 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.050853 kubelet[2735]: E0912 23:09:18.050804 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.051249 kubelet[2735]: E0912 23:09:18.051205 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.051249 kubelet[2735]: W0912 23:09:18.051239 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.051307 kubelet[2735]: E0912 23:09:18.051264 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:09:18.051515 kubelet[2735]: E0912 23:09:18.051491 2735 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:09:18.051515 kubelet[2735]: W0912 23:09:18.051504 2735 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:09:18.051515 kubelet[2735]: E0912 23:09:18.051513 2735 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:09:18.898523 kubelet[2735]: E0912 23:09:18.898454 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bm7r8" podUID="0e4de4ee-2e8e-4032-b7ab-9b77d4141fea" Sep 12 23:09:19.032743 containerd[1565]: time="2025-09-12T23:09:19.032670861Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:19.033427 containerd[1565]: time="2025-09-12T23:09:19.033369166Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 23:09:19.034521 containerd[1565]: time="2025-09-12T23:09:19.034484168Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:19.036466 containerd[1565]: time="2025-09-12T23:09:19.036422382Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:19.036955 containerd[1565]: time="2025-09-12T23:09:19.036926722Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 2.981519233s" Sep 12 23:09:19.037005 containerd[1565]: time="2025-09-12T23:09:19.036957691Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 23:09:19.041605 containerd[1565]: time="2025-09-12T23:09:19.041566729Z" level=info msg="CreateContainer within sandbox \"8157f18a602af92a7ef5ecc4dcff84f8f00aa80a70cd80bafc2a0f72c3b40081\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 23:09:19.052856 containerd[1565]: time="2025-09-12T23:09:19.052787518Z" level=info msg="Container dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:19.067427 containerd[1565]: time="2025-09-12T23:09:19.067382068Z" level=info msg="CreateContainer within sandbox \"8157f18a602af92a7ef5ecc4dcff84f8f00aa80a70cd80bafc2a0f72c3b40081\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a\"" Sep 12 23:09:19.068002 containerd[1565]: time="2025-09-12T23:09:19.067965498Z" level=info msg="StartContainer for \"dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a\"" Sep 12 23:09:19.069370 containerd[1565]: 
time="2025-09-12T23:09:19.069343736Z" level=info msg="connecting to shim dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a" address="unix:///run/containerd/s/1a455a0ac31b7703e71ca6838255c5bd4e498d51f7c46cde789294f1cc478ddc" protocol=ttrpc version=3 Sep 12 23:09:19.093813 systemd[1]: Started cri-containerd-dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a.scope - libcontainer container dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a. Sep 12 23:09:19.140862 containerd[1565]: time="2025-09-12T23:09:19.140804804Z" level=info msg="StartContainer for \"dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a\" returns successfully" Sep 12 23:09:19.149310 systemd[1]: cri-containerd-dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a.scope: Deactivated successfully. Sep 12 23:09:19.151422 containerd[1565]: time="2025-09-12T23:09:19.151388092Z" level=info msg="received exit event container_id:\"dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a\" id:\"dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a\" pid:3483 exited_at:{seconds:1757718559 nanos:151069131}" Sep 12 23:09:19.151550 containerd[1565]: time="2025-09-12T23:09:19.151499201Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a\" id:\"dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a\" pid:3483 exited_at:{seconds:1757718559 nanos:151069131}" Sep 12 23:09:19.174329 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dd4fd733532393092a24aec6f8b4f42df876b97b8cd494f13faaa1128752c75a-rootfs.mount: Deactivated successfully. 
Sep 12 23:09:19.990046 containerd[1565]: time="2025-09-12T23:09:19.989997235Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 23:09:20.899319 kubelet[2735]: E0912 23:09:20.899236 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bm7r8" podUID="0e4de4ee-2e8e-4032-b7ab-9b77d4141fea" Sep 12 23:09:22.898541 kubelet[2735]: E0912 23:09:22.898432 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bm7r8" podUID="0e4de4ee-2e8e-4032-b7ab-9b77d4141fea" Sep 12 23:09:23.207532 containerd[1565]: time="2025-09-12T23:09:23.207448662Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:23.227823 containerd[1565]: time="2025-09-12T23:09:23.227754923Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 12 23:09:23.265548 containerd[1565]: time="2025-09-12T23:09:23.265472315Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:23.284594 containerd[1565]: time="2025-09-12T23:09:23.284541066Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:23.285172 containerd[1565]: time="2025-09-12T23:09:23.285134022Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" 
with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.295097262s" Sep 12 23:09:23.285172 containerd[1565]: time="2025-09-12T23:09:23.285170191Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 12 23:09:23.351316 containerd[1565]: time="2025-09-12T23:09:23.351239261Z" level=info msg="CreateContainer within sandbox \"8157f18a602af92a7ef5ecc4dcff84f8f00aa80a70cd80bafc2a0f72c3b40081\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 23:09:23.491692 containerd[1565]: time="2025-09-12T23:09:23.491500887Z" level=info msg="Container b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:23.650117 containerd[1565]: time="2025-09-12T23:09:23.650054493Z" level=info msg="CreateContainer within sandbox \"8157f18a602af92a7ef5ecc4dcff84f8f00aa80a70cd80bafc2a0f72c3b40081\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5\"" Sep 12 23:09:23.650785 containerd[1565]: time="2025-09-12T23:09:23.650738580Z" level=info msg="StartContainer for \"b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5\"" Sep 12 23:09:23.652888 containerd[1565]: time="2025-09-12T23:09:23.652376524Z" level=info msg="connecting to shim b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5" address="unix:///run/containerd/s/1a455a0ac31b7703e71ca6838255c5bd4e498d51f7c46cde789294f1cc478ddc" protocol=ttrpc version=3 Sep 12 23:09:23.681961 systemd[1]: Started cri-containerd-b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5.scope - libcontainer container 
b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5. Sep 12 23:09:24.376681 containerd[1565]: time="2025-09-12T23:09:24.375931454Z" level=info msg="StartContainer for \"b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5\" returns successfully" Sep 12 23:09:24.898732 kubelet[2735]: E0912 23:09:24.898622 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bm7r8" podUID="0e4de4ee-2e8e-4032-b7ab-9b77d4141fea" Sep 12 23:09:26.898766 kubelet[2735]: E0912 23:09:26.898686 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bm7r8" podUID="0e4de4ee-2e8e-4032-b7ab-9b77d4141fea" Sep 12 23:09:27.702701 systemd[1]: cri-containerd-b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5.scope: Deactivated successfully. Sep 12 23:09:27.703555 systemd[1]: cri-containerd-b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5.scope: Consumed 608ms CPU time, 173.6M memory peak, 2.3M read from disk, 171.3M written to disk. 
Sep 12 23:09:27.704932 containerd[1565]: time="2025-09-12T23:09:27.704890157Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5\" id:\"b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5\" pid:3543 exited_at:{seconds:1757718567 nanos:703406004}" Sep 12 23:09:27.705921 containerd[1565]: time="2025-09-12T23:09:27.705432968Z" level=info msg="received exit event container_id:\"b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5\" id:\"b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5\" pid:3543 exited_at:{seconds:1757718567 nanos:703406004}" Sep 12 23:09:27.744103 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b8024cb9eb1ff1053afc5da4998813b0616bacde72cb4676b202218f49af5bd5-rootfs.mount: Deactivated successfully. Sep 12 23:09:27.756160 kubelet[2735]: I0912 23:09:27.756132 2735 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 23:09:28.372489 systemd[1]: Created slice kubepods-besteffort-pod689cc000_12e9_401f_b65b_8e76eeb53d27.slice - libcontainer container kubepods-besteffort-pod689cc000_12e9_401f_b65b_8e76eeb53d27.slice. Sep 12 23:09:28.525447 containerd[1565]: time="2025-09-12T23:09:28.525365041Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 23:09:28.532125 systemd[1]: Created slice kubepods-burstable-podbb420c54_7b0f_4c5c_8007_3e79ab20da98.slice - libcontainer container kubepods-burstable-podbb420c54_7b0f_4c5c_8007_3e79ab20da98.slice. 
Sep 12 23:09:28.540117 kubelet[2735]: I0912 23:09:28.539925 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/689cc000-12e9-401f-b65b-8e76eeb53d27-tigera-ca-bundle\") pod \"calico-kube-controllers-547fc7c647-tnhhc\" (UID: \"689cc000-12e9-401f-b65b-8e76eeb53d27\") " pod="calico-system/calico-kube-controllers-547fc7c647-tnhhc" Sep 12 23:09:28.540117 kubelet[2735]: I0912 23:09:28.540065 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqwdr\" (UniqueName: \"kubernetes.io/projected/689cc000-12e9-401f-b65b-8e76eeb53d27-kube-api-access-wqwdr\") pod \"calico-kube-controllers-547fc7c647-tnhhc\" (UID: \"689cc000-12e9-401f-b65b-8e76eeb53d27\") " pod="calico-system/calico-kube-controllers-547fc7c647-tnhhc" Sep 12 23:09:28.579826 systemd[1]: Created slice kubepods-besteffort-pod922b6645_484d_4129_a67b_ab9327cccfde.slice - libcontainer container kubepods-besteffort-pod922b6645_484d_4129_a67b_ab9327cccfde.slice. Sep 12 23:09:28.588602 systemd[1]: Created slice kubepods-burstable-podee957723_a32e_4e2c_b9a1_bcecc48992cf.slice - libcontainer container kubepods-burstable-podee957723_a32e_4e2c_b9a1_bcecc48992cf.slice. Sep 12 23:09:28.596765 systemd[1]: Created slice kubepods-besteffort-pod8f0d555c_c909_4b23_ab0e_a3117d51e702.slice - libcontainer container kubepods-besteffort-pod8f0d555c_c909_4b23_ab0e_a3117d51e702.slice. Sep 12 23:09:28.604273 systemd[1]: Created slice kubepods-besteffort-pod908cf082_bf9d_4477_b109_2080b0bf0e32.slice - libcontainer container kubepods-besteffort-pod908cf082_bf9d_4477_b109_2080b0bf0e32.slice. Sep 12 23:09:28.610459 systemd[1]: Created slice kubepods-besteffort-pode769e284_06fa_4f9a_98a3_28a76d353dce.slice - libcontainer container kubepods-besteffort-pode769e284_06fa_4f9a_98a3_28a76d353dce.slice. 
Sep 12 23:09:28.615804 systemd[1]: Created slice kubepods-besteffort-pod33f18c28_0d65_4d63_9999_ef624c2e4338.slice - libcontainer container kubepods-besteffort-pod33f18c28_0d65_4d63_9999_ef624c2e4338.slice. Sep 12 23:09:28.641686 kubelet[2735]: I0912 23:09:28.641300 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/922b6645-484d-4129-a67b-ab9327cccfde-whisker-ca-bundle\") pod \"whisker-7846745f6f-tjjpq\" (UID: \"922b6645-484d-4129-a67b-ab9327cccfde\") " pod="calico-system/whisker-7846745f6f-tjjpq" Sep 12 23:09:28.642103 kubelet[2735]: I0912 23:09:28.641689 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgk7m\" (UniqueName: \"kubernetes.io/projected/908cf082-bf9d-4477-b109-2080b0bf0e32-kube-api-access-mgk7m\") pod \"goldmane-54d579b49d-xn4hp\" (UID: \"908cf082-bf9d-4477-b109-2080b0bf0e32\") " pod="calico-system/goldmane-54d579b49d-xn4hp" Sep 12 23:09:28.642103 kubelet[2735]: I0912 23:09:28.641723 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh7df\" (UniqueName: \"kubernetes.io/projected/ee957723-a32e-4e2c-b9a1-bcecc48992cf-kube-api-access-bh7df\") pod \"coredns-674b8bbfcf-w8xwj\" (UID: \"ee957723-a32e-4e2c-b9a1-bcecc48992cf\") " pod="kube-system/coredns-674b8bbfcf-w8xwj" Sep 12 23:09:28.642103 kubelet[2735]: I0912 23:09:28.641809 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/908cf082-bf9d-4477-b109-2080b0bf0e32-config\") pod \"goldmane-54d579b49d-xn4hp\" (UID: \"908cf082-bf9d-4477-b109-2080b0bf0e32\") " pod="calico-system/goldmane-54d579b49d-xn4hp" Sep 12 23:09:28.642103 kubelet[2735]: I0912 23:09:28.641846 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e769e284-06fa-4f9a-98a3-28a76d353dce-calico-apiserver-certs\") pod \"calico-apiserver-8f4ff4584-wsjrr\" (UID: \"e769e284-06fa-4f9a-98a3-28a76d353dce\") " pod="calico-apiserver/calico-apiserver-8f4ff4584-wsjrr" Sep 12 23:09:28.642103 kubelet[2735]: I0912 23:09:28.641870 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvhrw\" (UniqueName: \"kubernetes.io/projected/e769e284-06fa-4f9a-98a3-28a76d353dce-kube-api-access-tvhrw\") pod \"calico-apiserver-8f4ff4584-wsjrr\" (UID: \"e769e284-06fa-4f9a-98a3-28a76d353dce\") " pod="calico-apiserver/calico-apiserver-8f4ff4584-wsjrr" Sep 12 23:09:28.643689 kubelet[2735]: I0912 23:09:28.641898 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb420c54-7b0f-4c5c-8007-3e79ab20da98-config-volume\") pod \"coredns-674b8bbfcf-r4tmb\" (UID: \"bb420c54-7b0f-4c5c-8007-3e79ab20da98\") " pod="kube-system/coredns-674b8bbfcf-r4tmb" Sep 12 23:09:28.643689 kubelet[2735]: I0912 23:09:28.641924 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/33f18c28-0d65-4d63-9999-ef624c2e4338-calico-apiserver-certs\") pod \"calico-apiserver-6bfd95f59f-wx5z6\" (UID: \"33f18c28-0d65-4d63-9999-ef624c2e4338\") " pod="calico-apiserver/calico-apiserver-6bfd95f59f-wx5z6" Sep 12 23:09:28.643689 kubelet[2735]: I0912 23:09:28.641951 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/922b6645-484d-4129-a67b-ab9327cccfde-whisker-backend-key-pair\") pod \"whisker-7846745f6f-tjjpq\" (UID: \"922b6645-484d-4129-a67b-ab9327cccfde\") " pod="calico-system/whisker-7846745f6f-tjjpq" Sep 12 23:09:28.643689 kubelet[2735]: 
I0912 23:09:28.641975 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79nsp\" (UniqueName: \"kubernetes.io/projected/922b6645-484d-4129-a67b-ab9327cccfde-kube-api-access-79nsp\") pod \"whisker-7846745f6f-tjjpq\" (UID: \"922b6645-484d-4129-a67b-ab9327cccfde\") " pod="calico-system/whisker-7846745f6f-tjjpq" Sep 12 23:09:28.643689 kubelet[2735]: I0912 23:09:28.642004 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/908cf082-bf9d-4477-b109-2080b0bf0e32-goldmane-key-pair\") pod \"goldmane-54d579b49d-xn4hp\" (UID: \"908cf082-bf9d-4477-b109-2080b0bf0e32\") " pod="calico-system/goldmane-54d579b49d-xn4hp" Sep 12 23:09:28.643831 kubelet[2735]: I0912 23:09:28.642063 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee957723-a32e-4e2c-b9a1-bcecc48992cf-config-volume\") pod \"coredns-674b8bbfcf-w8xwj\" (UID: \"ee957723-a32e-4e2c-b9a1-bcecc48992cf\") " pod="kube-system/coredns-674b8bbfcf-w8xwj" Sep 12 23:09:28.643831 kubelet[2735]: I0912 23:09:28.642102 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtmkk\" (UniqueName: \"kubernetes.io/projected/33f18c28-0d65-4d63-9999-ef624c2e4338-kube-api-access-jtmkk\") pod \"calico-apiserver-6bfd95f59f-wx5z6\" (UID: \"33f18c28-0d65-4d63-9999-ef624c2e4338\") " pod="calico-apiserver/calico-apiserver-6bfd95f59f-wx5z6" Sep 12 23:09:28.643831 kubelet[2735]: I0912 23:09:28.642127 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/908cf082-bf9d-4477-b109-2080b0bf0e32-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-xn4hp\" (UID: \"908cf082-bf9d-4477-b109-2080b0bf0e32\") " 
pod="calico-system/goldmane-54d579b49d-xn4hp" Sep 12 23:09:28.643831 kubelet[2735]: I0912 23:09:28.642155 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqm2k\" (UniqueName: \"kubernetes.io/projected/bb420c54-7b0f-4c5c-8007-3e79ab20da98-kube-api-access-lqm2k\") pod \"coredns-674b8bbfcf-r4tmb\" (UID: \"bb420c54-7b0f-4c5c-8007-3e79ab20da98\") " pod="kube-system/coredns-674b8bbfcf-r4tmb" Sep 12 23:09:28.643831 kubelet[2735]: I0912 23:09:28.642187 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8f0d555c-c909-4b23-ab0e-a3117d51e702-calico-apiserver-certs\") pod \"calico-apiserver-8f4ff4584-49xj7\" (UID: \"8f0d555c-c909-4b23-ab0e-a3117d51e702\") " pod="calico-apiserver/calico-apiserver-8f4ff4584-49xj7" Sep 12 23:09:28.643942 kubelet[2735]: I0912 23:09:28.642210 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx92x\" (UniqueName: \"kubernetes.io/projected/8f0d555c-c909-4b23-ab0e-a3117d51e702-kube-api-access-zx92x\") pod \"calico-apiserver-8f4ff4584-49xj7\" (UID: \"8f0d555c-c909-4b23-ab0e-a3117d51e702\") " pod="calico-apiserver/calico-apiserver-8f4ff4584-49xj7" Sep 12 23:09:28.676354 containerd[1565]: time="2025-09-12T23:09:28.676272774Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-547fc7c647-tnhhc,Uid:689cc000-12e9-401f-b65b-8e76eeb53d27,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:28.794733 containerd[1565]: time="2025-09-12T23:09:28.794640798Z" level=error msg="Failed to destroy network for sandbox \"5fb93cbfa0e11f58784a16583d856b6d19554eaa7cb2550b38c83f3d4b30b5fe\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:28.796549 
containerd[1565]: time="2025-09-12T23:09:28.796468035Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-547fc7c647-tnhhc,Uid:689cc000-12e9-401f-b65b-8e76eeb53d27,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fb93cbfa0e11f58784a16583d856b6d19554eaa7cb2550b38c83f3d4b30b5fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:28.802877 kubelet[2735]: E0912 23:09:28.802797 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fb93cbfa0e11f58784a16583d856b6d19554eaa7cb2550b38c83f3d4b30b5fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:28.802877 kubelet[2735]: E0912 23:09:28.802880 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fb93cbfa0e11f58784a16583d856b6d19554eaa7cb2550b38c83f3d4b30b5fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-547fc7c647-tnhhc" Sep 12 23:09:28.803027 kubelet[2735]: E0912 23:09:28.802902 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fb93cbfa0e11f58784a16583d856b6d19554eaa7cb2550b38c83f3d4b30b5fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-547fc7c647-tnhhc" Sep 12 23:09:28.803027 kubelet[2735]: E0912 23:09:28.802968 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-547fc7c647-tnhhc_calico-system(689cc000-12e9-401f-b65b-8e76eeb53d27)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-547fc7c647-tnhhc_calico-system(689cc000-12e9-401f-b65b-8e76eeb53d27)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5fb93cbfa0e11f58784a16583d856b6d19554eaa7cb2550b38c83f3d4b30b5fe\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-547fc7c647-tnhhc" podUID="689cc000-12e9-401f-b65b-8e76eeb53d27" Sep 12 23:09:28.836543 containerd[1565]: time="2025-09-12T23:09:28.836435812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-r4tmb,Uid:bb420c54-7b0f-4c5c-8007-3e79ab20da98,Namespace:kube-system,Attempt:0,}" Sep 12 23:09:28.884290 containerd[1565]: time="2025-09-12T23:09:28.884226392Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7846745f6f-tjjpq,Uid:922b6645-484d-4129-a67b-ab9327cccfde,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:28.892640 containerd[1565]: time="2025-09-12T23:09:28.892508989Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-w8xwj,Uid:ee957723-a32e-4e2c-b9a1-bcecc48992cf,Namespace:kube-system,Attempt:0,}" Sep 12 23:09:28.897880 containerd[1565]: time="2025-09-12T23:09:28.897834886Z" level=error msg="Failed to destroy network for sandbox \"8f57b2806b5e5a432680ab3baa6f1bf33ff90d7e5a4d94443d336d57d1e9cd1c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Sep 12 23:09:28.901763 containerd[1565]: time="2025-09-12T23:09:28.901316855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8f4ff4584-49xj7,Uid:8f0d555c-c909-4b23-ab0e-a3117d51e702,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:09:28.905715 systemd[1]: Created slice kubepods-besteffort-pod0e4de4ee_2e8e_4032_b7ab_9b77d4141fea.slice - libcontainer container kubepods-besteffort-pod0e4de4ee_2e8e_4032_b7ab_9b77d4141fea.slice. Sep 12 23:09:28.908182 containerd[1565]: time="2025-09-12T23:09:28.908146168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xn4hp,Uid:908cf082-bf9d-4477-b109-2080b0bf0e32,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:28.908266 containerd[1565]: time="2025-09-12T23:09:28.908146068Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bm7r8,Uid:0e4de4ee-2e8e-4032-b7ab-9b77d4141fea,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:28.913824 containerd[1565]: time="2025-09-12T23:09:28.913801675Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8f4ff4584-wsjrr,Uid:e769e284-06fa-4f9a-98a3-28a76d353dce,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:09:28.919468 containerd[1565]: time="2025-09-12T23:09:28.919433306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bfd95f59f-wx5z6,Uid:33f18c28-0d65-4d63-9999-ef624c2e4338,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:09:28.945776 containerd[1565]: time="2025-09-12T23:09:28.945683154Z" level=error msg="Failed to destroy network for sandbox \"74352abf4a0989add487ebaa2909803fc151af5a980a7ab4f8291b50c223eddd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:28.954783 containerd[1565]: time="2025-09-12T23:09:28.954690385Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-r4tmb,Uid:bb420c54-7b0f-4c5c-8007-3e79ab20da98,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f57b2806b5e5a432680ab3baa6f1bf33ff90d7e5a4d94443d336d57d1e9cd1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:28.955058 kubelet[2735]: E0912 23:09:28.954995 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f57b2806b5e5a432680ab3baa6f1bf33ff90d7e5a4d94443d336d57d1e9cd1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:28.955130 kubelet[2735]: E0912 23:09:28.955079 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f57b2806b5e5a432680ab3baa6f1bf33ff90d7e5a4d94443d336d57d1e9cd1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-r4tmb" Sep 12 23:09:28.955130 kubelet[2735]: E0912 23:09:28.955106 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f57b2806b5e5a432680ab3baa6f1bf33ff90d7e5a4d94443d336d57d1e9cd1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-r4tmb" Sep 12 23:09:28.955210 kubelet[2735]: E0912 23:09:28.955173 2735 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-r4tmb_kube-system(bb420c54-7b0f-4c5c-8007-3e79ab20da98)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-r4tmb_kube-system(bb420c54-7b0f-4c5c-8007-3e79ab20da98)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8f57b2806b5e5a432680ab3baa6f1bf33ff90d7e5a4d94443d336d57d1e9cd1c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-r4tmb" podUID="bb420c54-7b0f-4c5c-8007-3e79ab20da98" Sep 12 23:09:28.969926 containerd[1565]: time="2025-09-12T23:09:28.969859333Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7846745f6f-tjjpq,Uid:922b6645-484d-4129-a67b-ab9327cccfde,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"74352abf4a0989add487ebaa2909803fc151af5a980a7ab4f8291b50c223eddd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:28.970146 kubelet[2735]: E0912 23:09:28.970114 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"74352abf4a0989add487ebaa2909803fc151af5a980a7ab4f8291b50c223eddd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:28.970222 kubelet[2735]: E0912 23:09:28.970169 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"74352abf4a0989add487ebaa2909803fc151af5a980a7ab4f8291b50c223eddd\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7846745f6f-tjjpq" Sep 12 23:09:28.970222 kubelet[2735]: E0912 23:09:28.970197 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"74352abf4a0989add487ebaa2909803fc151af5a980a7ab4f8291b50c223eddd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7846745f6f-tjjpq" Sep 12 23:09:28.970295 kubelet[2735]: E0912 23:09:28.970264 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7846745f6f-tjjpq_calico-system(922b6645-484d-4129-a67b-ab9327cccfde)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7846745f6f-tjjpq_calico-system(922b6645-484d-4129-a67b-ab9327cccfde)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"74352abf4a0989add487ebaa2909803fc151af5a980a7ab4f8291b50c223eddd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7846745f6f-tjjpq" podUID="922b6645-484d-4129-a67b-ab9327cccfde" Sep 12 23:09:29.030381 containerd[1565]: time="2025-09-12T23:09:29.030317086Z" level=error msg="Failed to destroy network for sandbox \"b9aa492758ab3d4cba46388cdf471ad9772e2f69019dc1020dd1158e561feb02\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.037996 containerd[1565]: time="2025-09-12T23:09:29.037854398Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-w8xwj,Uid:ee957723-a32e-4e2c-b9a1-bcecc48992cf,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9aa492758ab3d4cba46388cdf471ad9772e2f69019dc1020dd1158e561feb02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.038194 kubelet[2735]: E0912 23:09:29.038116 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9aa492758ab3d4cba46388cdf471ad9772e2f69019dc1020dd1158e561feb02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.038194 kubelet[2735]: E0912 23:09:29.038179 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9aa492758ab3d4cba46388cdf471ad9772e2f69019dc1020dd1158e561feb02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-w8xwj" Sep 12 23:09:29.038308 kubelet[2735]: E0912 23:09:29.038200 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9aa492758ab3d4cba46388cdf471ad9772e2f69019dc1020dd1158e561feb02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-w8xwj" Sep 12 23:09:29.038308 kubelet[2735]: E0912 23:09:29.038270 2735 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-w8xwj_kube-system(ee957723-a32e-4e2c-b9a1-bcecc48992cf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-w8xwj_kube-system(ee957723-a32e-4e2c-b9a1-bcecc48992cf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b9aa492758ab3d4cba46388cdf471ad9772e2f69019dc1020dd1158e561feb02\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-w8xwj" podUID="ee957723-a32e-4e2c-b9a1-bcecc48992cf" Sep 12 23:09:29.082972 containerd[1565]: time="2025-09-12T23:09:29.082908717Z" level=error msg="Failed to destroy network for sandbox \"a43703eb77ae396110bc61d1f3ffc0400f8818d66005c44834b695e5f2821f4f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.084902 containerd[1565]: time="2025-09-12T23:09:29.084827054Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bm7r8,Uid:0e4de4ee-2e8e-4032-b7ab-9b77d4141fea,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a43703eb77ae396110bc61d1f3ffc0400f8818d66005c44834b695e5f2821f4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.085558 kubelet[2735]: E0912 23:09:29.085442 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a43703eb77ae396110bc61d1f3ffc0400f8818d66005c44834b695e5f2821f4f\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.085629 kubelet[2735]: E0912 23:09:29.085612 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a43703eb77ae396110bc61d1f3ffc0400f8818d66005c44834b695e5f2821f4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bm7r8" Sep 12 23:09:29.085701 kubelet[2735]: E0912 23:09:29.085679 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a43703eb77ae396110bc61d1f3ffc0400f8818d66005c44834b695e5f2821f4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bm7r8" Sep 12 23:09:29.086029 kubelet[2735]: E0912 23:09:29.085865 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-bm7r8_calico-system(0e4de4ee-2e8e-4032-b7ab-9b77d4141fea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-bm7r8_calico-system(0e4de4ee-2e8e-4032-b7ab-9b77d4141fea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a43703eb77ae396110bc61d1f3ffc0400f8818d66005c44834b695e5f2821f4f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-bm7r8" podUID="0e4de4ee-2e8e-4032-b7ab-9b77d4141fea" Sep 12 23:09:29.088145 containerd[1565]: time="2025-09-12T23:09:29.088056036Z" 
level=error msg="Failed to destroy network for sandbox \"0885462d19ff47ff323b1ec4bcc4cbe2fd43a9f67cad1a8470011f8d33841d9d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.091159 containerd[1565]: time="2025-09-12T23:09:29.090988249Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8f4ff4584-49xj7,Uid:8f0d555c-c909-4b23-ab0e-a3117d51e702,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0885462d19ff47ff323b1ec4bcc4cbe2fd43a9f67cad1a8470011f8d33841d9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.091568 kubelet[2735]: E0912 23:09:29.091513 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0885462d19ff47ff323b1ec4bcc4cbe2fd43a9f67cad1a8470011f8d33841d9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.091568 kubelet[2735]: E0912 23:09:29.091577 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0885462d19ff47ff323b1ec4bcc4cbe2fd43a9f67cad1a8470011f8d33841d9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8f4ff4584-49xj7" Sep 12 23:09:29.091568 kubelet[2735]: E0912 23:09:29.091600 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"0885462d19ff47ff323b1ec4bcc4cbe2fd43a9f67cad1a8470011f8d33841d9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8f4ff4584-49xj7" Sep 12 23:09:29.091964 kubelet[2735]: E0912 23:09:29.091894 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8f4ff4584-49xj7_calico-apiserver(8f0d555c-c909-4b23-ab0e-a3117d51e702)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8f4ff4584-49xj7_calico-apiserver(8f0d555c-c909-4b23-ab0e-a3117d51e702)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0885462d19ff47ff323b1ec4bcc4cbe2fd43a9f67cad1a8470011f8d33841d9d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8f4ff4584-49xj7" podUID="8f0d555c-c909-4b23-ab0e-a3117d51e702" Sep 12 23:09:29.108584 containerd[1565]: time="2025-09-12T23:09:29.108375444Z" level=error msg="Failed to destroy network for sandbox \"68e7536ec69ef3ee235a3829fd7d8a0d4912fcc38ab38c551ebd16ede5bdf61f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.109920 containerd[1565]: time="2025-09-12T23:09:29.109857761Z" level=error msg="Failed to destroy network for sandbox \"b9a65ce7d971f3a77caa1acf8c08b82e28039d812a7e6ef58e91c0111908def6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.111206 
containerd[1565]: time="2025-09-12T23:09:29.111171582Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bfd95f59f-wx5z6,Uid:33f18c28-0d65-4d63-9999-ef624c2e4338,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"68e7536ec69ef3ee235a3829fd7d8a0d4912fcc38ab38c551ebd16ede5bdf61f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.111573 kubelet[2735]: E0912 23:09:29.111515 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"68e7536ec69ef3ee235a3829fd7d8a0d4912fcc38ab38c551ebd16ede5bdf61f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.111674 kubelet[2735]: E0912 23:09:29.111607 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"68e7536ec69ef3ee235a3829fd7d8a0d4912fcc38ab38c551ebd16ede5bdf61f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6bfd95f59f-wx5z6" Sep 12 23:09:29.111720 kubelet[2735]: E0912 23:09:29.111641 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"68e7536ec69ef3ee235a3829fd7d8a0d4912fcc38ab38c551ebd16ede5bdf61f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-6bfd95f59f-wx5z6" Sep 12 23:09:29.111796 kubelet[2735]: E0912 23:09:29.111761 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6bfd95f59f-wx5z6_calico-apiserver(33f18c28-0d65-4d63-9999-ef624c2e4338)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6bfd95f59f-wx5z6_calico-apiserver(33f18c28-0d65-4d63-9999-ef624c2e4338)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"68e7536ec69ef3ee235a3829fd7d8a0d4912fcc38ab38c551ebd16ede5bdf61f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6bfd95f59f-wx5z6" podUID="33f18c28-0d65-4d63-9999-ef624c2e4338" Sep 12 23:09:29.112583 containerd[1565]: time="2025-09-12T23:09:29.112506782Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8f4ff4584-wsjrr,Uid:e769e284-06fa-4f9a-98a3-28a76d353dce,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9a65ce7d971f3a77caa1acf8c08b82e28039d812a7e6ef58e91c0111908def6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.112781 kubelet[2735]: E0912 23:09:29.112721 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9a65ce7d971f3a77caa1acf8c08b82e28039d812a7e6ef58e91c0111908def6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.112860 kubelet[2735]: E0912 23:09:29.112813 2735 
kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9a65ce7d971f3a77caa1acf8c08b82e28039d812a7e6ef58e91c0111908def6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8f4ff4584-wsjrr" Sep 12 23:09:29.112903 kubelet[2735]: E0912 23:09:29.112868 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9a65ce7d971f3a77caa1acf8c08b82e28039d812a7e6ef58e91c0111908def6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8f4ff4584-wsjrr" Sep 12 23:09:29.113027 kubelet[2735]: E0912 23:09:29.112951 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8f4ff4584-wsjrr_calico-apiserver(e769e284-06fa-4f9a-98a3-28a76d353dce)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8f4ff4584-wsjrr_calico-apiserver(e769e284-06fa-4f9a-98a3-28a76d353dce)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b9a65ce7d971f3a77caa1acf8c08b82e28039d812a7e6ef58e91c0111908def6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8f4ff4584-wsjrr" podUID="e769e284-06fa-4f9a-98a3-28a76d353dce" Sep 12 23:09:29.122813 containerd[1565]: time="2025-09-12T23:09:29.122756286Z" level=error msg="Failed to destroy network for sandbox \"bd7860383731c2345852bd71450c29f5c4857528ba2cff2a53f6a4e8e77e7de5\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.124265 containerd[1565]: time="2025-09-12T23:09:29.124212493Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xn4hp,Uid:908cf082-bf9d-4477-b109-2080b0bf0e32,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd7860383731c2345852bd71450c29f5c4857528ba2cff2a53f6a4e8e77e7de5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.124496 kubelet[2735]: E0912 23:09:29.124428 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd7860383731c2345852bd71450c29f5c4857528ba2cff2a53f6a4e8e77e7de5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:29.124555 kubelet[2735]: E0912 23:09:29.124503 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd7860383731c2345852bd71450c29f5c4857528ba2cff2a53f6a4e8e77e7de5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-xn4hp" Sep 12 23:09:29.124555 kubelet[2735]: E0912 23:09:29.124541 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd7860383731c2345852bd71450c29f5c4857528ba2cff2a53f6a4e8e77e7de5\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-xn4hp" Sep 12 23:09:29.124620 kubelet[2735]: E0912 23:09:29.124590 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-xn4hp_calico-system(908cf082-bf9d-4477-b109-2080b0bf0e32)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-xn4hp_calico-system(908cf082-bf9d-4477-b109-2080b0bf0e32)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bd7860383731c2345852bd71450c29f5c4857528ba2cff2a53f6a4e8e77e7de5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-xn4hp" podUID="908cf082-bf9d-4477-b109-2080b0bf0e32" Sep 12 23:09:29.753007 systemd[1]: run-netns-cni\x2d73fde5c3\x2d2f58\x2d0691\x2d8813\x2da910e039ad0d.mount: Deactivated successfully. Sep 12 23:09:32.295936 kubelet[2735]: I0912 23:09:32.295855 2735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:09:39.235420 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1753826324.mount: Deactivated successfully. 
Sep 12 23:09:40.899587 containerd[1565]: time="2025-09-12T23:09:40.899306857Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7846745f6f-tjjpq,Uid:922b6645-484d-4129-a67b-ab9327cccfde,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:40.899587 containerd[1565]: time="2025-09-12T23:09:40.899365156Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8f4ff4584-wsjrr,Uid:e769e284-06fa-4f9a-98a3-28a76d353dce,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:09:41.899642 containerd[1565]: time="2025-09-12T23:09:41.899560125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-547fc7c647-tnhhc,Uid:689cc000-12e9-401f-b65b-8e76eeb53d27,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:41.900192 containerd[1565]: time="2025-09-12T23:09:41.899983230Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bm7r8,Uid:0e4de4ee-2e8e-4032-b7ab-9b77d4141fea,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:42.295474 containerd[1565]: time="2025-09-12T23:09:42.295391762Z" level=error msg="Failed to destroy network for sandbox \"4422f8caf2108a86ed1f9cad45d8c557baca056ec82c79f2953fe6f96a700eb9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:42.298725 systemd[1]: run-netns-cni\x2df94c2f23\x2d5486\x2dd495\x2db705\x2da989cfc0c507.mount: Deactivated successfully. 
Sep 12 23:09:42.443948 containerd[1565]: time="2025-09-12T23:09:42.443887807Z" level=error msg="Failed to destroy network for sandbox \"bffc35f44735e9456f9445aae06339759fcd47238d3d387399951496dda4e47e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:42.607541 containerd[1565]: time="2025-09-12T23:09:42.607395547Z" level=error msg="Failed to destroy network for sandbox \"4c55727280737d505a05acd915a02134f14e60a5e7ad3bf43d7c179350f7c6a3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:42.757520 containerd[1565]: time="2025-09-12T23:09:42.757451700Z" level=error msg="Failed to destroy network for sandbox \"d1c8940bd4febf453a31c09f5e9c271b70b533854195d479116f2dd2515bf035\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:42.899493 containerd[1565]: time="2025-09-12T23:09:42.899345128Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-r4tmb,Uid:bb420c54-7b0f-4c5c-8007-3e79ab20da98,Namespace:kube-system,Attempt:0,}" Sep 12 23:09:42.899493 containerd[1565]: time="2025-09-12T23:09:42.899399891Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-w8xwj,Uid:ee957723-a32e-4e2c-b9a1-bcecc48992cf,Namespace:kube-system,Attempt:0,}" Sep 12 23:09:42.899843 containerd[1565]: time="2025-09-12T23:09:42.899766900Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xn4hp,Uid:908cf082-bf9d-4477-b109-2080b0bf0e32,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:42.973747 systemd[1]: 
run-netns-cni\x2de1c0b271\x2d3f81\x2d3d34\x2d269b\x2d1c16b0399aca.mount: Deactivated successfully. Sep 12 23:09:42.973880 systemd[1]: run-netns-cni\x2d8fe53649\x2dbadd\x2d9dc7\x2d4ceb\x2d8019b407b7ab.mount: Deactivated successfully. Sep 12 23:09:42.973957 systemd[1]: run-netns-cni\x2d40481d7e\x2dd379\x2d006e\x2d5c0f\x2d70aba57f2ca5.mount: Deactivated successfully. Sep 12 23:09:43.011490 containerd[1565]: time="2025-09-12T23:09:43.011418738Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7846745f6f-tjjpq,Uid:922b6645-484d-4129-a67b-ab9327cccfde,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4422f8caf2108a86ed1f9cad45d8c557baca056ec82c79f2953fe6f96a700eb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.013809 kubelet[2735]: E0912 23:09:43.013739 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4422f8caf2108a86ed1f9cad45d8c557baca056ec82c79f2953fe6f96a700eb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.013809 kubelet[2735]: E0912 23:09:43.013820 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4422f8caf2108a86ed1f9cad45d8c557baca056ec82c79f2953fe6f96a700eb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7846745f6f-tjjpq" Sep 12 23:09:43.013809 kubelet[2735]: E0912 23:09:43.013847 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4422f8caf2108a86ed1f9cad45d8c557baca056ec82c79f2953fe6f96a700eb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7846745f6f-tjjpq" Sep 12 23:09:43.014588 kubelet[2735]: E0912 23:09:43.013961 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7846745f6f-tjjpq_calico-system(922b6645-484d-4129-a67b-ab9327cccfde)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7846745f6f-tjjpq_calico-system(922b6645-484d-4129-a67b-ab9327cccfde)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4422f8caf2108a86ed1f9cad45d8c557baca056ec82c79f2953fe6f96a700eb9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7846745f6f-tjjpq" podUID="922b6645-484d-4129-a67b-ab9327cccfde" Sep 12 23:09:43.015041 containerd[1565]: time="2025-09-12T23:09:43.014222061Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8f4ff4584-wsjrr,Uid:e769e284-06fa-4f9a-98a3-28a76d353dce,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bffc35f44735e9456f9445aae06339759fcd47238d3d387399951496dda4e47e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.015307 kubelet[2735]: E0912 23:09:43.015246 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"bffc35f44735e9456f9445aae06339759fcd47238d3d387399951496dda4e47e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.015438 kubelet[2735]: E0912 23:09:43.015314 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bffc35f44735e9456f9445aae06339759fcd47238d3d387399951496dda4e47e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8f4ff4584-wsjrr" Sep 12 23:09:43.015438 kubelet[2735]: E0912 23:09:43.015334 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bffc35f44735e9456f9445aae06339759fcd47238d3d387399951496dda4e47e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8f4ff4584-wsjrr" Sep 12 23:09:43.015524 kubelet[2735]: E0912 23:09:43.015446 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8f4ff4584-wsjrr_calico-apiserver(e769e284-06fa-4f9a-98a3-28a76d353dce)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8f4ff4584-wsjrr_calico-apiserver(e769e284-06fa-4f9a-98a3-28a76d353dce)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bffc35f44735e9456f9445aae06339759fcd47238d3d387399951496dda4e47e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-8f4ff4584-wsjrr" podUID="e769e284-06fa-4f9a-98a3-28a76d353dce" Sep 12 23:09:43.020026 containerd[1565]: time="2025-09-12T23:09:43.019925598Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-547fc7c647-tnhhc,Uid:689cc000-12e9-401f-b65b-8e76eeb53d27,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c55727280737d505a05acd915a02134f14e60a5e7ad3bf43d7c179350f7c6a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.021043 kubelet[2735]: E0912 23:09:43.020950 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c55727280737d505a05acd915a02134f14e60a5e7ad3bf43d7c179350f7c6a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.021307 kubelet[2735]: E0912 23:09:43.021256 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c55727280737d505a05acd915a02134f14e60a5e7ad3bf43d7c179350f7c6a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-547fc7c647-tnhhc" Sep 12 23:09:43.021307 kubelet[2735]: E0912 23:09:43.021288 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c55727280737d505a05acd915a02134f14e60a5e7ad3bf43d7c179350f7c6a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-547fc7c647-tnhhc" Sep 12 23:09:43.021543 kubelet[2735]: E0912 23:09:43.021376 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-547fc7c647-tnhhc_calico-system(689cc000-12e9-401f-b65b-8e76eeb53d27)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-547fc7c647-tnhhc_calico-system(689cc000-12e9-401f-b65b-8e76eeb53d27)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4c55727280737d505a05acd915a02134f14e60a5e7ad3bf43d7c179350f7c6a3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-547fc7c647-tnhhc" podUID="689cc000-12e9-401f-b65b-8e76eeb53d27" Sep 12 23:09:43.022348 containerd[1565]: time="2025-09-12T23:09:43.022263317Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bm7r8,Uid:0e4de4ee-2e8e-4032-b7ab-9b77d4141fea,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1c8940bd4febf453a31c09f5e9c271b70b533854195d479116f2dd2515bf035\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.023904 kubelet[2735]: E0912 23:09:43.023838 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1c8940bd4febf453a31c09f5e9c271b70b533854195d479116f2dd2515bf035\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Sep 12 23:09:43.023904 kubelet[2735]: E0912 23:09:43.023907 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1c8940bd4febf453a31c09f5e9c271b70b533854195d479116f2dd2515bf035\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bm7r8" Sep 12 23:09:43.024104 kubelet[2735]: E0912 23:09:43.023926 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1c8940bd4febf453a31c09f5e9c271b70b533854195d479116f2dd2515bf035\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bm7r8" Sep 12 23:09:43.024104 kubelet[2735]: E0912 23:09:43.023978 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-bm7r8_calico-system(0e4de4ee-2e8e-4032-b7ab-9b77d4141fea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-bm7r8_calico-system(0e4de4ee-2e8e-4032-b7ab-9b77d4141fea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d1c8940bd4febf453a31c09f5e9c271b70b533854195d479116f2dd2515bf035\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-bm7r8" podUID="0e4de4ee-2e8e-4032-b7ab-9b77d4141fea" Sep 12 23:09:43.027228 containerd[1565]: time="2025-09-12T23:09:43.027165551Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Sep 12 23:09:43.036758 containerd[1565]: time="2025-09-12T23:09:43.036580967Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 23:09:43.053680 containerd[1565]: time="2025-09-12T23:09:43.053231366Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:43.063453 containerd[1565]: time="2025-09-12T23:09:43.063375721Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:43.063907 containerd[1565]: time="2025-09-12T23:09:43.063837537Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 14.538428493s" Sep 12 23:09:43.064445 containerd[1565]: time="2025-09-12T23:09:43.064422466Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 23:09:43.127257 containerd[1565]: time="2025-09-12T23:09:43.127177491Z" level=error msg="Failed to destroy network for sandbox \"0a947c9ee921c15e328cb6da5cea68eebeb1c0af018ab4baf57ae2a4e1e92f64\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.127774 containerd[1565]: time="2025-09-12T23:09:43.127722454Z" level=error msg="Failed to destroy network for sandbox 
\"057bc70bc51499d6f1d0efedf694361a3be35a1a747a216307285ccd3555b22e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.175820 containerd[1565]: time="2025-09-12T23:09:43.175641818Z" level=error msg="Failed to destroy network for sandbox \"e5c27d754f827343527ffffd08dbaa28be8edd2d6fec0b5d7c4c37812c79c884\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.400222 containerd[1565]: time="2025-09-12T23:09:43.400168281Z" level=info msg="CreateContainer within sandbox \"8157f18a602af92a7ef5ecc4dcff84f8f00aa80a70cd80bafc2a0f72c3b40081\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 23:09:43.434680 containerd[1565]: time="2025-09-12T23:09:43.434505906Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-r4tmb,Uid:bb420c54-7b0f-4c5c-8007-3e79ab20da98,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a947c9ee921c15e328cb6da5cea68eebeb1c0af018ab4baf57ae2a4e1e92f64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.434884 kubelet[2735]: E0912 23:09:43.434841 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a947c9ee921c15e328cb6da5cea68eebeb1c0af018ab4baf57ae2a4e1e92f64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.434973 kubelet[2735]: E0912 23:09:43.434914 2735 
kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a947c9ee921c15e328cb6da5cea68eebeb1c0af018ab4baf57ae2a4e1e92f64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-r4tmb" Sep 12 23:09:43.434973 kubelet[2735]: E0912 23:09:43.434935 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a947c9ee921c15e328cb6da5cea68eebeb1c0af018ab4baf57ae2a4e1e92f64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-r4tmb" Sep 12 23:09:43.435044 kubelet[2735]: E0912 23:09:43.434991 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-r4tmb_kube-system(bb420c54-7b0f-4c5c-8007-3e79ab20da98)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-r4tmb_kube-system(bb420c54-7b0f-4c5c-8007-3e79ab20da98)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0a947c9ee921c15e328cb6da5cea68eebeb1c0af018ab4baf57ae2a4e1e92f64\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-r4tmb" podUID="bb420c54-7b0f-4c5c-8007-3e79ab20da98" Sep 12 23:09:43.460143 containerd[1565]: time="2025-09-12T23:09:43.460016309Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xn4hp,Uid:908cf082-bf9d-4477-b109-2080b0bf0e32,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"057bc70bc51499d6f1d0efedf694361a3be35a1a747a216307285ccd3555b22e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.460452 kubelet[2735]: E0912 23:09:43.460248 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"057bc70bc51499d6f1d0efedf694361a3be35a1a747a216307285ccd3555b22e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.460452 kubelet[2735]: E0912 23:09:43.460284 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"057bc70bc51499d6f1d0efedf694361a3be35a1a747a216307285ccd3555b22e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-xn4hp" Sep 12 23:09:43.460452 kubelet[2735]: E0912 23:09:43.460313 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"057bc70bc51499d6f1d0efedf694361a3be35a1a747a216307285ccd3555b22e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-xn4hp" Sep 12 23:09:43.460553 kubelet[2735]: E0912 23:09:43.460355 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-xn4hp_calico-system(908cf082-bf9d-4477-b109-2080b0bf0e32)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"goldmane-54d579b49d-xn4hp_calico-system(908cf082-bf9d-4477-b109-2080b0bf0e32)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"057bc70bc51499d6f1d0efedf694361a3be35a1a747a216307285ccd3555b22e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-xn4hp" podUID="908cf082-bf9d-4477-b109-2080b0bf0e32" Sep 12 23:09:43.462063 containerd[1565]: time="2025-09-12T23:09:43.462009370Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-w8xwj,Uid:ee957723-a32e-4e2c-b9a1-bcecc48992cf,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e5c27d754f827343527ffffd08dbaa28be8edd2d6fec0b5d7c4c37812c79c884\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.462283 kubelet[2735]: E0912 23:09:43.462234 2735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e5c27d754f827343527ffffd08dbaa28be8edd2d6fec0b5d7c4c37812c79c884\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:09:43.462283 kubelet[2735]: E0912 23:09:43.462272 2735 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e5c27d754f827343527ffffd08dbaa28be8edd2d6fec0b5d7c4c37812c79c884\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-w8xwj" Sep 12 23:09:43.462352 kubelet[2735]: E0912 23:09:43.462287 2735 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e5c27d754f827343527ffffd08dbaa28be8edd2d6fec0b5d7c4c37812c79c884\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-w8xwj" Sep 12 23:09:43.462352 kubelet[2735]: E0912 23:09:43.462325 2735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-w8xwj_kube-system(ee957723-a32e-4e2c-b9a1-bcecc48992cf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-w8xwj_kube-system(ee957723-a32e-4e2c-b9a1-bcecc48992cf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e5c27d754f827343527ffffd08dbaa28be8edd2d6fec0b5d7c4c37812c79c884\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-w8xwj" podUID="ee957723-a32e-4e2c-b9a1-bcecc48992cf" Sep 12 23:09:43.484061 containerd[1565]: time="2025-09-12T23:09:43.483971282Z" level=info msg="Container 830b232da73452e7523f258c2a47e41b0b421b9c2cc3c7a8042bd99e0e17bfaa: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:43.500724 containerd[1565]: time="2025-09-12T23:09:43.500666485Z" level=info msg="CreateContainer within sandbox \"8157f18a602af92a7ef5ecc4dcff84f8f00aa80a70cd80bafc2a0f72c3b40081\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"830b232da73452e7523f258c2a47e41b0b421b9c2cc3c7a8042bd99e0e17bfaa\"" Sep 12 23:09:43.501223 containerd[1565]: time="2025-09-12T23:09:43.501190919Z" level=info 
msg="StartContainer for \"830b232da73452e7523f258c2a47e41b0b421b9c2cc3c7a8042bd99e0e17bfaa\"" Sep 12 23:09:43.505937 containerd[1565]: time="2025-09-12T23:09:43.505893488Z" level=info msg="connecting to shim 830b232da73452e7523f258c2a47e41b0b421b9c2cc3c7a8042bd99e0e17bfaa" address="unix:///run/containerd/s/1a455a0ac31b7703e71ca6838255c5bd4e498d51f7c46cde789294f1cc478ddc" protocol=ttrpc version=3 Sep 12 23:09:43.596015 systemd[1]: Started cri-containerd-830b232da73452e7523f258c2a47e41b0b421b9c2cc3c7a8042bd99e0e17bfaa.scope - libcontainer container 830b232da73452e7523f258c2a47e41b0b421b9c2cc3c7a8042bd99e0e17bfaa. Sep 12 23:09:43.671981 containerd[1565]: time="2025-09-12T23:09:43.671925364Z" level=info msg="StartContainer for \"830b232da73452e7523f258c2a47e41b0b421b9c2cc3c7a8042bd99e0e17bfaa\" returns successfully" Sep 12 23:09:43.745089 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 23:09:43.746147 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 12 23:09:43.902041 containerd[1565]: time="2025-09-12T23:09:43.901973223Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8f4ff4584-49xj7,Uid:8f0d555c-c909-4b23-ab0e-a3117d51e702,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:09:43.902754 containerd[1565]: time="2025-09-12T23:09:43.902709045Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bfd95f59f-wx5z6,Uid:33f18c28-0d65-4d63-9999-ef624c2e4338,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:09:43.942243 kubelet[2735]: I0912 23:09:43.941053 2735 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79nsp\" (UniqueName: \"kubernetes.io/projected/922b6645-484d-4129-a67b-ab9327cccfde-kube-api-access-79nsp\") pod \"922b6645-484d-4129-a67b-ab9327cccfde\" (UID: \"922b6645-484d-4129-a67b-ab9327cccfde\") " Sep 12 23:09:43.942243 kubelet[2735]: I0912 23:09:43.941104 2735 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/922b6645-484d-4129-a67b-ab9327cccfde-whisker-ca-bundle\") pod \"922b6645-484d-4129-a67b-ab9327cccfde\" (UID: \"922b6645-484d-4129-a67b-ab9327cccfde\") " Sep 12 23:09:43.942243 kubelet[2735]: I0912 23:09:43.941127 2735 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/922b6645-484d-4129-a67b-ab9327cccfde-whisker-backend-key-pair\") pod \"922b6645-484d-4129-a67b-ab9327cccfde\" (UID: \"922b6645-484d-4129-a67b-ab9327cccfde\") " Sep 12 23:09:43.945809 kubelet[2735]: I0912 23:09:43.945761 2735 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/922b6645-484d-4129-a67b-ab9327cccfde-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "922b6645-484d-4129-a67b-ab9327cccfde" (UID: "922b6645-484d-4129-a67b-ab9327cccfde"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 23:09:43.949479 kubelet[2735]: I0912 23:09:43.949446 2735 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922b6645-484d-4129-a67b-ab9327cccfde-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "922b6645-484d-4129-a67b-ab9327cccfde" (UID: "922b6645-484d-4129-a67b-ab9327cccfde"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 23:09:43.950185 kubelet[2735]: I0912 23:09:43.950141 2735 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/922b6645-484d-4129-a67b-ab9327cccfde-kube-api-access-79nsp" (OuterVolumeSpecName: "kube-api-access-79nsp") pod "922b6645-484d-4129-a67b-ab9327cccfde" (UID: "922b6645-484d-4129-a67b-ab9327cccfde"). InnerVolumeSpecName "kube-api-access-79nsp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 23:09:43.977215 systemd[1]: run-netns-cni\x2d31d6070e\x2d5381\x2df1b0\x2d568a\x2d72bfae80027c.mount: Deactivated successfully. Sep 12 23:09:43.977628 systemd[1]: run-netns-cni\x2d72962dd5\x2d30ce\x2dbe16\x2de84e\x2dafce87421188.mount: Deactivated successfully. Sep 12 23:09:43.977805 systemd[1]: run-netns-cni\x2d28b04ac2\x2d2c36\x2d0287\x2dc50c\x2d78f3d1c7a5dd.mount: Deactivated successfully. Sep 12 23:09:43.977965 systemd[1]: var-lib-kubelet-pods-922b6645\x2d484d\x2d4129\x2da67b\x2dab9327cccfde-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d79nsp.mount: Deactivated successfully. Sep 12 23:09:43.978152 systemd[1]: var-lib-kubelet-pods-922b6645\x2d484d\x2d4129\x2da67b\x2dab9327cccfde-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 12 23:09:44.042112 kubelet[2735]: I0912 23:09:44.041949 2735 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/922b6645-484d-4129-a67b-ab9327cccfde-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 12 23:09:44.042112 kubelet[2735]: I0912 23:09:44.042004 2735 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/922b6645-484d-4129-a67b-ab9327cccfde-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 12 23:09:44.042112 kubelet[2735]: I0912 23:09:44.042013 2735 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-79nsp\" (UniqueName: \"kubernetes.io/projected/922b6645-484d-4129-a67b-ab9327cccfde-kube-api-access-79nsp\") on node \"localhost\" DevicePath \"\"" Sep 12 23:09:44.108795 systemd-networkd[1450]: cali9ea4b44843d: Link UP Sep 12 23:09:44.109464 systemd-networkd[1450]: cali9ea4b44843d: Gained carrier Sep 12 23:09:44.126356 containerd[1565]: 2025-09-12 23:09:43.969 [INFO][4167] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 23:09:44.126356 containerd[1565]: 2025-09-12 23:09:43.998 [INFO][4167] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0 calico-apiserver-8f4ff4584- calico-apiserver 8f0d555c-c909-4b23-ab0e-a3117d51e702 874 0 2025-09-12 23:09:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8f4ff4584 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-8f4ff4584-49xj7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9ea4b44843d [] [] }} ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" 
Namespace="calico-apiserver" Pod="calico-apiserver-8f4ff4584-49xj7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-" Sep 12 23:09:44.126356 containerd[1565]: 2025-09-12 23:09:43.998 [INFO][4167] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Namespace="calico-apiserver" Pod="calico-apiserver-8f4ff4584-49xj7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" Sep 12 23:09:44.126356 containerd[1565]: 2025-09-12 23:09:44.062 [INFO][4199] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" HandleID="k8s-pod-network.e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Workload="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" Sep 12 23:09:44.126710 containerd[1565]: 2025-09-12 23:09:44.063 [INFO][4199] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" HandleID="k8s-pod-network.e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Workload="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00019fad0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-8f4ff4584-49xj7", "timestamp":"2025-09-12 23:09:44.062090387 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:09:44.126710 containerd[1565]: 2025-09-12 23:09:44.063 [INFO][4199] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:09:44.126710 containerd[1565]: 2025-09-12 23:09:44.063 [INFO][4199] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:09:44.126710 containerd[1565]: 2025-09-12 23:09:44.063 [INFO][4199] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 23:09:44.126710 containerd[1565]: 2025-09-12 23:09:44.070 [INFO][4199] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" host="localhost" Sep 12 23:09:44.126710 containerd[1565]: 2025-09-12 23:09:44.076 [INFO][4199] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 23:09:44.126710 containerd[1565]: 2025-09-12 23:09:44.080 [INFO][4199] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 23:09:44.126710 containerd[1565]: 2025-09-12 23:09:44.082 [INFO][4199] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:44.126710 containerd[1565]: 2025-09-12 23:09:44.083 [INFO][4199] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:44.126710 containerd[1565]: 2025-09-12 23:09:44.084 [INFO][4199] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" host="localhost" Sep 12 23:09:44.127063 containerd[1565]: 2025-09-12 23:09:44.085 [INFO][4199] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f Sep 12 23:09:44.127063 containerd[1565]: 2025-09-12 23:09:44.089 [INFO][4199] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" host="localhost" Sep 12 23:09:44.127063 containerd[1565]: 2025-09-12 23:09:44.094 [INFO][4199] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" host="localhost" Sep 12 23:09:44.127063 containerd[1565]: 2025-09-12 23:09:44.094 [INFO][4199] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" host="localhost" Sep 12 23:09:44.127063 containerd[1565]: 2025-09-12 23:09:44.094 [INFO][4199] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:09:44.127063 containerd[1565]: 2025-09-12 23:09:44.094 [INFO][4199] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" HandleID="k8s-pod-network.e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Workload="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" Sep 12 23:09:44.127270 containerd[1565]: 2025-09-12 23:09:44.099 [INFO][4167] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Namespace="calico-apiserver" Pod="calico-apiserver-8f4ff4584-49xj7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0", GenerateName:"calico-apiserver-8f4ff4584-", Namespace:"calico-apiserver", SelfLink:"", UID:"8f0d555c-c909-4b23-ab0e-a3117d51e702", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8f4ff4584", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-8f4ff4584-49xj7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9ea4b44843d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:44.127361 containerd[1565]: 2025-09-12 23:09:44.099 [INFO][4167] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Namespace="calico-apiserver" Pod="calico-apiserver-8f4ff4584-49xj7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" Sep 12 23:09:44.127361 containerd[1565]: 2025-09-12 23:09:44.099 [INFO][4167] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9ea4b44843d ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Namespace="calico-apiserver" Pod="calico-apiserver-8f4ff4584-49xj7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" Sep 12 23:09:44.127361 containerd[1565]: 2025-09-12 23:09:44.109 [INFO][4167] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Namespace="calico-apiserver" Pod="calico-apiserver-8f4ff4584-49xj7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" Sep 12 23:09:44.127466 containerd[1565]: 2025-09-12 23:09:44.109 [INFO][4167] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Namespace="calico-apiserver" Pod="calico-apiserver-8f4ff4584-49xj7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0", GenerateName:"calico-apiserver-8f4ff4584-", Namespace:"calico-apiserver", SelfLink:"", UID:"8f0d555c-c909-4b23-ab0e-a3117d51e702", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8f4ff4584", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f", Pod:"calico-apiserver-8f4ff4584-49xj7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9ea4b44843d", MAC:"26:d8:51:e6:1a:74", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:44.127552 containerd[1565]: 2025-09-12 23:09:44.119 [INFO][4167] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Namespace="calico-apiserver" Pod="calico-apiserver-8f4ff4584-49xj7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" Sep 12 23:09:44.203217 systemd-networkd[1450]: calid6fb2575b28: Link UP Sep 12 23:09:44.204581 systemd-networkd[1450]: calid6fb2575b28: Gained carrier Sep 12 23:09:44.215789 containerd[1565]: time="2025-09-12T23:09:44.215607058Z" level=info msg="connecting to shim e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" address="unix:///run/containerd/s/5e7da22ea49fc7e9d458a6152efbb18de7e6fb4385874bb9e7cd8d2c2cb01130" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:09:44.219515 containerd[1565]: 2025-09-12 23:09:43.970 [INFO][4174] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 23:09:44.219515 containerd[1565]: 2025-09-12 23:09:43.997 [INFO][4174] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0 calico-apiserver-6bfd95f59f- calico-apiserver 33f18c28-0d65-4d63-9999-ef624c2e4338 873 0 2025-09-12 23:09:10 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6bfd95f59f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6bfd95f59f-wx5z6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid6fb2575b28 [] [] }} ContainerID="33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-wx5z6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-" Sep 12 23:09:44.219515 containerd[1565]: 2025-09-12 23:09:43.998 [INFO][4174] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-wx5z6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0" Sep 12 23:09:44.219515 containerd[1565]: 2025-09-12 23:09:44.063 [INFO][4202] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" HandleID="k8s-pod-network.33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" Workload="localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0" Sep 12 23:09:44.219713 containerd[1565]: 2025-09-12 23:09:44.063 [INFO][4202] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" HandleID="k8s-pod-network.33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" Workload="localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002053c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6bfd95f59f-wx5z6", "timestamp":"2025-09-12 23:09:44.063209277 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:09:44.219713 containerd[1565]: 2025-09-12 23:09:44.063 [INFO][4202] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:09:44.219713 containerd[1565]: 2025-09-12 23:09:44.094 [INFO][4202] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:09:44.219713 containerd[1565]: 2025-09-12 23:09:44.095 [INFO][4202] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 23:09:44.219713 containerd[1565]: 2025-09-12 23:09:44.171 [INFO][4202] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" host="localhost" Sep 12 23:09:44.219713 containerd[1565]: 2025-09-12 23:09:44.176 [INFO][4202] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 23:09:44.219713 containerd[1565]: 2025-09-12 23:09:44.181 [INFO][4202] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 23:09:44.219713 containerd[1565]: 2025-09-12 23:09:44.183 [INFO][4202] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:44.219713 containerd[1565]: 2025-09-12 23:09:44.185 [INFO][4202] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:44.219713 containerd[1565]: 2025-09-12 23:09:44.185 [INFO][4202] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" host="localhost" Sep 12 23:09:44.219933 containerd[1565]: 2025-09-12 23:09:44.186 [INFO][4202] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b Sep 12 23:09:44.219933 containerd[1565]: 2025-09-12 23:09:44.190 [INFO][4202] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" host="localhost" Sep 12 23:09:44.219933 containerd[1565]: 2025-09-12 23:09:44.195 [INFO][4202] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" host="localhost" Sep 12 23:09:44.219933 containerd[1565]: 2025-09-12 23:09:44.195 [INFO][4202] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" host="localhost" Sep 12 23:09:44.219933 containerd[1565]: 2025-09-12 23:09:44.195 [INFO][4202] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:09:44.219933 containerd[1565]: 2025-09-12 23:09:44.195 [INFO][4202] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" HandleID="k8s-pod-network.33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" Workload="localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0" Sep 12 23:09:44.220047 containerd[1565]: 2025-09-12 23:09:44.199 [INFO][4174] cni-plugin/k8s.go 418: Populated endpoint ContainerID="33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-wx5z6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0", GenerateName:"calico-apiserver-6bfd95f59f-", Namespace:"calico-apiserver", SelfLink:"", UID:"33f18c28-0d65-4d63-9999-ef624c2e4338", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bfd95f59f", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6bfd95f59f-wx5z6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid6fb2575b28", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:44.220108 containerd[1565]: 2025-09-12 23:09:44.199 [INFO][4174] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-wx5z6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0" Sep 12 23:09:44.220108 containerd[1565]: 2025-09-12 23:09:44.199 [INFO][4174] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid6fb2575b28 ContainerID="33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-wx5z6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0" Sep 12 23:09:44.220108 containerd[1565]: 2025-09-12 23:09:44.205 [INFO][4174] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-wx5z6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0" Sep 12 23:09:44.220177 containerd[1565]: 2025-09-12 23:09:44.205 [INFO][4174] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-wx5z6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0", GenerateName:"calico-apiserver-6bfd95f59f-", Namespace:"calico-apiserver", SelfLink:"", UID:"33f18c28-0d65-4d63-9999-ef624c2e4338", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bfd95f59f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b", Pod:"calico-apiserver-6bfd95f59f-wx5z6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid6fb2575b28", MAC:"1e:78:7c:68:e7:77", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:44.220239 containerd[1565]: 2025-09-12 23:09:44.213 [INFO][4174] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-wx5z6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--wx5z6-eth0" Sep 12 23:09:44.247338 containerd[1565]: time="2025-09-12T23:09:44.247270077Z" level=info msg="connecting to shim 33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b" address="unix:///run/containerd/s/2524f86dc1b37d7addaaba1ee7c36b1f8d335befcd37d58d87e0b9cacac2252c" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:09:44.247838 systemd[1]: Started cri-containerd-e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f.scope - libcontainer container e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f. Sep 12 23:09:44.273883 systemd[1]: Started cri-containerd-33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b.scope - libcontainer container 33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b. Sep 12 23:09:44.278097 systemd-resolved[1425]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 23:09:44.288453 systemd-resolved[1425]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 23:09:44.315614 containerd[1565]: time="2025-09-12T23:09:44.315430693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8f4ff4584-49xj7,Uid:8f0d555c-c909-4b23-ab0e-a3117d51e702,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f\"" Sep 12 23:09:44.321750 containerd[1565]: time="2025-09-12T23:09:44.321721172Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 23:09:44.329221 containerd[1565]: time="2025-09-12T23:09:44.329184241Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-6bfd95f59f-wx5z6,Uid:33f18c28-0d65-4d63-9999-ef624c2e4338,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b\"" Sep 12 23:09:44.569886 systemd[1]: Removed slice kubepods-besteffort-pod922b6645_484d_4129_a67b_ab9327cccfde.slice - libcontainer container kubepods-besteffort-pod922b6645_484d_4129_a67b_ab9327cccfde.slice. Sep 12 23:09:44.577574 kubelet[2735]: I0912 23:09:44.577494 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-kbs4n" podStartSLOduration=2.708807401 podStartE2EDuration="31.577461022s" podCreationTimestamp="2025-09-12 23:09:13 +0000 UTC" firstStartedPulling="2025-09-12 23:09:14.19720406 +0000 UTC m=+20.493241801" lastFinishedPulling="2025-09-12 23:09:43.065857681 +0000 UTC m=+49.361895422" observedRunningTime="2025-09-12 23:09:44.577132024 +0000 UTC m=+50.873169775" watchObservedRunningTime="2025-09-12 23:09:44.577461022 +0000 UTC m=+50.873498763" Sep 12 23:09:44.649983 kubelet[2735]: I0912 23:09:44.649736 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/761fbc56-9cf4-48c4-9e76-ba8c5984fff8-whisker-backend-key-pair\") pod \"whisker-78dd598d96-244zs\" (UID: \"761fbc56-9cf4-48c4-9e76-ba8c5984fff8\") " pod="calico-system/whisker-78dd598d96-244zs" Sep 12 23:09:44.649983 kubelet[2735]: I0912 23:09:44.649785 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8982p\" (UniqueName: \"kubernetes.io/projected/761fbc56-9cf4-48c4-9e76-ba8c5984fff8-kube-api-access-8982p\") pod \"whisker-78dd598d96-244zs\" (UID: \"761fbc56-9cf4-48c4-9e76-ba8c5984fff8\") " pod="calico-system/whisker-78dd598d96-244zs" Sep 12 23:09:44.650161 kubelet[2735]: I0912 23:09:44.649838 2735 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/761fbc56-9cf4-48c4-9e76-ba8c5984fff8-whisker-ca-bundle\") pod \"whisker-78dd598d96-244zs\" (UID: \"761fbc56-9cf4-48c4-9e76-ba8c5984fff8\") " pod="calico-system/whisker-78dd598d96-244zs" Sep 12 23:09:44.653048 systemd[1]: Created slice kubepods-besteffort-pod761fbc56_9cf4_48c4_9e76_ba8c5984fff8.slice - libcontainer container kubepods-besteffort-pod761fbc56_9cf4_48c4_9e76_ba8c5984fff8.slice. Sep 12 23:09:44.703834 containerd[1565]: time="2025-09-12T23:09:44.703784654Z" level=info msg="TaskExit event in podsandbox handler container_id:\"830b232da73452e7523f258c2a47e41b0b421b9c2cc3c7a8042bd99e0e17bfaa\" id:\"91f344d3ddced508e677bea7def2f87a7cd78cbfdd6e98588e22ef9ec3942b85\" pid:4341 exit_status:1 exited_at:{seconds:1757718584 nanos:703420982}" Sep 12 23:09:44.957011 containerd[1565]: time="2025-09-12T23:09:44.956943329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78dd598d96-244zs,Uid:761fbc56-9cf4-48c4-9e76-ba8c5984fff8,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:45.076475 systemd-networkd[1450]: calibfeefb77fdd: Link UP Sep 12 23:09:45.076712 systemd-networkd[1450]: calibfeefb77fdd: Gained carrier Sep 12 23:09:45.094942 containerd[1565]: 2025-09-12 23:09:44.986 [INFO][4355] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 23:09:45.094942 containerd[1565]: 2025-09-12 23:09:44.996 [INFO][4355] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--78dd598d96--244zs-eth0 whisker-78dd598d96- calico-system 761fbc56-9cf4-48c4-9e76-ba8c5984fff8 974 0 2025-09-12 23:09:44 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:78dd598d96 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost 
whisker-78dd598d96-244zs eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calibfeefb77fdd [] [] }} ContainerID="a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" Namespace="calico-system" Pod="whisker-78dd598d96-244zs" WorkloadEndpoint="localhost-k8s-whisker--78dd598d96--244zs-" Sep 12 23:09:45.094942 containerd[1565]: 2025-09-12 23:09:44.996 [INFO][4355] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" Namespace="calico-system" Pod="whisker-78dd598d96-244zs" WorkloadEndpoint="localhost-k8s-whisker--78dd598d96--244zs-eth0" Sep 12 23:09:45.094942 containerd[1565]: 2025-09-12 23:09:45.025 [INFO][4370] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" HandleID="k8s-pod-network.a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" Workload="localhost-k8s-whisker--78dd598d96--244zs-eth0" Sep 12 23:09:45.095230 containerd[1565]: 2025-09-12 23:09:45.025 [INFO][4370] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" HandleID="k8s-pod-network.a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" Workload="localhost-k8s-whisker--78dd598d96--244zs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002bc120), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-78dd598d96-244zs", "timestamp":"2025-09-12 23:09:45.02537509 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:09:45.095230 containerd[1565]: 2025-09-12 23:09:45.025 [INFO][4370] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 23:09:45.095230 containerd[1565]: 2025-09-12 23:09:45.025 [INFO][4370] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:09:45.095230 containerd[1565]: 2025-09-12 23:09:45.025 [INFO][4370] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 23:09:45.095230 containerd[1565]: 2025-09-12 23:09:45.032 [INFO][4370] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" host="localhost" Sep 12 23:09:45.095230 containerd[1565]: 2025-09-12 23:09:45.036 [INFO][4370] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 23:09:45.095230 containerd[1565]: 2025-09-12 23:09:45.043 [INFO][4370] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 23:09:45.095230 containerd[1565]: 2025-09-12 23:09:45.045 [INFO][4370] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:45.095230 containerd[1565]: 2025-09-12 23:09:45.051 [INFO][4370] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:45.095230 containerd[1565]: 2025-09-12 23:09:45.051 [INFO][4370] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" host="localhost" Sep 12 23:09:45.095576 containerd[1565]: 2025-09-12 23:09:45.054 [INFO][4370] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23 Sep 12 23:09:45.095576 containerd[1565]: 2025-09-12 23:09:45.059 [INFO][4370] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" host="localhost" Sep 12 23:09:45.095576 containerd[1565]: 2025-09-12 23:09:45.066 [INFO][4370] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" host="localhost" Sep 12 23:09:45.095576 containerd[1565]: 2025-09-12 23:09:45.067 [INFO][4370] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" host="localhost" Sep 12 23:09:45.095576 containerd[1565]: 2025-09-12 23:09:45.067 [INFO][4370] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:09:45.095576 containerd[1565]: 2025-09-12 23:09:45.067 [INFO][4370] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" HandleID="k8s-pod-network.a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" Workload="localhost-k8s-whisker--78dd598d96--244zs-eth0" Sep 12 23:09:45.095756 containerd[1565]: 2025-09-12 23:09:45.071 [INFO][4355] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" Namespace="calico-system" Pod="whisker-78dd598d96-244zs" WorkloadEndpoint="localhost-k8s-whisker--78dd598d96--244zs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--78dd598d96--244zs-eth0", GenerateName:"whisker-78dd598d96-", Namespace:"calico-system", SelfLink:"", UID:"761fbc56-9cf4-48c4-9e76-ba8c5984fff8", ResourceVersion:"974", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78dd598d96", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-78dd598d96-244zs", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calibfeefb77fdd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:45.095756 containerd[1565]: 2025-09-12 23:09:45.071 [INFO][4355] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" Namespace="calico-system" Pod="whisker-78dd598d96-244zs" WorkloadEndpoint="localhost-k8s-whisker--78dd598d96--244zs-eth0" Sep 12 23:09:45.095834 containerd[1565]: 2025-09-12 23:09:45.071 [INFO][4355] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibfeefb77fdd ContainerID="a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" Namespace="calico-system" Pod="whisker-78dd598d96-244zs" WorkloadEndpoint="localhost-k8s-whisker--78dd598d96--244zs-eth0" Sep 12 23:09:45.095834 containerd[1565]: 2025-09-12 23:09:45.074 [INFO][4355] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" Namespace="calico-system" Pod="whisker-78dd598d96-244zs" WorkloadEndpoint="localhost-k8s-whisker--78dd598d96--244zs-eth0" Sep 12 23:09:45.095873 containerd[1565]: 2025-09-12 23:09:45.075 [INFO][4355] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" 
Namespace="calico-system" Pod="whisker-78dd598d96-244zs" WorkloadEndpoint="localhost-k8s-whisker--78dd598d96--244zs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--78dd598d96--244zs-eth0", GenerateName:"whisker-78dd598d96-", Namespace:"calico-system", SelfLink:"", UID:"761fbc56-9cf4-48c4-9e76-ba8c5984fff8", ResourceVersion:"974", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78dd598d96", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23", Pod:"whisker-78dd598d96-244zs", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calibfeefb77fdd", MAC:"f6:d9:b7:80:b3:25", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:45.095925 containerd[1565]: 2025-09-12 23:09:45.086 [INFO][4355] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" Namespace="calico-system" Pod="whisker-78dd598d96-244zs" WorkloadEndpoint="localhost-k8s-whisker--78dd598d96--244zs-eth0" Sep 12 23:09:45.132296 containerd[1565]: 
time="2025-09-12T23:09:45.132188073Z" level=info msg="connecting to shim a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23" address="unix:///run/containerd/s/87eb3522931f72f876530478a5a37069b825599d7d9fc7f46591aa5073358e05" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:09:45.177152 systemd[1]: Started cri-containerd-a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23.scope - libcontainer container a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23. Sep 12 23:09:45.203370 systemd-resolved[1425]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 23:09:45.315866 containerd[1565]: time="2025-09-12T23:09:45.315725098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78dd598d96-244zs,Uid:761fbc56-9cf4-48c4-9e76-ba8c5984fff8,Namespace:calico-system,Attempt:0,} returns sandbox id \"a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23\"" Sep 12 23:09:45.503840 systemd-networkd[1450]: cali9ea4b44843d: Gained IPv6LL Sep 12 23:09:45.569923 systemd-networkd[1450]: calid6fb2575b28: Gained IPv6LL Sep 12 23:09:45.596445 systemd-networkd[1450]: vxlan.calico: Link UP Sep 12 23:09:45.596458 systemd-networkd[1450]: vxlan.calico: Gained carrier Sep 12 23:09:45.703234 containerd[1565]: time="2025-09-12T23:09:45.703180516Z" level=info msg="TaskExit event in podsandbox handler container_id:\"830b232da73452e7523f258c2a47e41b0b421b9c2cc3c7a8042bd99e0e17bfaa\" id:\"1b3439f7483fe5b36a12dd7b42adad38a193ef57aea91dc732a1ed628eba3320\" pid:4585 exit_status:1 exited_at:{seconds:1757718585 nanos:702599926}" Sep 12 23:09:45.901072 kubelet[2735]: I0912 23:09:45.900939 2735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="922b6645-484d-4129-a67b-ab9327cccfde" path="/var/lib/kubelet/pods/922b6645-484d-4129-a67b-ab9327cccfde/volumes" Sep 12 23:09:46.655789 systemd-networkd[1450]: vxlan.calico: Gained IPv6LL Sep 12 23:09:46.719939 
systemd-networkd[1450]: calibfeefb77fdd: Gained IPv6LL Sep 12 23:09:47.805583 containerd[1565]: time="2025-09-12T23:09:47.805524418Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:47.806492 containerd[1565]: time="2025-09-12T23:09:47.806455436Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 23:09:47.807543 containerd[1565]: time="2025-09-12T23:09:47.807497712Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:47.811596 containerd[1565]: time="2025-09-12T23:09:47.811525823Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:47.812150 containerd[1565]: time="2025-09-12T23:09:47.812108847Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.490354724s" Sep 12 23:09:47.812150 containerd[1565]: time="2025-09-12T23:09:47.812146748Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 23:09:47.816085 containerd[1565]: time="2025-09-12T23:09:47.816045957Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 23:09:47.824551 containerd[1565]: time="2025-09-12T23:09:47.824510954Z" level=info msg="CreateContainer within sandbox 
\"e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 23:09:47.859204 containerd[1565]: time="2025-09-12T23:09:47.859163074Z" level=info msg="Container 0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:47.874852 containerd[1565]: time="2025-09-12T23:09:47.874799472Z" level=info msg="CreateContainer within sandbox \"e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7\"" Sep 12 23:09:47.875464 containerd[1565]: time="2025-09-12T23:09:47.875336771Z" level=info msg="StartContainer for \"0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7\"" Sep 12 23:09:47.876343 containerd[1565]: time="2025-09-12T23:09:47.876319055Z" level=info msg="connecting to shim 0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7" address="unix:///run/containerd/s/5e7da22ea49fc7e9d458a6152efbb18de7e6fb4385874bb9e7cd8d2c2cb01130" protocol=ttrpc version=3 Sep 12 23:09:47.896873 systemd[1]: Started cri-containerd-0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7.scope - libcontainer container 0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7. 
Sep 12 23:09:48.073857 containerd[1565]: time="2025-09-12T23:09:48.073716097Z" level=info msg="StartContainer for \"0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7\" returns successfully" Sep 12 23:09:48.198492 containerd[1565]: time="2025-09-12T23:09:48.198428570Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:48.199230 containerd[1565]: time="2025-09-12T23:09:48.199207582Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 23:09:48.201257 containerd[1565]: time="2025-09-12T23:09:48.201209099Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 385.133557ms" Sep 12 23:09:48.201257 containerd[1565]: time="2025-09-12T23:09:48.201237743Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 23:09:48.202144 containerd[1565]: time="2025-09-12T23:09:48.202111974Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 23:09:48.206797 containerd[1565]: time="2025-09-12T23:09:48.206747755Z" level=info msg="CreateContainer within sandbox \"33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 23:09:48.220555 containerd[1565]: time="2025-09-12T23:09:48.219839626Z" level=info msg="Container 22bca165d1cbbe146d4fa6ff70c50d5dcbc24acec6904eb38a242b806845e61e: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:48.230960 containerd[1565]: 
time="2025-09-12T23:09:48.230908981Z" level=info msg="CreateContainer within sandbox \"33d6ec978237fd428aec44833130fbe3530cb7e3a9b2b430e42ff3e24f31361b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"22bca165d1cbbe146d4fa6ff70c50d5dcbc24acec6904eb38a242b806845e61e\"" Sep 12 23:09:48.231767 containerd[1565]: time="2025-09-12T23:09:48.231727527Z" level=info msg="StartContainer for \"22bca165d1cbbe146d4fa6ff70c50d5dcbc24acec6904eb38a242b806845e61e\"" Sep 12 23:09:48.232845 containerd[1565]: time="2025-09-12T23:09:48.232813205Z" level=info msg="connecting to shim 22bca165d1cbbe146d4fa6ff70c50d5dcbc24acec6904eb38a242b806845e61e" address="unix:///run/containerd/s/2524f86dc1b37d7addaaba1ee7c36b1f8d335befcd37d58d87e0b9cacac2252c" protocol=ttrpc version=3 Sep 12 23:09:48.255854 systemd[1]: Started cri-containerd-22bca165d1cbbe146d4fa6ff70c50d5dcbc24acec6904eb38a242b806845e61e.scope - libcontainer container 22bca165d1cbbe146d4fa6ff70c50d5dcbc24acec6904eb38a242b806845e61e. 
Sep 12 23:09:48.316394 containerd[1565]: time="2025-09-12T23:09:48.316274577Z" level=info msg="StartContainer for \"22bca165d1cbbe146d4fa6ff70c50d5dcbc24acec6904eb38a242b806845e61e\" returns successfully" Sep 12 23:09:48.632499 kubelet[2735]: I0912 23:09:48.632417 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6bfd95f59f-wx5z6" podStartSLOduration=34.760712706 podStartE2EDuration="38.632398768s" podCreationTimestamp="2025-09-12 23:09:10 +0000 UTC" firstStartedPulling="2025-09-12 23:09:44.3302433 +0000 UTC m=+50.626281041" lastFinishedPulling="2025-09-12 23:09:48.201929362 +0000 UTC m=+54.497967103" observedRunningTime="2025-09-12 23:09:48.606213603 +0000 UTC m=+54.902251344" watchObservedRunningTime="2025-09-12 23:09:48.632398768 +0000 UTC m=+54.928436509" Sep 12 23:09:48.633134 kubelet[2735]: I0912 23:09:48.632790 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8f4ff4584-49xj7" podStartSLOduration=36.140657792 podStartE2EDuration="39.632786185s" podCreationTimestamp="2025-09-12 23:09:09 +0000 UTC" firstStartedPulling="2025-09-12 23:09:44.320784383 +0000 UTC m=+50.616822124" lastFinishedPulling="2025-09-12 23:09:47.812912776 +0000 UTC m=+54.108950517" observedRunningTime="2025-09-12 23:09:48.632114143 +0000 UTC m=+54.928151884" watchObservedRunningTime="2025-09-12 23:09:48.632786185 +0000 UTC m=+54.928823926" Sep 12 23:09:49.577229 kubelet[2735]: I0912 23:09:49.577177 2735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:09:49.681696 kubelet[2735]: I0912 23:09:49.681622 2735 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvhrw\" (UniqueName: \"kubernetes.io/projected/e769e284-06fa-4f9a-98a3-28a76d353dce-kube-api-access-tvhrw\") pod \"e769e284-06fa-4f9a-98a3-28a76d353dce\" (UID: \"e769e284-06fa-4f9a-98a3-28a76d353dce\") " Sep 12 23:09:49.682259 
kubelet[2735]: I0912 23:09:49.681746 2735 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e769e284-06fa-4f9a-98a3-28a76d353dce-calico-apiserver-certs\") pod \"e769e284-06fa-4f9a-98a3-28a76d353dce\" (UID: \"e769e284-06fa-4f9a-98a3-28a76d353dce\") " Sep 12 23:09:49.689777 kubelet[2735]: I0912 23:09:49.689708 2735 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e769e284-06fa-4f9a-98a3-28a76d353dce-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "e769e284-06fa-4f9a-98a3-28a76d353dce" (UID: "e769e284-06fa-4f9a-98a3-28a76d353dce"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 23:09:49.691262 kubelet[2735]: I0912 23:09:49.691221 2735 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e769e284-06fa-4f9a-98a3-28a76d353dce-kube-api-access-tvhrw" (OuterVolumeSpecName: "kube-api-access-tvhrw") pod "e769e284-06fa-4f9a-98a3-28a76d353dce" (UID: "e769e284-06fa-4f9a-98a3-28a76d353dce"). InnerVolumeSpecName "kube-api-access-tvhrw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 23:09:49.692094 systemd[1]: var-lib-kubelet-pods-e769e284\x2d06fa\x2d4f9a\x2d98a3\x2d28a76d353dce-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dtvhrw.mount: Deactivated successfully. Sep 12 23:09:49.692252 systemd[1]: var-lib-kubelet-pods-e769e284\x2d06fa\x2d4f9a\x2d98a3\x2d28a76d353dce-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Sep 12 23:09:49.708509 systemd[1]: Created slice kubepods-besteffort-podda8a039d_7002_456d_8c66_80d87fca79e8.slice - libcontainer container kubepods-besteffort-podda8a039d_7002_456d_8c66_80d87fca79e8.slice. 
Sep 12 23:09:49.782322 kubelet[2735]: I0912 23:09:49.782251 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbgcl\" (UniqueName: \"kubernetes.io/projected/da8a039d-7002-456d-8c66-80d87fca79e8-kube-api-access-vbgcl\") pod \"calico-apiserver-6bfd95f59f-4ls7l\" (UID: \"da8a039d-7002-456d-8c66-80d87fca79e8\") " pod="calico-apiserver/calico-apiserver-6bfd95f59f-4ls7l" Sep 12 23:09:49.782322 kubelet[2735]: I0912 23:09:49.782300 2735 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/da8a039d-7002-456d-8c66-80d87fca79e8-calico-apiserver-certs\") pod \"calico-apiserver-6bfd95f59f-4ls7l\" (UID: \"da8a039d-7002-456d-8c66-80d87fca79e8\") " pod="calico-apiserver/calico-apiserver-6bfd95f59f-4ls7l" Sep 12 23:09:49.782322 kubelet[2735]: I0912 23:09:49.782328 2735 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e769e284-06fa-4f9a-98a3-28a76d353dce-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" Sep 12 23:09:49.782322 kubelet[2735]: I0912 23:09:49.782338 2735 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tvhrw\" (UniqueName: \"kubernetes.io/projected/e769e284-06fa-4f9a-98a3-28a76d353dce-kube-api-access-tvhrw\") on node \"localhost\" DevicePath \"\"" Sep 12 23:09:49.794619 containerd[1565]: time="2025-09-12T23:09:49.794239810Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:49.795459 containerd[1565]: time="2025-09-12T23:09:49.795393044Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 23:09:49.796926 containerd[1565]: time="2025-09-12T23:09:49.796866219Z" level=info msg="ImageCreate event 
name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:49.799121 containerd[1565]: time="2025-09-12T23:09:49.799084743Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:49.799733 containerd[1565]: time="2025-09-12T23:09:49.799704265Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.597563918s" Sep 12 23:09:49.799770 containerd[1565]: time="2025-09-12T23:09:49.799738009Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 23:09:49.804812 containerd[1565]: time="2025-09-12T23:09:49.804766525Z" level=info msg="CreateContainer within sandbox \"a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 23:09:49.812669 containerd[1565]: time="2025-09-12T23:09:49.812617720Z" level=info msg="Container 51d65690ca793a5c8403eae171fd3ea8b193c53ed3d8096cbd674e8484533012: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:49.826870 containerd[1565]: time="2025-09-12T23:09:49.826801189Z" level=info msg="CreateContainer within sandbox \"a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"51d65690ca793a5c8403eae171fd3ea8b193c53ed3d8096cbd674e8484533012\"" Sep 12 23:09:49.828703 containerd[1565]: 
time="2025-09-12T23:09:49.828301795Z" level=info msg="StartContainer for \"51d65690ca793a5c8403eae171fd3ea8b193c53ed3d8096cbd674e8484533012\"" Sep 12 23:09:49.831954 containerd[1565]: time="2025-09-12T23:09:49.831904156Z" level=info msg="connecting to shim 51d65690ca793a5c8403eae171fd3ea8b193c53ed3d8096cbd674e8484533012" address="unix:///run/containerd/s/87eb3522931f72f876530478a5a37069b825599d7d9fc7f46591aa5073358e05" protocol=ttrpc version=3 Sep 12 23:09:49.862900 systemd[1]: Started cri-containerd-51d65690ca793a5c8403eae171fd3ea8b193c53ed3d8096cbd674e8484533012.scope - libcontainer container 51d65690ca793a5c8403eae171fd3ea8b193c53ed3d8096cbd674e8484533012. Sep 12 23:09:49.909278 systemd[1]: Removed slice kubepods-besteffort-pode769e284_06fa_4f9a_98a3_28a76d353dce.slice - libcontainer container kubepods-besteffort-pode769e284_06fa_4f9a_98a3_28a76d353dce.slice. Sep 12 23:09:50.010169 containerd[1565]: time="2025-09-12T23:09:50.010119748Z" level=info msg="StartContainer for \"51d65690ca793a5c8403eae171fd3ea8b193c53ed3d8096cbd674e8484533012\" returns successfully" Sep 12 23:09:50.011694 containerd[1565]: time="2025-09-12T23:09:50.011620104Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 23:09:50.015381 containerd[1565]: time="2025-09-12T23:09:50.015333042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bfd95f59f-4ls7l,Uid:da8a039d-7002-456d-8c66-80d87fca79e8,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:09:50.124751 systemd-networkd[1450]: cali62d1ab6d49a: Link UP Sep 12 23:09:50.125410 systemd-networkd[1450]: cali62d1ab6d49a: Gained carrier Sep 12 23:09:50.142423 containerd[1565]: 2025-09-12 23:09:50.056 [INFO][4785] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0 calico-apiserver-6bfd95f59f- calico-apiserver da8a039d-7002-456d-8c66-80d87fca79e8 1026 0 2025-09-12 23:09:49 +0000 UTC 
map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6bfd95f59f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6bfd95f59f-4ls7l eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali62d1ab6d49a [] [] }} ContainerID="90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-4ls7l" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-" Sep 12 23:09:50.142423 containerd[1565]: 2025-09-12 23:09:50.056 [INFO][4785] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-4ls7l" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0" Sep 12 23:09:50.142423 containerd[1565]: 2025-09-12 23:09:50.083 [INFO][4802] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" HandleID="k8s-pod-network.90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" Workload="localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0" Sep 12 23:09:50.142625 containerd[1565]: 2025-09-12 23:09:50.083 [INFO][4802] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" HandleID="k8s-pod-network.90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" Workload="localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e810), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6bfd95f59f-4ls7l", "timestamp":"2025-09-12 23:09:50.083791485 +0000 
UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:09:50.142625 containerd[1565]: 2025-09-12 23:09:50.084 [INFO][4802] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:09:50.142625 containerd[1565]: 2025-09-12 23:09:50.084 [INFO][4802] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:09:50.142625 containerd[1565]: 2025-09-12 23:09:50.084 [INFO][4802] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 23:09:50.142625 containerd[1565]: 2025-09-12 23:09:50.092 [INFO][4802] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" host="localhost" Sep 12 23:09:50.142625 containerd[1565]: 2025-09-12 23:09:50.097 [INFO][4802] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 23:09:50.142625 containerd[1565]: 2025-09-12 23:09:50.101 [INFO][4802] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 23:09:50.142625 containerd[1565]: 2025-09-12 23:09:50.103 [INFO][4802] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:50.142625 containerd[1565]: 2025-09-12 23:09:50.105 [INFO][4802] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:50.142625 containerd[1565]: 2025-09-12 23:09:50.105 [INFO][4802] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" host="localhost" Sep 12 23:09:50.142867 containerd[1565]: 2025-09-12 23:09:50.107 [INFO][4802] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259 Sep 12 23:09:50.142867 containerd[1565]: 2025-09-12 23:09:50.110 [INFO][4802] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" host="localhost" Sep 12 23:09:50.142867 containerd[1565]: 2025-09-12 23:09:50.116 [INFO][4802] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" host="localhost" Sep 12 23:09:50.142867 containerd[1565]: 2025-09-12 23:09:50.117 [INFO][4802] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" host="localhost" Sep 12 23:09:50.142867 containerd[1565]: 2025-09-12 23:09:50.117 [INFO][4802] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 23:09:50.142867 containerd[1565]: 2025-09-12 23:09:50.117 [INFO][4802] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" HandleID="k8s-pod-network.90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" Workload="localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0" Sep 12 23:09:50.143012 containerd[1565]: 2025-09-12 23:09:50.122 [INFO][4785] cni-plugin/k8s.go 418: Populated endpoint ContainerID="90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-4ls7l" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0", GenerateName:"calico-apiserver-6bfd95f59f-", Namespace:"calico-apiserver", SelfLink:"", UID:"da8a039d-7002-456d-8c66-80d87fca79e8", ResourceVersion:"1026", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bfd95f59f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6bfd95f59f-4ls7l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali62d1ab6d49a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:50.143066 containerd[1565]: 2025-09-12 23:09:50.122 [INFO][4785] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-4ls7l" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0" Sep 12 23:09:50.143066 containerd[1565]: 2025-09-12 23:09:50.122 [INFO][4785] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali62d1ab6d49a ContainerID="90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-4ls7l" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0" Sep 12 23:09:50.143066 containerd[1565]: 2025-09-12 23:09:50.126 [INFO][4785] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-4ls7l" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0" Sep 12 23:09:50.143135 containerd[1565]: 2025-09-12 23:09:50.126 [INFO][4785] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-4ls7l" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0", 
GenerateName:"calico-apiserver-6bfd95f59f-", Namespace:"calico-apiserver", SelfLink:"", UID:"da8a039d-7002-456d-8c66-80d87fca79e8", ResourceVersion:"1026", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bfd95f59f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259", Pod:"calico-apiserver-6bfd95f59f-4ls7l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali62d1ab6d49a", MAC:"26:d3:19:1f:f7:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:50.143185 containerd[1565]: 2025-09-12 23:09:50.137 [INFO][4785] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" Namespace="calico-apiserver" Pod="calico-apiserver-6bfd95f59f-4ls7l" WorkloadEndpoint="localhost-k8s-calico--apiserver--6bfd95f59f--4ls7l-eth0" Sep 12 23:09:50.180742 containerd[1565]: time="2025-09-12T23:09:50.180684779Z" level=info msg="connecting to shim 90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259" 
address="unix:///run/containerd/s/102fb3762c13930579341edf6153c0359d3bb783c424a7fe4d83fef1414b600f" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:09:50.217808 systemd[1]: Started cri-containerd-90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259.scope - libcontainer container 90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259. Sep 12 23:09:50.233099 systemd-resolved[1425]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 23:09:50.267752 containerd[1565]: time="2025-09-12T23:09:50.267688294Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bfd95f59f-4ls7l,Uid:da8a039d-7002-456d-8c66-80d87fca79e8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259\"" Sep 12 23:09:50.273130 containerd[1565]: time="2025-09-12T23:09:50.273091134Z" level=info msg="CreateContainer within sandbox \"90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 23:09:50.282090 containerd[1565]: time="2025-09-12T23:09:50.282057159Z" level=info msg="Container b3c04627ae6287ed32322635f352b8e45c54ef19cf7ccc066b6fbceae5243bac: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:50.289897 containerd[1565]: time="2025-09-12T23:09:50.289860734Z" level=info msg="CreateContainer within sandbox \"90bce2585ebd226235ece84f2d78cc81aa57f0bef4517e2ea2d730b7d56ab259\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b3c04627ae6287ed32322635f352b8e45c54ef19cf7ccc066b6fbceae5243bac\"" Sep 12 23:09:50.290404 containerd[1565]: time="2025-09-12T23:09:50.290293085Z" level=info msg="StartContainer for \"b3c04627ae6287ed32322635f352b8e45c54ef19cf7ccc066b6fbceae5243bac\"" Sep 12 23:09:50.291498 containerd[1565]: time="2025-09-12T23:09:50.291469663Z" level=info msg="connecting to shim 
b3c04627ae6287ed32322635f352b8e45c54ef19cf7ccc066b6fbceae5243bac" address="unix:///run/containerd/s/102fb3762c13930579341edf6153c0359d3bb783c424a7fe4d83fef1414b600f" protocol=ttrpc version=3 Sep 12 23:09:50.317833 systemd[1]: Started cri-containerd-b3c04627ae6287ed32322635f352b8e45c54ef19cf7ccc066b6fbceae5243bac.scope - libcontainer container b3c04627ae6287ed32322635f352b8e45c54ef19cf7ccc066b6fbceae5243bac. Sep 12 23:09:50.380535 containerd[1565]: time="2025-09-12T23:09:50.380404512Z" level=info msg="StartContainer for \"b3c04627ae6287ed32322635f352b8e45c54ef19cf7ccc066b6fbceae5243bac\" returns successfully" Sep 12 23:09:50.607344 kubelet[2735]: I0912 23:09:50.607269 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6bfd95f59f-4ls7l" podStartSLOduration=1.607255011 podStartE2EDuration="1.607255011s" podCreationTimestamp="2025-09-12 23:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:09:50.593782909 +0000 UTC m=+56.889820651" watchObservedRunningTime="2025-09-12 23:09:50.607255011 +0000 UTC m=+56.903292752" Sep 12 23:09:51.455904 systemd-networkd[1450]: cali62d1ab6d49a: Gained IPv6LL Sep 12 23:09:51.584546 kubelet[2735]: I0912 23:09:51.584469 2735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:09:51.911354 kubelet[2735]: I0912 23:09:51.910388 2735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e769e284-06fa-4f9a-98a3-28a76d353dce" path="/var/lib/kubelet/pods/e769e284-06fa-4f9a-98a3-28a76d353dce/volumes" Sep 12 23:09:52.162513 systemd[1]: Started sshd@7-10.0.0.150:22-10.0.0.1:49954.service - OpenSSH per-connection server daemon (10.0.0.1:49954). 
Sep 12 23:09:52.241519 sshd[4919]: Accepted publickey for core from 10.0.0.1 port 49954 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:09:52.244377 sshd-session[4919]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:09:52.251413 systemd-logind[1546]: New session 8 of user core. Sep 12 23:09:52.260847 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 12 23:09:52.424496 sshd[4922]: Connection closed by 10.0.0.1 port 49954 Sep 12 23:09:52.425382 sshd-session[4919]: pam_unix(sshd:session): session closed for user core Sep 12 23:09:52.432561 systemd[1]: sshd@7-10.0.0.150:22-10.0.0.1:49954.service: Deactivated successfully. Sep 12 23:09:52.435493 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 23:09:52.436380 systemd-logind[1546]: Session 8 logged out. Waiting for processes to exit. Sep 12 23:09:52.437960 systemd-logind[1546]: Removed session 8. Sep 12 23:09:52.545643 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2782575963.mount: Deactivated successfully. 
Sep 12 23:09:53.181425 containerd[1565]: time="2025-09-12T23:09:53.181344366Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:53.182533 containerd[1565]: time="2025-09-12T23:09:53.182469607Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 23:09:53.184049 containerd[1565]: time="2025-09-12T23:09:53.184008686Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:53.186102 containerd[1565]: time="2025-09-12T23:09:53.186063010Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:53.186621 containerd[1565]: time="2025-09-12T23:09:53.186581072Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 3.174899733s" Sep 12 23:09:53.186621 containerd[1565]: time="2025-09-12T23:09:53.186606730Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 23:09:53.191408 containerd[1565]: time="2025-09-12T23:09:53.191368416Z" level=info msg="CreateContainer within sandbox \"a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 23:09:53.197618 
containerd[1565]: time="2025-09-12T23:09:53.197592765Z" level=info msg="Container 582e589b90c1baee2254b6ac20cf543236eae6227bd0de2aab3dfaa39740ca1c: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:53.208037 containerd[1565]: time="2025-09-12T23:09:53.207988762Z" level=info msg="CreateContainer within sandbox \"a4ec8ce69bfd5dd15391a5eb8f45a2c5341a36dce204f5498dc64f614dd52b23\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"582e589b90c1baee2254b6ac20cf543236eae6227bd0de2aab3dfaa39740ca1c\"" Sep 12 23:09:53.208487 containerd[1565]: time="2025-09-12T23:09:53.208452863Z" level=info msg="StartContainer for \"582e589b90c1baee2254b6ac20cf543236eae6227bd0de2aab3dfaa39740ca1c\"" Sep 12 23:09:53.209778 containerd[1565]: time="2025-09-12T23:09:53.209748163Z" level=info msg="connecting to shim 582e589b90c1baee2254b6ac20cf543236eae6227bd0de2aab3dfaa39740ca1c" address="unix:///run/containerd/s/87eb3522931f72f876530478a5a37069b825599d7d9fc7f46591aa5073358e05" protocol=ttrpc version=3 Sep 12 23:09:53.234828 systemd[1]: Started cri-containerd-582e589b90c1baee2254b6ac20cf543236eae6227bd0de2aab3dfaa39740ca1c.scope - libcontainer container 582e589b90c1baee2254b6ac20cf543236eae6227bd0de2aab3dfaa39740ca1c. 
Sep 12 23:09:53.282635 containerd[1565]: time="2025-09-12T23:09:53.282584966Z" level=info msg="StartContainer for \"582e589b90c1baee2254b6ac20cf543236eae6227bd0de2aab3dfaa39740ca1c\" returns successfully" Sep 12 23:09:53.628933 kubelet[2735]: I0912 23:09:53.628859 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-78dd598d96-244zs" podStartSLOduration=1.759071133 podStartE2EDuration="9.62882619s" podCreationTimestamp="2025-09-12 23:09:44 +0000 UTC" firstStartedPulling="2025-09-12 23:09:45.31759085 +0000 UTC m=+51.613628591" lastFinishedPulling="2025-09-12 23:09:53.187345907 +0000 UTC m=+59.483383648" observedRunningTime="2025-09-12 23:09:53.626120233 +0000 UTC m=+59.922157974" watchObservedRunningTime="2025-09-12 23:09:53.62882619 +0000 UTC m=+59.924863921" Sep 12 23:09:54.901376 containerd[1565]: time="2025-09-12T23:09:54.901314064Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-547fc7c647-tnhhc,Uid:689cc000-12e9-401f-b65b-8e76eeb53d27,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:55.043183 systemd-networkd[1450]: cali176cc254c96: Link UP Sep 12 23:09:55.043932 systemd-networkd[1450]: cali176cc254c96: Gained carrier Sep 12 23:09:55.062619 containerd[1565]: 2025-09-12 23:09:54.977 [INFO][4982] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0 calico-kube-controllers-547fc7c647- calico-system 689cc000-12e9-401f-b65b-8e76eeb53d27 862 0 2025-09-12 23:09:13 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:547fc7c647 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-547fc7c647-tnhhc eth0 calico-kube-controllers [] [] [kns.calico-system 
ksa.calico-system.calico-kube-controllers] cali176cc254c96 [] [] }} ContainerID="0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" Namespace="calico-system" Pod="calico-kube-controllers-547fc7c647-tnhhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-" Sep 12 23:09:55.062619 containerd[1565]: 2025-09-12 23:09:54.977 [INFO][4982] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" Namespace="calico-system" Pod="calico-kube-controllers-547fc7c647-tnhhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0" Sep 12 23:09:55.062619 containerd[1565]: 2025-09-12 23:09:55.004 [INFO][4996] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" HandleID="k8s-pod-network.0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" Workload="localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0" Sep 12 23:09:55.062854 containerd[1565]: 2025-09-12 23:09:55.004 [INFO][4996] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" HandleID="k8s-pod-network.0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" Workload="localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000135720), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-547fc7c647-tnhhc", "timestamp":"2025-09-12 23:09:55.004549811 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:09:55.062854 containerd[1565]: 2025-09-12 23:09:55.004 [INFO][4996] 
ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:09:55.062854 containerd[1565]: 2025-09-12 23:09:55.004 [INFO][4996] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:09:55.062854 containerd[1565]: 2025-09-12 23:09:55.004 [INFO][4996] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 23:09:55.062854 containerd[1565]: 2025-09-12 23:09:55.012 [INFO][4996] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" host="localhost" Sep 12 23:09:55.062854 containerd[1565]: 2025-09-12 23:09:55.017 [INFO][4996] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 23:09:55.062854 containerd[1565]: 2025-09-12 23:09:55.021 [INFO][4996] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 23:09:55.062854 containerd[1565]: 2025-09-12 23:09:55.023 [INFO][4996] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:55.062854 containerd[1565]: 2025-09-12 23:09:55.025 [INFO][4996] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:55.062854 containerd[1565]: 2025-09-12 23:09:55.025 [INFO][4996] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" host="localhost" Sep 12 23:09:55.063079 containerd[1565]: 2025-09-12 23:09:55.026 [INFO][4996] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694 Sep 12 23:09:55.063079 containerd[1565]: 2025-09-12 23:09:55.031 [INFO][4996] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" host="localhost" Sep 12 23:09:55.063079 
containerd[1565]: 2025-09-12 23:09:55.036 [INFO][4996] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" host="localhost" Sep 12 23:09:55.063079 containerd[1565]: 2025-09-12 23:09:55.036 [INFO][4996] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" host="localhost" Sep 12 23:09:55.063079 containerd[1565]: 2025-09-12 23:09:55.036 [INFO][4996] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:09:55.063079 containerd[1565]: 2025-09-12 23:09:55.036 [INFO][4996] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" HandleID="k8s-pod-network.0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" Workload="localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0" Sep 12 23:09:55.063229 containerd[1565]: 2025-09-12 23:09:55.040 [INFO][4982] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" Namespace="calico-system" Pod="calico-kube-controllers-547fc7c647-tnhhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0", GenerateName:"calico-kube-controllers-547fc7c647-", Namespace:"calico-system", SelfLink:"", UID:"689cc000-12e9-401f-b65b-8e76eeb53d27", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"547fc7c647", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-547fc7c647-tnhhc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali176cc254c96", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:55.063286 containerd[1565]: 2025-09-12 23:09:55.040 [INFO][4982] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" Namespace="calico-system" Pod="calico-kube-controllers-547fc7c647-tnhhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0" Sep 12 23:09:55.063286 containerd[1565]: 2025-09-12 23:09:55.041 [INFO][4982] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali176cc254c96 ContainerID="0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" Namespace="calico-system" Pod="calico-kube-controllers-547fc7c647-tnhhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0" Sep 12 23:09:55.063286 containerd[1565]: 2025-09-12 23:09:55.044 [INFO][4982] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" 
Namespace="calico-system" Pod="calico-kube-controllers-547fc7c647-tnhhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0" Sep 12 23:09:55.063353 containerd[1565]: 2025-09-12 23:09:55.044 [INFO][4982] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" Namespace="calico-system" Pod="calico-kube-controllers-547fc7c647-tnhhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0", GenerateName:"calico-kube-controllers-547fc7c647-", Namespace:"calico-system", SelfLink:"", UID:"689cc000-12e9-401f-b65b-8e76eeb53d27", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"547fc7c647", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694", Pod:"calico-kube-controllers-547fc7c647-tnhhc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, 
InterfaceName:"cali176cc254c96", MAC:"3a:44:40:be:1b:e7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:55.063403 containerd[1565]: 2025-09-12 23:09:55.056 [INFO][4982] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" Namespace="calico-system" Pod="calico-kube-controllers-547fc7c647-tnhhc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--547fc7c647--tnhhc-eth0" Sep 12 23:09:55.092732 containerd[1565]: time="2025-09-12T23:09:55.092666202Z" level=info msg="connecting to shim 0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694" address="unix:///run/containerd/s/9a9e6ead3724950a870ad9ce7834b65fb11aa5d50c53dce33802de30b54f82c9" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:09:55.127865 systemd[1]: Started cri-containerd-0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694.scope - libcontainer container 0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694. 
Sep 12 23:09:55.144469 systemd-resolved[1425]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 23:09:55.177736 containerd[1565]: time="2025-09-12T23:09:55.177562633Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-547fc7c647-tnhhc,Uid:689cc000-12e9-401f-b65b-8e76eeb53d27,Namespace:calico-system,Attempt:0,} returns sandbox id \"0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694\"" Sep 12 23:09:55.186420 containerd[1565]: time="2025-09-12T23:09:55.186347666Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 23:09:55.899666 containerd[1565]: time="2025-09-12T23:09:55.899594697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-r4tmb,Uid:bb420c54-7b0f-4c5c-8007-3e79ab20da98,Namespace:kube-system,Attempt:0,}" Sep 12 23:09:56.063958 systemd-networkd[1450]: calif400f3e2661: Link UP Sep 12 23:09:56.065102 systemd-networkd[1450]: calif400f3e2661: Gained carrier Sep 12 23:09:56.085677 containerd[1565]: 2025-09-12 23:09:55.939 [INFO][5061] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0 coredns-674b8bbfcf- kube-system bb420c54-7b0f-4c5c-8007-3e79ab20da98 868 0 2025-09-12 23:09:00 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-r4tmb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif400f3e2661 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" Namespace="kube-system" Pod="coredns-674b8bbfcf-r4tmb" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--r4tmb-" Sep 12 23:09:56.085677 containerd[1565]: 2025-09-12 23:09:55.939 
[INFO][5061] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" Namespace="kube-system" Pod="coredns-674b8bbfcf-r4tmb" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0" Sep 12 23:09:56.085677 containerd[1565]: 2025-09-12 23:09:55.969 [INFO][5074] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" HandleID="k8s-pod-network.430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" Workload="localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0" Sep 12 23:09:56.086218 containerd[1565]: 2025-09-12 23:09:55.977 [INFO][5074] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" HandleID="k8s-pod-network.430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" Workload="localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df6b0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-r4tmb", "timestamp":"2025-09-12 23:09:55.96960184 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:09:56.086218 containerd[1565]: 2025-09-12 23:09:55.977 [INFO][5074] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:09:56.086218 containerd[1565]: 2025-09-12 23:09:55.977 [INFO][5074] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:09:56.086218 containerd[1565]: 2025-09-12 23:09:55.977 [INFO][5074] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 23:09:56.086218 containerd[1565]: 2025-09-12 23:09:55.986 [INFO][5074] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" host="localhost" Sep 12 23:09:56.086218 containerd[1565]: 2025-09-12 23:09:55.990 [INFO][5074] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 23:09:56.086218 containerd[1565]: 2025-09-12 23:09:55.995 [INFO][5074] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 23:09:56.086218 containerd[1565]: 2025-09-12 23:09:55.996 [INFO][5074] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:56.086218 containerd[1565]: 2025-09-12 23:09:55.999 [INFO][5074] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:56.086218 containerd[1565]: 2025-09-12 23:09:55.999 [INFO][5074] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" host="localhost" Sep 12 23:09:56.086453 containerd[1565]: 2025-09-12 23:09:56.000 [INFO][5074] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71 Sep 12 23:09:56.086453 containerd[1565]: 2025-09-12 23:09:56.023 [INFO][5074] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" host="localhost" Sep 12 23:09:56.086453 containerd[1565]: 2025-09-12 23:09:56.056 [INFO][5074] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" host="localhost" Sep 12 23:09:56.086453 containerd[1565]: 2025-09-12 23:09:56.056 [INFO][5074] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" host="localhost" Sep 12 23:09:56.086453 containerd[1565]: 2025-09-12 23:09:56.056 [INFO][5074] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:09:56.086453 containerd[1565]: 2025-09-12 23:09:56.056 [INFO][5074] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" HandleID="k8s-pod-network.430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" Workload="localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0" Sep 12 23:09:56.086584 containerd[1565]: 2025-09-12 23:09:56.060 [INFO][5061] cni-plugin/k8s.go 418: Populated endpoint ContainerID="430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" Namespace="kube-system" Pod="coredns-674b8bbfcf-r4tmb" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"bb420c54-7b0f-4c5c-8007-3e79ab20da98", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-r4tmb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif400f3e2661", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:56.086682 containerd[1565]: 2025-09-12 23:09:56.060 [INFO][5061] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" Namespace="kube-system" Pod="coredns-674b8bbfcf-r4tmb" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0" Sep 12 23:09:56.086682 containerd[1565]: 2025-09-12 23:09:56.060 [INFO][5061] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif400f3e2661 ContainerID="430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" Namespace="kube-system" Pod="coredns-674b8bbfcf-r4tmb" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0" Sep 12 23:09:56.086682 containerd[1565]: 2025-09-12 23:09:56.064 [INFO][5061] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" Namespace="kube-system" Pod="coredns-674b8bbfcf-r4tmb" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0" Sep 12 23:09:56.086755 containerd[1565]: 2025-09-12 23:09:56.068 [INFO][5061] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" Namespace="kube-system" Pod="coredns-674b8bbfcf-r4tmb" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"bb420c54-7b0f-4c5c-8007-3e79ab20da98", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71", Pod:"coredns-674b8bbfcf-r4tmb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif400f3e2661", MAC:"4e:37:8c:a2:a6:8f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:56.086755 containerd[1565]: 2025-09-12 23:09:56.081 [INFO][5061] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" Namespace="kube-system" Pod="coredns-674b8bbfcf-r4tmb" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--r4tmb-eth0" Sep 12 23:09:56.113324 containerd[1565]: time="2025-09-12T23:09:56.113265920Z" level=info msg="connecting to shim 430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71" address="unix:///run/containerd/s/2b69755020a5763bbcd3ac16c55f0112b2de542628585d6862c0d5ae3ca47a6c" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:09:56.145821 systemd[1]: Started cri-containerd-430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71.scope - libcontainer container 430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71. 
Sep 12 23:09:56.161977 systemd-resolved[1425]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 23:09:56.197049 containerd[1565]: time="2025-09-12T23:09:56.196980938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-r4tmb,Uid:bb420c54-7b0f-4c5c-8007-3e79ab20da98,Namespace:kube-system,Attempt:0,} returns sandbox id \"430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71\"" Sep 12 23:09:56.204927 containerd[1565]: time="2025-09-12T23:09:56.204870221Z" level=info msg="CreateContainer within sandbox \"430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 23:09:56.224495 containerd[1565]: time="2025-09-12T23:09:56.223953766Z" level=info msg="Container 2aef9144f2dabbbea73e905dffe839ef0796783018b57be53101c63829669742: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:56.224184 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3231304093.mount: Deactivated successfully. 
Sep 12 23:09:56.235671 containerd[1565]: time="2025-09-12T23:09:56.235588265Z" level=info msg="CreateContainer within sandbox \"430d8a92376db0504e435beee05ed8774f025c99733727edd4cc3ccb191d4d71\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"2aef9144f2dabbbea73e905dffe839ef0796783018b57be53101c63829669742\"" Sep 12 23:09:56.236477 containerd[1565]: time="2025-09-12T23:09:56.236311893Z" level=info msg="StartContainer for \"2aef9144f2dabbbea73e905dffe839ef0796783018b57be53101c63829669742\"" Sep 12 23:09:56.238068 containerd[1565]: time="2025-09-12T23:09:56.238027241Z" level=info msg="connecting to shim 2aef9144f2dabbbea73e905dffe839ef0796783018b57be53101c63829669742" address="unix:///run/containerd/s/2b69755020a5763bbcd3ac16c55f0112b2de542628585d6862c0d5ae3ca47a6c" protocol=ttrpc version=3 Sep 12 23:09:56.259812 systemd[1]: Started cri-containerd-2aef9144f2dabbbea73e905dffe839ef0796783018b57be53101c63829669742.scope - libcontainer container 2aef9144f2dabbbea73e905dffe839ef0796783018b57be53101c63829669742. Sep 12 23:09:56.447886 systemd-networkd[1450]: cali176cc254c96: Gained IPv6LL Sep 12 23:09:56.531033 containerd[1565]: time="2025-09-12T23:09:56.530974114Z" level=info msg="StartContainer for \"2aef9144f2dabbbea73e905dffe839ef0796783018b57be53101c63829669742\" returns successfully" Sep 12 23:09:57.449188 systemd[1]: Started sshd@8-10.0.0.150:22-10.0.0.1:49970.service - OpenSSH per-connection server daemon (10.0.0.1:49970). Sep 12 23:09:57.534137 sshd[5184]: Accepted publickey for core from 10.0.0.1 port 49970 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:09:57.535939 sshd-session[5184]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:09:57.540293 systemd-logind[1546]: New session 9 of user core. Sep 12 23:09:57.546783 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 12 23:09:57.647211 kubelet[2735]: I0912 23:09:57.647135 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-r4tmb" podStartSLOduration=57.64712103 podStartE2EDuration="57.64712103s" podCreationTimestamp="2025-09-12 23:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:09:56.643961633 +0000 UTC m=+62.939999374" watchObservedRunningTime="2025-09-12 23:09:57.64712103 +0000 UTC m=+63.943158771" Sep 12 23:09:57.717774 sshd[5188]: Connection closed by 10.0.0.1 port 49970 Sep 12 23:09:57.718162 sshd-session[5184]: pam_unix(sshd:session): session closed for user core Sep 12 23:09:57.723790 systemd-logind[1546]: Session 9 logged out. Waiting for processes to exit. Sep 12 23:09:57.724359 systemd[1]: sshd@8-10.0.0.150:22-10.0.0.1:49970.service: Deactivated successfully. Sep 12 23:09:57.726792 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 23:09:57.728808 systemd-logind[1546]: Removed session 9. 
Sep 12 23:09:57.900185 containerd[1565]: time="2025-09-12T23:09:57.900116551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bm7r8,Uid:0e4de4ee-2e8e-4032-b7ab-9b77d4141fea,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:58.048853 systemd-networkd[1450]: calif400f3e2661: Gained IPv6LL Sep 12 23:09:58.117035 systemd-networkd[1450]: calib54bebd08c8: Link UP Sep 12 23:09:58.117888 systemd-networkd[1450]: calib54bebd08c8: Gained carrier Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.034 [INFO][5204] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--bm7r8-eth0 csi-node-driver- calico-system 0e4de4ee-2e8e-4032-b7ab-9b77d4141fea 753 0 2025-09-12 23:09:13 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-bm7r8 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calib54bebd08c8 [] [] }} ContainerID="ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" Namespace="calico-system" Pod="csi-node-driver-bm7r8" WorkloadEndpoint="localhost-k8s-csi--node--driver--bm7r8-" Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.035 [INFO][5204] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" Namespace="calico-system" Pod="csi-node-driver-bm7r8" WorkloadEndpoint="localhost-k8s-csi--node--driver--bm7r8-eth0" Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.071 [INFO][5225] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" 
HandleID="k8s-pod-network.ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" Workload="localhost-k8s-csi--node--driver--bm7r8-eth0" Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.071 [INFO][5225] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" HandleID="k8s-pod-network.ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" Workload="localhost-k8s-csi--node--driver--bm7r8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7010), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-bm7r8", "timestamp":"2025-09-12 23:09:58.071067247 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.071 [INFO][5225] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.071 [INFO][5225] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.071 [INFO][5225] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.078 [INFO][5225] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" host="localhost" Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.084 [INFO][5225] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.089 [INFO][5225] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.091 [INFO][5225] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.093 [INFO][5225] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.093 [INFO][5225] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" host="localhost" Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.096 [INFO][5225] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193 Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.103 [INFO][5225] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" host="localhost" Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.109 [INFO][5225] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" host="localhost" Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.109 [INFO][5225] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" host="localhost" Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.109 [INFO][5225] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:09:58.135105 containerd[1565]: 2025-09-12 23:09:58.109 [INFO][5225] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" HandleID="k8s-pod-network.ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" Workload="localhost-k8s-csi--node--driver--bm7r8-eth0" Sep 12 23:09:58.135887 containerd[1565]: 2025-09-12 23:09:58.114 [INFO][5204] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" Namespace="calico-system" Pod="csi-node-driver-bm7r8" WorkloadEndpoint="localhost-k8s-csi--node--driver--bm7r8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--bm7r8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0e4de4ee-2e8e-4032-b7ab-9b77d4141fea", ResourceVersion:"753", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-bm7r8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib54bebd08c8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:58.135887 containerd[1565]: 2025-09-12 23:09:58.114 [INFO][5204] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" Namespace="calico-system" Pod="csi-node-driver-bm7r8" WorkloadEndpoint="localhost-k8s-csi--node--driver--bm7r8-eth0" Sep 12 23:09:58.135887 containerd[1565]: 2025-09-12 23:09:58.114 [INFO][5204] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib54bebd08c8 ContainerID="ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" Namespace="calico-system" Pod="csi-node-driver-bm7r8" WorkloadEndpoint="localhost-k8s-csi--node--driver--bm7r8-eth0" Sep 12 23:09:58.135887 containerd[1565]: 2025-09-12 23:09:58.118 [INFO][5204] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" Namespace="calico-system" Pod="csi-node-driver-bm7r8" WorkloadEndpoint="localhost-k8s-csi--node--driver--bm7r8-eth0" Sep 12 23:09:58.135887 containerd[1565]: 2025-09-12 23:09:58.118 [INFO][5204] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" 
Namespace="calico-system" Pod="csi-node-driver-bm7r8" WorkloadEndpoint="localhost-k8s-csi--node--driver--bm7r8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--bm7r8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0e4de4ee-2e8e-4032-b7ab-9b77d4141fea", ResourceVersion:"753", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193", Pod:"csi-node-driver-bm7r8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib54bebd08c8", MAC:"6a:61:04:69:87:d5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:58.135887 containerd[1565]: 2025-09-12 23:09:58.129 [INFO][5204] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" Namespace="calico-system" Pod="csi-node-driver-bm7r8" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--bm7r8-eth0" Sep 12 23:09:58.203342 containerd[1565]: time="2025-09-12T23:09:58.203017987Z" level=info msg="connecting to shim ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193" address="unix:///run/containerd/s/c4532843062b4f31466b23c03d85f35b2692b50e807a225e11c8683e6e889f94" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:09:58.263017 systemd[1]: Started cri-containerd-ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193.scope - libcontainer container ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193. Sep 12 23:09:58.279317 systemd-resolved[1425]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 23:09:58.299979 containerd[1565]: time="2025-09-12T23:09:58.299889139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bm7r8,Uid:0e4de4ee-2e8e-4032-b7ab-9b77d4141fea,Namespace:calico-system,Attempt:0,} returns sandbox id \"ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193\"" Sep 12 23:09:58.592310 containerd[1565]: time="2025-09-12T23:09:58.592197816Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:58.593570 containerd[1565]: time="2025-09-12T23:09:58.593550860Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 12 23:09:58.594925 containerd[1565]: time="2025-09-12T23:09:58.594904646Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:58.597194 containerd[1565]: time="2025-09-12T23:09:58.597150451Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:09:58.597873 containerd[1565]: time="2025-09-12T23:09:58.597829649Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.411441455s" Sep 12 23:09:58.597917 containerd[1565]: time="2025-09-12T23:09:58.597875036Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 12 23:09:58.598766 containerd[1565]: time="2025-09-12T23:09:58.598732801Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 23:09:58.610456 containerd[1565]: time="2025-09-12T23:09:58.610391392Z" level=info msg="CreateContainer within sandbox \"0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 23:09:58.620532 containerd[1565]: time="2025-09-12T23:09:58.620503354Z" level=info msg="Container 74aa0f40a2a1f1247814a05f7594ebdf55e7b9982ca008f4930c2c8687b369f9: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:09:58.766642 containerd[1565]: time="2025-09-12T23:09:58.766592266Z" level=info msg="CreateContainer within sandbox \"0e8b068d3bad489d65ee09789be4547f9f2b5308fca4955297f285a6ba32b694\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"74aa0f40a2a1f1247814a05f7594ebdf55e7b9982ca008f4930c2c8687b369f9\"" Sep 12 23:09:58.767100 containerd[1565]: time="2025-09-12T23:09:58.767071856Z" level=info msg="StartContainer for \"74aa0f40a2a1f1247814a05f7594ebdf55e7b9982ca008f4930c2c8687b369f9\"" Sep 12 23:09:58.768365 containerd[1565]: 
time="2025-09-12T23:09:58.768339564Z" level=info msg="connecting to shim 74aa0f40a2a1f1247814a05f7594ebdf55e7b9982ca008f4930c2c8687b369f9" address="unix:///run/containerd/s/9a9e6ead3724950a870ad9ce7834b65fb11aa5d50c53dce33802de30b54f82c9" protocol=ttrpc version=3 Sep 12 23:09:58.790787 systemd[1]: Started cri-containerd-74aa0f40a2a1f1247814a05f7594ebdf55e7b9982ca008f4930c2c8687b369f9.scope - libcontainer container 74aa0f40a2a1f1247814a05f7594ebdf55e7b9982ca008f4930c2c8687b369f9. Sep 12 23:09:58.844634 containerd[1565]: time="2025-09-12T23:09:58.844533474Z" level=info msg="StartContainer for \"74aa0f40a2a1f1247814a05f7594ebdf55e7b9982ca008f4930c2c8687b369f9\" returns successfully" Sep 12 23:09:58.899762 containerd[1565]: time="2025-09-12T23:09:58.899691403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-w8xwj,Uid:ee957723-a32e-4e2c-b9a1-bcecc48992cf,Namespace:kube-system,Attempt:0,}" Sep 12 23:09:58.900261 containerd[1565]: time="2025-09-12T23:09:58.900211823Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xn4hp,Uid:908cf082-bf9d-4477-b109-2080b0bf0e32,Namespace:calico-system,Attempt:0,}" Sep 12 23:09:59.397297 systemd-networkd[1450]: calicc98541f198: Link UP Sep 12 23:09:59.397920 systemd-networkd[1450]: calicc98541f198: Gained carrier Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.094 [INFO][5348] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--xn4hp-eth0 goldmane-54d579b49d- calico-system 908cf082-bf9d-4477-b109-2080b0bf0e32 876 0 2025-09-12 23:09:12 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-54d579b49d-xn4hp eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] 
calicc98541f198 [] [] }} ContainerID="c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" Namespace="calico-system" Pod="goldmane-54d579b49d-xn4hp" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xn4hp-" Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.094 [INFO][5348] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" Namespace="calico-system" Pod="goldmane-54d579b49d-xn4hp" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xn4hp-eth0" Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.233 [INFO][5367] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" HandleID="k8s-pod-network.c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" Workload="localhost-k8s-goldmane--54d579b49d--xn4hp-eth0" Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.233 [INFO][5367] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" HandleID="k8s-pod-network.c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" Workload="localhost-k8s-goldmane--54d579b49d--xn4hp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000427680), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-xn4hp", "timestamp":"2025-09-12 23:09:59.233285129 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.233 [INFO][5367] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.233 [INFO][5367] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.233 [INFO][5367] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.239 [INFO][5367] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" host="localhost" Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.243 [INFO][5367] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.247 [INFO][5367] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.248 [INFO][5367] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.250 [INFO][5367] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.250 [INFO][5367] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" host="localhost" Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.251 [INFO][5367] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.275 [INFO][5367] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" host="localhost" Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.391 [INFO][5367] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" host="localhost" Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.391 [INFO][5367] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" host="localhost" Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.391 [INFO][5367] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:09:59.457574 containerd[1565]: 2025-09-12 23:09:59.391 [INFO][5367] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" HandleID="k8s-pod-network.c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" Workload="localhost-k8s-goldmane--54d579b49d--xn4hp-eth0" Sep 12 23:09:59.458683 containerd[1565]: 2025-09-12 23:09:59.394 [INFO][5348] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" Namespace="calico-system" Pod="goldmane-54d579b49d-xn4hp" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xn4hp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--xn4hp-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"908cf082-bf9d-4477-b109-2080b0bf0e32", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-xn4hp", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calicc98541f198", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:59.458683 containerd[1565]: 2025-09-12 23:09:59.394 [INFO][5348] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" Namespace="calico-system" Pod="goldmane-54d579b49d-xn4hp" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xn4hp-eth0" Sep 12 23:09:59.458683 containerd[1565]: 2025-09-12 23:09:59.394 [INFO][5348] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicc98541f198 ContainerID="c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" Namespace="calico-system" Pod="goldmane-54d579b49d-xn4hp" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xn4hp-eth0" Sep 12 23:09:59.458683 containerd[1565]: 2025-09-12 23:09:59.397 [INFO][5348] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" Namespace="calico-system" Pod="goldmane-54d579b49d-xn4hp" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xn4hp-eth0" Sep 12 23:09:59.458683 containerd[1565]: 2025-09-12 23:09:59.399 [INFO][5348] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" Namespace="calico-system" Pod="goldmane-54d579b49d-xn4hp" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xn4hp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--xn4hp-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"908cf082-bf9d-4477-b109-2080b0bf0e32", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe", Pod:"goldmane-54d579b49d-xn4hp", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calicc98541f198", MAC:"c6:f5:a6:7b:a6:a4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:59.458683 containerd[1565]: 2025-09-12 23:09:59.453 [INFO][5348] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" Namespace="calico-system" Pod="goldmane-54d579b49d-xn4hp" 
WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--xn4hp-eth0" Sep 12 23:09:59.652333 systemd-networkd[1450]: cali305c0a76bc9: Link UP Sep 12 23:09:59.655852 systemd-networkd[1450]: cali305c0a76bc9: Gained carrier Sep 12 23:09:59.703870 containerd[1565]: time="2025-09-12T23:09:59.703816766Z" level=info msg="TaskExit event in podsandbox handler container_id:\"74aa0f40a2a1f1247814a05f7594ebdf55e7b9982ca008f4930c2c8687b369f9\" id:\"083797f9fb7a675573da0000a1942a43df45638036260729a7c3e1128f4a73f2\" pid:5410 exited_at:{seconds:1757718599 nanos:702954515}" Sep 12 23:09:59.864576 kubelet[2735]: I0912 23:09:59.864496 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-547fc7c647-tnhhc" podStartSLOduration=43.451908456 podStartE2EDuration="46.86447586s" podCreationTimestamp="2025-09-12 23:09:13 +0000 UTC" firstStartedPulling="2025-09-12 23:09:55.185976149 +0000 UTC m=+61.482013900" lastFinishedPulling="2025-09-12 23:09:58.598543563 +0000 UTC m=+64.894581304" observedRunningTime="2025-09-12 23:09:59.864127275 +0000 UTC m=+66.160165016" watchObservedRunningTime="2025-09-12 23:09:59.86447586 +0000 UTC m=+66.160513601" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.096 [INFO][5336] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0 coredns-674b8bbfcf- kube-system ee957723-a32e-4e2c-b9a1-bcecc48992cf 875 0 2025-09-12 23:09:00 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-w8xwj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali305c0a76bc9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-w8xwj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--w8xwj-" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.096 [INFO][5336] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" Namespace="kube-system" Pod="coredns-674b8bbfcf-w8xwj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.233 [INFO][5369] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" HandleID="k8s-pod-network.c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" Workload="localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.233 [INFO][5369] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" HandleID="k8s-pod-network.c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" Workload="localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c6fe0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-w8xwj", "timestamp":"2025-09-12 23:09:59.233726775 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.233 [INFO][5369] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.391 [INFO][5369] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.391 [INFO][5369] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.401 [INFO][5369] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" host="localhost" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.454 [INFO][5369] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.460 [INFO][5369] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.462 [INFO][5369] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.464 [INFO][5369] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.464 [INFO][5369] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" host="localhost" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.466 [INFO][5369] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580 Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.479 [INFO][5369] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" host="localhost" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.641 [INFO][5369] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 
handle="k8s-pod-network.c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" host="localhost" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.641 [INFO][5369] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" host="localhost" Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.641 [INFO][5369] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:09:59.873528 containerd[1565]: 2025-09-12 23:09:59.641 [INFO][5369] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" HandleID="k8s-pod-network.c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" Workload="localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0" Sep 12 23:09:59.874125 containerd[1565]: 2025-09-12 23:09:59.647 [INFO][5336] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" Namespace="kube-system" Pod="coredns-674b8bbfcf-w8xwj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"ee957723-a32e-4e2c-b9a1-bcecc48992cf", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-w8xwj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali305c0a76bc9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:59.874125 containerd[1565]: 2025-09-12 23:09:59.647 [INFO][5336] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" Namespace="kube-system" Pod="coredns-674b8bbfcf-w8xwj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0" Sep 12 23:09:59.874125 containerd[1565]: 2025-09-12 23:09:59.647 [INFO][5336] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali305c0a76bc9 ContainerID="c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" Namespace="kube-system" Pod="coredns-674b8bbfcf-w8xwj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0" Sep 12 23:09:59.874125 containerd[1565]: 2025-09-12 23:09:59.652 [INFO][5336] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" Namespace="kube-system" Pod="coredns-674b8bbfcf-w8xwj" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0" Sep 12 23:09:59.874125 containerd[1565]: 2025-09-12 23:09:59.653 [INFO][5336] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" Namespace="kube-system" Pod="coredns-674b8bbfcf-w8xwj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"ee957723-a32e-4e2c-b9a1-bcecc48992cf", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 9, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580", Pod:"coredns-674b8bbfcf-w8xwj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali305c0a76bc9", MAC:"1e:e9:72:3a:29:70", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:09:59.874125 containerd[1565]: 2025-09-12 23:09:59.866 [INFO][5336] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" Namespace="kube-system" Pod="coredns-674b8bbfcf-w8xwj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--w8xwj-eth0" Sep 12 23:09:59.967883 systemd-networkd[1450]: calib54bebd08c8: Gained IPv6LL Sep 12 23:10:00.160321 containerd[1565]: time="2025-09-12T23:10:00.160267151Z" level=info msg="connecting to shim c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe" address="unix:///run/containerd/s/114b823ad05f2cc4aefded0c5dfd1bde9bd0b0d2d10385dedfc26e2a449076fe" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:10:00.196818 systemd[1]: Started cri-containerd-c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe.scope - libcontainer container c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe. Sep 12 23:10:00.212724 systemd-resolved[1425]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 23:10:00.220587 containerd[1565]: time="2025-09-12T23:10:00.220488441Z" level=info msg="connecting to shim c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580" address="unix:///run/containerd/s/ccb373f6ade3f94e9fb63fd375458c2aa9cfdf0473f451c0531e3f4179288aaa" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:10:00.249979 systemd[1]: Started cri-containerd-c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580.scope - libcontainer container c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580. 
Sep 12 23:10:00.252585 containerd[1565]: time="2025-09-12T23:10:00.252548630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xn4hp,Uid:908cf082-bf9d-4477-b109-2080b0bf0e32,Namespace:calico-system,Attempt:0,} returns sandbox id \"c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe\"" Sep 12 23:10:00.265543 systemd-resolved[1425]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 23:10:00.295170 containerd[1565]: time="2025-09-12T23:10:00.295114889Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-w8xwj,Uid:ee957723-a32e-4e2c-b9a1-bcecc48992cf,Namespace:kube-system,Attempt:0,} returns sandbox id \"c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580\"" Sep 12 23:10:00.301604 containerd[1565]: time="2025-09-12T23:10:00.301564436Z" level=info msg="CreateContainer within sandbox \"c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 23:10:00.311096 containerd[1565]: time="2025-09-12T23:10:00.311056151Z" level=info msg="Container 7ae9c3428a8da5dcfa1a7a11957b80d51110e9c08345e77d3e1f4f6714b8aef2: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:10:00.321986 containerd[1565]: time="2025-09-12T23:10:00.321924012Z" level=info msg="CreateContainer within sandbox \"c926216e2d28d95a4abdd8a3a4c5065761263fec21d0b922a2666a5ef3d9c580\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7ae9c3428a8da5dcfa1a7a11957b80d51110e9c08345e77d3e1f4f6714b8aef2\"" Sep 12 23:10:00.322703 containerd[1565]: time="2025-09-12T23:10:00.322673512Z" level=info msg="StartContainer for \"7ae9c3428a8da5dcfa1a7a11957b80d51110e9c08345e77d3e1f4f6714b8aef2\"" Sep 12 23:10:00.323535 containerd[1565]: time="2025-09-12T23:10:00.323495354Z" level=info msg="connecting to shim 7ae9c3428a8da5dcfa1a7a11957b80d51110e9c08345e77d3e1f4f6714b8aef2" 
address="unix:///run/containerd/s/ccb373f6ade3f94e9fb63fd375458c2aa9cfdf0473f451c0531e3f4179288aaa" protocol=ttrpc version=3 Sep 12 23:10:00.350795 systemd[1]: Started cri-containerd-7ae9c3428a8da5dcfa1a7a11957b80d51110e9c08345e77d3e1f4f6714b8aef2.scope - libcontainer container 7ae9c3428a8da5dcfa1a7a11957b80d51110e9c08345e77d3e1f4f6714b8aef2. Sep 12 23:10:00.385417 containerd[1565]: time="2025-09-12T23:10:00.385369033Z" level=info msg="StartContainer for \"7ae9c3428a8da5dcfa1a7a11957b80d51110e9c08345e77d3e1f4f6714b8aef2\" returns successfully" Sep 12 23:10:00.655495 kubelet[2735]: I0912 23:10:00.654960 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-w8xwj" podStartSLOduration=60.654943232 podStartE2EDuration="1m0.654943232s" podCreationTimestamp="2025-09-12 23:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:10:00.653850697 +0000 UTC m=+66.949888428" watchObservedRunningTime="2025-09-12 23:10:00.654943232 +0000 UTC m=+66.950980973" Sep 12 23:10:00.673174 systemd-networkd[1450]: calicc98541f198: Gained IPv6LL Sep 12 23:10:00.863915 systemd-networkd[1450]: cali305c0a76bc9: Gained IPv6LL Sep 12 23:10:01.462590 containerd[1565]: time="2025-09-12T23:10:01.462522830Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:10:01.463440 containerd[1565]: time="2025-09-12T23:10:01.463408193Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 12 23:10:01.464372 containerd[1565]: time="2025-09-12T23:10:01.464342860Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:10:01.466364 containerd[1565]: 
time="2025-09-12T23:10:01.466330997Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:10:01.466937 containerd[1565]: time="2025-09-12T23:10:01.466903003Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.868122419s" Sep 12 23:10:01.466937 containerd[1565]: time="2025-09-12T23:10:01.466936177Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 12 23:10:01.467820 containerd[1565]: time="2025-09-12T23:10:01.467780389Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 23:10:01.471856 containerd[1565]: time="2025-09-12T23:10:01.471821627Z" level=info msg="CreateContainer within sandbox \"ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 23:10:01.483750 containerd[1565]: time="2025-09-12T23:10:01.483690155Z" level=info msg="Container 37a31329616f2b8e88bd7cae44a900cce8e2bdbc8321ff8b673125f5afc8c2f5: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:10:01.492861 containerd[1565]: time="2025-09-12T23:10:01.492809656Z" level=info msg="CreateContainer within sandbox \"ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"37a31329616f2b8e88bd7cae44a900cce8e2bdbc8321ff8b673125f5afc8c2f5\"" Sep 12 23:10:01.493422 containerd[1565]: time="2025-09-12T23:10:01.493386531Z" level=info msg="StartContainer for 
\"37a31329616f2b8e88bd7cae44a900cce8e2bdbc8321ff8b673125f5afc8c2f5\"" Sep 12 23:10:01.495117 containerd[1565]: time="2025-09-12T23:10:01.495087942Z" level=info msg="connecting to shim 37a31329616f2b8e88bd7cae44a900cce8e2bdbc8321ff8b673125f5afc8c2f5" address="unix:///run/containerd/s/c4532843062b4f31466b23c03d85f35b2692b50e807a225e11c8683e6e889f94" protocol=ttrpc version=3 Sep 12 23:10:01.520967 systemd[1]: Started cri-containerd-37a31329616f2b8e88bd7cae44a900cce8e2bdbc8321ff8b673125f5afc8c2f5.scope - libcontainer container 37a31329616f2b8e88bd7cae44a900cce8e2bdbc8321ff8b673125f5afc8c2f5. Sep 12 23:10:01.851473 containerd[1565]: time="2025-09-12T23:10:01.851322715Z" level=info msg="StartContainer for \"37a31329616f2b8e88bd7cae44a900cce8e2bdbc8321ff8b673125f5afc8c2f5\" returns successfully" Sep 12 23:10:02.734855 systemd[1]: Started sshd@9-10.0.0.150:22-10.0.0.1:54494.service - OpenSSH per-connection server daemon (10.0.0.1:54494). Sep 12 23:10:02.800185 sshd[5597]: Accepted publickey for core from 10.0.0.1 port 54494 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:02.801716 sshd-session[5597]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:02.806167 systemd-logind[1546]: New session 10 of user core. Sep 12 23:10:02.815800 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 12 23:10:02.945099 sshd[5600]: Connection closed by 10.0.0.1 port 54494 Sep 12 23:10:02.945448 sshd-session[5597]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:02.950321 systemd[1]: sshd@9-10.0.0.150:22-10.0.0.1:54494.service: Deactivated successfully. Sep 12 23:10:02.952399 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 23:10:02.953164 systemd-logind[1546]: Session 10 logged out. Waiting for processes to exit. Sep 12 23:10:02.954471 systemd-logind[1546]: Removed session 10. 
Sep 12 23:10:05.273980 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3775627815.mount: Deactivated successfully. Sep 12 23:10:05.836407 containerd[1565]: time="2025-09-12T23:10:05.836342336Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:10:05.837485 containerd[1565]: time="2025-09-12T23:10:05.837298198Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 23:10:05.838551 containerd[1565]: time="2025-09-12T23:10:05.838505426Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:10:05.840644 containerd[1565]: time="2025-09-12T23:10:05.840604323Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:10:05.841302 containerd[1565]: time="2025-09-12T23:10:05.841269045Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 4.373455351s" Sep 12 23:10:05.841302 containerd[1565]: time="2025-09-12T23:10:05.841297500Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 23:10:05.842224 containerd[1565]: time="2025-09-12T23:10:05.842191113Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 23:10:05.846373 containerd[1565]: 
time="2025-09-12T23:10:05.846334842Z" level=info msg="CreateContainer within sandbox \"c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 23:10:05.855079 containerd[1565]: time="2025-09-12T23:10:05.855033536Z" level=info msg="Container c973c79ffb4acc917c9ead060d91977105f320b27372631040726ed78b6100ee: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:10:05.863922 containerd[1565]: time="2025-09-12T23:10:05.863871808Z" level=info msg="CreateContainer within sandbox \"c7c7c7f67155e1719257a226df9e8648748a2c75927976df6f458b612f1946fe\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"c973c79ffb4acc917c9ead060d91977105f320b27372631040726ed78b6100ee\"" Sep 12 23:10:05.864900 containerd[1565]: time="2025-09-12T23:10:05.864273973Z" level=info msg="StartContainer for \"c973c79ffb4acc917c9ead060d91977105f320b27372631040726ed78b6100ee\"" Sep 12 23:10:05.865991 containerd[1565]: time="2025-09-12T23:10:05.865957609Z" level=info msg="connecting to shim c973c79ffb4acc917c9ead060d91977105f320b27372631040726ed78b6100ee" address="unix:///run/containerd/s/114b823ad05f2cc4aefded0c5dfd1bde9bd0b0d2d10385dedfc26e2a449076fe" protocol=ttrpc version=3 Sep 12 23:10:05.895795 systemd[1]: Started cri-containerd-c973c79ffb4acc917c9ead060d91977105f320b27372631040726ed78b6100ee.scope - libcontainer container c973c79ffb4acc917c9ead060d91977105f320b27372631040726ed78b6100ee. 
Sep 12 23:10:05.950771 containerd[1565]: time="2025-09-12T23:10:05.950718324Z" level=info msg="StartContainer for \"c973c79ffb4acc917c9ead060d91977105f320b27372631040726ed78b6100ee\" returns successfully" Sep 12 23:10:06.878217 kubelet[2735]: I0912 23:10:06.878155 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-xn4hp" podStartSLOduration=49.289987653 podStartE2EDuration="54.878138549s" podCreationTimestamp="2025-09-12 23:09:12 +0000 UTC" firstStartedPulling="2025-09-12 23:10:00.253945314 +0000 UTC m=+66.549983055" lastFinishedPulling="2025-09-12 23:10:05.84209621 +0000 UTC m=+72.138133951" observedRunningTime="2025-09-12 23:10:06.877183298 +0000 UTC m=+73.173221039" watchObservedRunningTime="2025-09-12 23:10:06.878138549 +0000 UTC m=+73.174176291" Sep 12 23:10:06.951086 containerd[1565]: time="2025-09-12T23:10:06.951042709Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c973c79ffb4acc917c9ead060d91977105f320b27372631040726ed78b6100ee\" id:\"425faccaef0ddf2eb0a5ce2609b80cb74e8de6fae16822f9c146c1c722866caf\" pid:5679 exit_status:1 exited_at:{seconds:1757718606 nanos:950694478}" Sep 12 23:10:07.951196 containerd[1565]: time="2025-09-12T23:10:07.951128594Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c973c79ffb4acc917c9ead060d91977105f320b27372631040726ed78b6100ee\" id:\"bc84154cfe1d5d1250c3837c2fcd2b9a64f72136a1c00d39fb9733f1048cfedb\" pid:5709 exit_status:1 exited_at:{seconds:1757718607 nanos:950816443}" Sep 12 23:10:07.959210 systemd[1]: Started sshd@10-10.0.0.150:22-10.0.0.1:54504.service - OpenSSH per-connection server daemon (10.0.0.1:54504). 
Sep 12 23:10:08.509985 containerd[1565]: time="2025-09-12T23:10:08.509934397Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:10:08.510925 containerd[1565]: time="2025-09-12T23:10:08.510802708Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 23:10:08.512039 containerd[1565]: time="2025-09-12T23:10:08.511973731Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:10:08.513916 containerd[1565]: time="2025-09-12T23:10:08.513863116Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:10:08.514413 containerd[1565]: time="2025-09-12T23:10:08.514386193Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.672162296s" Sep 12 23:10:08.514472 containerd[1565]: time="2025-09-12T23:10:08.514416852Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 23:10:08.519971 containerd[1565]: time="2025-09-12T23:10:08.519924469Z" level=info msg="CreateContainer within sandbox \"ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193\" for container 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 23:10:08.537334 sshd[5722]: Accepted publickey for core from 10.0.0.1 port 54504 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:08.539998 sshd-session[5722]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:08.548673 containerd[1565]: time="2025-09-12T23:10:08.547968827Z" level=info msg="Container 47946147c9012c37056d722fb9538bcab1efb3b3a6687409de940649f9abddae: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:10:08.553936 systemd-logind[1546]: New session 11 of user core. Sep 12 23:10:08.560344 containerd[1565]: time="2025-09-12T23:10:08.560307227Z" level=info msg="CreateContainer within sandbox \"ed324466a6f1c59a0241cad90582c5134e03e2b146a5fb790e10eb5cd5eb8193\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"47946147c9012c37056d722fb9538bcab1efb3b3a6687409de940649f9abddae\"" Sep 12 23:10:08.560855 containerd[1565]: time="2025-09-12T23:10:08.560817900Z" level=info msg="StartContainer for \"47946147c9012c37056d722fb9538bcab1efb3b3a6687409de940649f9abddae\"" Sep 12 23:10:08.562295 containerd[1565]: time="2025-09-12T23:10:08.562268862Z" level=info msg="connecting to shim 47946147c9012c37056d722fb9538bcab1efb3b3a6687409de940649f9abddae" address="unix:///run/containerd/s/c4532843062b4f31466b23c03d85f35b2692b50e807a225e11c8683e6e889f94" protocol=ttrpc version=3 Sep 12 23:10:08.563876 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 23:10:08.594788 systemd[1]: Started cri-containerd-47946147c9012c37056d722fb9538bcab1efb3b3a6687409de940649f9abddae.scope - libcontainer container 47946147c9012c37056d722fb9538bcab1efb3b3a6687409de940649f9abddae. 
Sep 12 23:10:08.644575 containerd[1565]: time="2025-09-12T23:10:08.644513165Z" level=info msg="StartContainer for \"47946147c9012c37056d722fb9538bcab1efb3b3a6687409de940649f9abddae\" returns successfully" Sep 12 23:10:08.748162 sshd[5733]: Connection closed by 10.0.0.1 port 54504 Sep 12 23:10:08.748721 sshd-session[5722]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:08.757936 systemd[1]: sshd@10-10.0.0.150:22-10.0.0.1:54504.service: Deactivated successfully. Sep 12 23:10:08.760257 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 23:10:08.761472 systemd-logind[1546]: Session 11 logged out. Waiting for processes to exit. Sep 12 23:10:08.765039 systemd[1]: Started sshd@11-10.0.0.150:22-10.0.0.1:54514.service - OpenSSH per-connection server daemon (10.0.0.1:54514). Sep 12 23:10:08.765726 systemd-logind[1546]: Removed session 11. Sep 12 23:10:08.826934 sshd[5773]: Accepted publickey for core from 10.0.0.1 port 54514 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:08.828456 sshd-session[5773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:08.833551 systemd-logind[1546]: New session 12 of user core. Sep 12 23:10:08.843804 systemd[1]: Started session-12.scope - Session 12 of User core. 
Sep 12 23:10:09.058606 kubelet[2735]: I0912 23:10:09.058502 2735 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 23:10:09.073930 kubelet[2735]: I0912 23:10:09.073882 2735 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 23:10:09.301336 sshd[5776]: Connection closed by 10.0.0.1 port 54514 Sep 12 23:10:09.301679 sshd-session[5773]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:09.311601 systemd[1]: sshd@11-10.0.0.150:22-10.0.0.1:54514.service: Deactivated successfully. Sep 12 23:10:09.315737 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 23:10:09.317707 systemd-logind[1546]: Session 12 logged out. Waiting for processes to exit. Sep 12 23:10:09.320125 systemd-logind[1546]: Removed session 12. Sep 12 23:10:09.321726 systemd[1]: Started sshd@12-10.0.0.150:22-10.0.0.1:54524.service - OpenSSH per-connection server daemon (10.0.0.1:54524). Sep 12 23:10:09.373464 sshd[5788]: Accepted publickey for core from 10.0.0.1 port 54524 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:09.374873 sshd-session[5788]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:09.379189 systemd-logind[1546]: New session 13 of user core. Sep 12 23:10:09.388774 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 23:10:09.558216 sshd[5791]: Connection closed by 10.0.0.1 port 54524 Sep 12 23:10:09.558607 sshd-session[5788]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:09.562781 systemd[1]: sshd@12-10.0.0.150:22-10.0.0.1:54524.service: Deactivated successfully. Sep 12 23:10:09.564932 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 23:10:09.565749 systemd-logind[1546]: Session 13 logged out. 
Waiting for processes to exit. Sep 12 23:10:09.567025 systemd-logind[1546]: Removed session 13. Sep 12 23:10:14.573170 systemd[1]: Started sshd@13-10.0.0.150:22-10.0.0.1:39018.service - OpenSSH per-connection server daemon (10.0.0.1:39018). Sep 12 23:10:14.647287 sshd[5811]: Accepted publickey for core from 10.0.0.1 port 39018 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:14.648691 sshd-session[5811]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:14.653158 systemd-logind[1546]: New session 14 of user core. Sep 12 23:10:14.664810 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 23:10:14.860231 sshd[5814]: Connection closed by 10.0.0.1 port 39018 Sep 12 23:10:14.860517 sshd-session[5811]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:14.865052 systemd[1]: sshd@13-10.0.0.150:22-10.0.0.1:39018.service: Deactivated successfully. Sep 12 23:10:14.867321 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 23:10:14.868212 systemd-logind[1546]: Session 14 logged out. Waiting for processes to exit. Sep 12 23:10:14.869681 systemd-logind[1546]: Removed session 14. 
Sep 12 23:10:15.649230 containerd[1565]: time="2025-09-12T23:10:15.649156742Z" level=info msg="TaskExit event in podsandbox handler container_id:\"830b232da73452e7523f258c2a47e41b0b421b9c2cc3c7a8042bd99e0e17bfaa\" id:\"299e64ab0e9b9f0171a59a361fb78bec8b05ca70473603db4c207e79ce56ac9d\" pid:5838 exit_status:1 exited_at:{seconds:1757718615 nanos:648828372}" Sep 12 23:10:16.965203 containerd[1565]: time="2025-09-12T23:10:16.965133699Z" level=info msg="TaskExit event in podsandbox handler container_id:\"74aa0f40a2a1f1247814a05f7594ebdf55e7b9982ca008f4930c2c8687b369f9\" id:\"94c975e9c7b102e2ae75980dff5f835e6033e67413a4ef9111ddd5eddd020eb4\" pid:5863 exited_at:{seconds:1757718616 nanos:964903087}" Sep 12 23:10:19.154738 kubelet[2735]: I0912 23:10:19.154491 2735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:10:19.195800 kubelet[2735]: I0912 23:10:19.195308 2735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-bm7r8" podStartSLOduration=55.98253853 podStartE2EDuration="1m6.195292682s" podCreationTimestamp="2025-09-12 23:09:13 +0000 UTC" firstStartedPulling="2025-09-12 23:09:58.30236473 +0000 UTC m=+64.598402471" lastFinishedPulling="2025-09-12 23:10:08.515118882 +0000 UTC m=+74.811156623" observedRunningTime="2025-09-12 23:10:09.129888417 +0000 UTC m=+75.425926168" watchObservedRunningTime="2025-09-12 23:10:19.195292682 +0000 UTC m=+85.491330423" Sep 12 23:10:19.873119 systemd[1]: Started sshd@14-10.0.0.150:22-10.0.0.1:39028.service - OpenSSH per-connection server daemon (10.0.0.1:39028). Sep 12 23:10:19.938162 sshd[5876]: Accepted publickey for core from 10.0.0.1 port 39028 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:19.939886 sshd-session[5876]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:19.944249 systemd-logind[1546]: New session 15 of user core. 
Sep 12 23:10:19.951821 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 12 23:10:20.130495 sshd[5881]: Connection closed by 10.0.0.1 port 39028 Sep 12 23:10:20.130822 sshd-session[5876]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:20.135643 systemd[1]: sshd@14-10.0.0.150:22-10.0.0.1:39028.service: Deactivated successfully. Sep 12 23:10:20.137762 systemd[1]: session-15.scope: Deactivated successfully. Sep 12 23:10:20.138611 systemd-logind[1546]: Session 15 logged out. Waiting for processes to exit. Sep 12 23:10:20.140004 systemd-logind[1546]: Removed session 15. Sep 12 23:10:21.377152 kubelet[2735]: I0912 23:10:21.377096 2735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:10:21.484326 containerd[1565]: time="2025-09-12T23:10:21.484264692Z" level=info msg="StopContainer for \"0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7\" with timeout 30 (s)" Sep 12 23:10:21.505910 containerd[1565]: time="2025-09-12T23:10:21.505868136Z" level=info msg="Stop container \"0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7\" with signal terminated" Sep 12 23:10:21.519436 systemd[1]: cri-containerd-0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7.scope: Deactivated successfully. 
Sep 12 23:10:21.532024 containerd[1565]: time="2025-09-12T23:10:21.531968540Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7\" id:\"0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7\" pid:4678 exit_status:1 exited_at:{seconds:1757718621 nanos:520745883}" Sep 12 23:10:21.533925 containerd[1565]: time="2025-09-12T23:10:21.533849202Z" level=info msg="received exit event container_id:\"0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7\" id:\"0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7\" pid:4678 exit_status:1 exited_at:{seconds:1757718621 nanos:520745883}" Sep 12 23:10:21.569589 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7-rootfs.mount: Deactivated successfully. Sep 12 23:10:21.623947 containerd[1565]: time="2025-09-12T23:10:21.623877452Z" level=info msg="StopContainer for \"0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7\" returns successfully" Sep 12 23:10:21.626838 containerd[1565]: time="2025-09-12T23:10:21.626794323Z" level=info msg="StopPodSandbox for \"e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f\"" Sep 12 23:10:21.645799 containerd[1565]: time="2025-09-12T23:10:21.645695615Z" level=info msg="Container to stop \"0b5a95276f3377e2a3ec6bed70dc4e1d754a1095377abe2d83bf57eb9141c9a7\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Sep 12 23:10:21.661096 systemd[1]: cri-containerd-e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f.scope: Deactivated successfully. 
Sep 12 23:10:21.662784 containerd[1565]: time="2025-09-12T23:10:21.662718490Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f\" id:\"e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f\" pid:4288 exit_status:137 exited_at:{seconds:1757718621 nanos:662498200}" Sep 12 23:10:21.697514 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f-rootfs.mount: Deactivated successfully. Sep 12 23:10:21.709850 containerd[1565]: time="2025-09-12T23:10:21.709803085Z" level=info msg="shim disconnected" id=e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f namespace=k8s.io Sep 12 23:10:21.709850 containerd[1565]: time="2025-09-12T23:10:21.709845175Z" level=warning msg="cleaning up after shim disconnected" id=e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f namespace=k8s.io Sep 12 23:10:21.710094 containerd[1565]: time="2025-09-12T23:10:21.709852941Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 12 23:10:21.801802 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f-shm.mount: Deactivated successfully. 
Sep 12 23:10:21.809173 containerd[1565]: time="2025-09-12T23:10:21.809125678Z" level=info msg="received exit event sandbox_id:\"e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f\" exit_status:137 exited_at:{seconds:1757718621 nanos:662498200}" Sep 12 23:10:21.926562 kubelet[2735]: I0912 23:10:21.926428 2735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Sep 12 23:10:22.166779 systemd-networkd[1450]: cali9ea4b44843d: Link DOWN Sep 12 23:10:22.166790 systemd-networkd[1450]: cali9ea4b44843d: Lost carrier Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.164 [INFO][5964] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.165 [INFO][5964] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" iface="eth0" netns="/var/run/netns/cni-051c8279-8b86-5120-b181-1c25e8d19b16" Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.165 [INFO][5964] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" iface="eth0" netns="/var/run/netns/cni-051c8279-8b86-5120-b181-1c25e8d19b16" Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.173 [INFO][5964] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" after=8.343861ms iface="eth0" netns="/var/run/netns/cni-051c8279-8b86-5120-b181-1c25e8d19b16" Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.173 [INFO][5964] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.173 [INFO][5964] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.200 [INFO][5974] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" HandleID="k8s-pod-network.e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Workload="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.200 [INFO][5974] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.200 [INFO][5974] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.250 [INFO][5974] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" HandleID="k8s-pod-network.e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Workload="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.250 [INFO][5974] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" HandleID="k8s-pod-network.e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Workload="localhost-k8s-calico--apiserver--8f4ff4584--49xj7-eth0" Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.252 [INFO][5974] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:10:22.268836 containerd[1565]: 2025-09-12 23:10:22.262 [INFO][5964] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f" Sep 12 23:10:22.273182 systemd[1]: run-netns-cni\x2d051c8279\x2d8b86\x2d5120\x2db181\x2d1c25e8d19b16.mount: Deactivated successfully. 
Sep 12 23:10:22.277221 containerd[1565]: time="2025-09-12T23:10:22.276333821Z" level=info msg="TearDown network for sandbox \"e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f\" successfully" Sep 12 23:10:22.277221 containerd[1565]: time="2025-09-12T23:10:22.276367026Z" level=info msg="StopPodSandbox for \"e8efcdc353793f16e3abf39e8b61a56994858aa723e8e1699efa967b7312735f\" returns successfully" Sep 12 23:10:22.402861 kubelet[2735]: I0912 23:10:22.402794 2735 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx92x\" (UniqueName: \"kubernetes.io/projected/8f0d555c-c909-4b23-ab0e-a3117d51e702-kube-api-access-zx92x\") pod \"8f0d555c-c909-4b23-ab0e-a3117d51e702\" (UID: \"8f0d555c-c909-4b23-ab0e-a3117d51e702\") " Sep 12 23:10:22.402861 kubelet[2735]: I0912 23:10:22.402854 2735 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8f0d555c-c909-4b23-ab0e-a3117d51e702-calico-apiserver-certs\") pod \"8f0d555c-c909-4b23-ab0e-a3117d51e702\" (UID: \"8f0d555c-c909-4b23-ab0e-a3117d51e702\") " Sep 12 23:10:22.412435 kubelet[2735]: I0912 23:10:22.412386 2735 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0d555c-c909-4b23-ab0e-a3117d51e702-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "8f0d555c-c909-4b23-ab0e-a3117d51e702" (UID: "8f0d555c-c909-4b23-ab0e-a3117d51e702"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 23:10:22.414627 systemd[1]: var-lib-kubelet-pods-8f0d555c\x2dc909\x2d4b23\x2dab0e\x2da3117d51e702-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzx92x.mount: Deactivated successfully. 
Sep 12 23:10:22.415897 systemd[1]: var-lib-kubelet-pods-8f0d555c\x2dc909\x2d4b23\x2dab0e\x2da3117d51e702-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Sep 12 23:10:22.417595 kubelet[2735]: I0912 23:10:22.417557 2735 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0d555c-c909-4b23-ab0e-a3117d51e702-kube-api-access-zx92x" (OuterVolumeSpecName: "kube-api-access-zx92x") pod "8f0d555c-c909-4b23-ab0e-a3117d51e702" (UID: "8f0d555c-c909-4b23-ab0e-a3117d51e702"). InnerVolumeSpecName "kube-api-access-zx92x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 23:10:22.504220 kubelet[2735]: I0912 23:10:22.504063 2735 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zx92x\" (UniqueName: \"kubernetes.io/projected/8f0d555c-c909-4b23-ab0e-a3117d51e702-kube-api-access-zx92x\") on node \"localhost\" DevicePath \"\"" Sep 12 23:10:22.504220 kubelet[2735]: I0912 23:10:22.504106 2735 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8f0d555c-c909-4b23-ab0e-a3117d51e702-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" Sep 12 23:10:22.933471 systemd[1]: Removed slice kubepods-besteffort-pod8f0d555c_c909_4b23_ab0e_a3117d51e702.slice - libcontainer container kubepods-besteffort-pod8f0d555c_c909_4b23_ab0e_a3117d51e702.slice. Sep 12 23:10:23.901219 kubelet[2735]: I0912 23:10:23.901166 2735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0d555c-c909-4b23-ab0e-a3117d51e702" path="/var/lib/kubelet/pods/8f0d555c-c909-4b23-ab0e-a3117d51e702/volumes" Sep 12 23:10:25.149271 systemd[1]: Started sshd@15-10.0.0.150:22-10.0.0.1:49842.service - OpenSSH per-connection server daemon (10.0.0.1:49842). 
Sep 12 23:10:25.210761 sshd[5991]: Accepted publickey for core from 10.0.0.1 port 49842 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:25.212554 sshd-session[5991]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:25.217385 systemd-logind[1546]: New session 16 of user core. Sep 12 23:10:25.228825 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 23:10:25.342086 sshd[5994]: Connection closed by 10.0.0.1 port 49842 Sep 12 23:10:25.342488 sshd-session[5991]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:25.346927 systemd[1]: sshd@15-10.0.0.150:22-10.0.0.1:49842.service: Deactivated successfully. Sep 12 23:10:25.348975 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 23:10:25.349856 systemd-logind[1546]: Session 16 logged out. Waiting for processes to exit. Sep 12 23:10:25.351117 systemd-logind[1546]: Removed session 16. Sep 12 23:10:29.695545 containerd[1565]: time="2025-09-12T23:10:29.695124198Z" level=info msg="TaskExit event in podsandbox handler container_id:\"74aa0f40a2a1f1247814a05f7594ebdf55e7b9982ca008f4930c2c8687b369f9\" id:\"face0cae8090e0c8d6a4103cf6f193be292b4ee15252c5e99328e42c3b2e9f21\" pid:6029 exited_at:{seconds:1757718629 nanos:694599299}" Sep 12 23:10:30.359542 systemd[1]: Started sshd@16-10.0.0.150:22-10.0.0.1:46376.service - OpenSSH per-connection server daemon (10.0.0.1:46376). Sep 12 23:10:30.429231 sshd[6041]: Accepted publickey for core from 10.0.0.1 port 46376 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:30.431007 sshd-session[6041]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:30.435835 systemd-logind[1546]: New session 17 of user core. Sep 12 23:10:30.440782 systemd[1]: Started session-17.scope - Session 17 of User core. 
Sep 12 23:10:30.629831 sshd[6044]: Connection closed by 10.0.0.1 port 46376 Sep 12 23:10:30.630171 sshd-session[6041]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:30.642596 systemd[1]: sshd@16-10.0.0.150:22-10.0.0.1:46376.service: Deactivated successfully. Sep 12 23:10:30.645043 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 23:10:30.646326 systemd-logind[1546]: Session 17 logged out. Waiting for processes to exit. Sep 12 23:10:30.649898 systemd[1]: Started sshd@17-10.0.0.150:22-10.0.0.1:46388.service - OpenSSH per-connection server daemon (10.0.0.1:46388). Sep 12 23:10:30.650933 systemd-logind[1546]: Removed session 17. Sep 12 23:10:30.709428 sshd[6060]: Accepted publickey for core from 10.0.0.1 port 46388 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:30.710829 sshd-session[6060]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:30.715347 systemd-logind[1546]: New session 18 of user core. Sep 12 23:10:30.724782 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 12 23:10:31.033207 sshd[6063]: Connection closed by 10.0.0.1 port 46388 Sep 12 23:10:31.033723 sshd-session[6060]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:31.043565 systemd[1]: sshd@17-10.0.0.150:22-10.0.0.1:46388.service: Deactivated successfully. Sep 12 23:10:31.045503 systemd[1]: session-18.scope: Deactivated successfully. Sep 12 23:10:31.046507 systemd-logind[1546]: Session 18 logged out. Waiting for processes to exit. Sep 12 23:10:31.049627 systemd[1]: Started sshd@18-10.0.0.150:22-10.0.0.1:46404.service - OpenSSH per-connection server daemon (10.0.0.1:46404). Sep 12 23:10:31.051030 systemd-logind[1546]: Removed session 18. 
Sep 12 23:10:31.108220 sshd[6075]: Accepted publickey for core from 10.0.0.1 port 46404 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:31.110009 sshd-session[6075]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:31.114847 systemd-logind[1546]: New session 19 of user core. Sep 12 23:10:31.127978 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 23:10:31.780027 sshd[6078]: Connection closed by 10.0.0.1 port 46404 Sep 12 23:10:31.781931 sshd-session[6075]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:31.794249 systemd[1]: sshd@18-10.0.0.150:22-10.0.0.1:46404.service: Deactivated successfully. Sep 12 23:10:31.798841 systemd[1]: session-19.scope: Deactivated successfully. Sep 12 23:10:31.800527 systemd-logind[1546]: Session 19 logged out. Waiting for processes to exit. Sep 12 23:10:31.804140 systemd[1]: Started sshd@19-10.0.0.150:22-10.0.0.1:46410.service - OpenSSH per-connection server daemon (10.0.0.1:46410). Sep 12 23:10:31.807297 systemd-logind[1546]: Removed session 19. Sep 12 23:10:31.871083 sshd[6097]: Accepted publickey for core from 10.0.0.1 port 46410 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:31.872988 sshd-session[6097]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:31.877690 systemd-logind[1546]: New session 20 of user core. Sep 12 23:10:31.889776 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 12 23:10:32.243197 sshd[6101]: Connection closed by 10.0.0.1 port 46410 Sep 12 23:10:32.244937 sshd-session[6097]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:32.252586 systemd[1]: sshd@19-10.0.0.150:22-10.0.0.1:46410.service: Deactivated successfully. Sep 12 23:10:32.254511 systemd[1]: session-20.scope: Deactivated successfully. Sep 12 23:10:32.255363 systemd-logind[1546]: Session 20 logged out. Waiting for processes to exit. 
Sep 12 23:10:32.258976 systemd[1]: Started sshd@20-10.0.0.150:22-10.0.0.1:46422.service - OpenSSH per-connection server daemon (10.0.0.1:46422). Sep 12 23:10:32.260538 systemd-logind[1546]: Removed session 20. Sep 12 23:10:32.316745 sshd[6114]: Accepted publickey for core from 10.0.0.1 port 46422 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:32.318088 sshd-session[6114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:32.322932 systemd-logind[1546]: New session 21 of user core. Sep 12 23:10:32.334802 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 12 23:10:32.569427 sshd[6117]: Connection closed by 10.0.0.1 port 46422 Sep 12 23:10:32.569892 sshd-session[6114]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:32.575451 systemd[1]: sshd@20-10.0.0.150:22-10.0.0.1:46422.service: Deactivated successfully. Sep 12 23:10:32.577677 systemd[1]: session-21.scope: Deactivated successfully. Sep 12 23:10:32.578499 systemd-logind[1546]: Session 21 logged out. Waiting for processes to exit. Sep 12 23:10:32.580025 systemd-logind[1546]: Removed session 21. Sep 12 23:10:34.303958 containerd[1565]: time="2025-09-12T23:10:34.303898803Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c973c79ffb4acc917c9ead060d91977105f320b27372631040726ed78b6100ee\" id:\"cba338bdd09f60252b0e07f5339255e864e025ff80a454ae3808118924b4323f\" pid:6141 exited_at:{seconds:1757718634 nanos:303557635}" Sep 12 23:10:37.585114 systemd[1]: Started sshd@21-10.0.0.150:22-10.0.0.1:46438.service - OpenSSH per-connection server daemon (10.0.0.1:46438). Sep 12 23:10:37.643824 sshd[6155]: Accepted publickey for core from 10.0.0.1 port 46438 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:37.645323 sshd-session[6155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:37.649442 systemd-logind[1546]: New session 22 of user core. 
Sep 12 23:10:37.662780 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 12 23:10:37.769132 sshd[6158]: Connection closed by 10.0.0.1 port 46438 Sep 12 23:10:37.769524 sshd-session[6155]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:37.773806 systemd[1]: sshd@21-10.0.0.150:22-10.0.0.1:46438.service: Deactivated successfully. Sep 12 23:10:37.776021 systemd[1]: session-22.scope: Deactivated successfully. Sep 12 23:10:37.776858 systemd-logind[1546]: Session 22 logged out. Waiting for processes to exit. Sep 12 23:10:37.778688 systemd-logind[1546]: Removed session 22. Sep 12 23:10:37.949291 containerd[1565]: time="2025-09-12T23:10:37.949137330Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c973c79ffb4acc917c9ead060d91977105f320b27372631040726ed78b6100ee\" id:\"90a66b63a0d864d9361a912cf0a4d4544329d548d09ce160d847fd3724403274\" pid:6182 exited_at:{seconds:1757718637 nanos:948722001}" Sep 12 23:10:42.783169 systemd[1]: Started sshd@22-10.0.0.150:22-10.0.0.1:54040.service - OpenSSH per-connection server daemon (10.0.0.1:54040). Sep 12 23:10:42.840921 sshd[6199]: Accepted publickey for core from 10.0.0.1 port 54040 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:42.843110 sshd-session[6199]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:42.848453 systemd-logind[1546]: New session 23 of user core. Sep 12 23:10:42.856982 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 12 23:10:42.971293 sshd[6202]: Connection closed by 10.0.0.1 port 54040 Sep 12 23:10:42.971712 sshd-session[6199]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:42.976623 systemd[1]: sshd@22-10.0.0.150:22-10.0.0.1:54040.service: Deactivated successfully. Sep 12 23:10:42.978715 systemd[1]: session-23.scope: Deactivated successfully. Sep 12 23:10:42.979762 systemd-logind[1546]: Session 23 logged out. Waiting for processes to exit. 
Sep 12 23:10:42.981347 systemd-logind[1546]: Removed session 23. Sep 12 23:10:45.650605 containerd[1565]: time="2025-09-12T23:10:45.650545576Z" level=info msg="TaskExit event in podsandbox handler container_id:\"830b232da73452e7523f258c2a47e41b0b421b9c2cc3c7a8042bd99e0e17bfaa\" id:\"ff389cfd729791b691d39f21ed546bb93abac3cb4850d1e85599ccbd3a013462\" pid:6227 exited_at:{seconds:1757718645 nanos:650157961}" Sep 12 23:10:47.985199 systemd[1]: Started sshd@23-10.0.0.150:22-10.0.0.1:54044.service - OpenSSH per-connection server daemon (10.0.0.1:54044). Sep 12 23:10:48.060319 sshd[6242]: Accepted publickey for core from 10.0.0.1 port 54044 ssh2: RSA SHA256:AJXFPvfa6P0uoKREGLBBCMsQReZl0x2RPvoaq8XPvvc Sep 12 23:10:48.062439 sshd-session[6242]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:10:48.066764 systemd-logind[1546]: New session 24 of user core. Sep 12 23:10:48.074785 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 12 23:10:48.227381 sshd[6245]: Connection closed by 10.0.0.1 port 54044 Sep 12 23:10:48.227776 sshd-session[6242]: pam_unix(sshd:session): session closed for user core Sep 12 23:10:48.232957 systemd[1]: sshd@23-10.0.0.150:22-10.0.0.1:54044.service: Deactivated successfully. Sep 12 23:10:48.235245 systemd[1]: session-24.scope: Deactivated successfully. Sep 12 23:10:48.236274 systemd-logind[1546]: Session 24 logged out. Waiting for processes to exit. Sep 12 23:10:48.237945 systemd-logind[1546]: Removed session 24.