Sep 12 00:25:06.819149 kernel: Linux version 6.12.46-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Sep 11 22:19:36 -00 2025
Sep 12 00:25:06.819178 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=04dc7b04a37dcb996cc4d6074b142179401d5685abf61ddcbaff7d77d0988990
Sep 12 00:25:06.819189 kernel: BIOS-provided physical RAM map:
Sep 12 00:25:06.819196 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Sep 12 00:25:06.819202 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
Sep 12 00:25:06.819209 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Sep 12 00:25:06.819216 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
Sep 12 00:25:06.819222 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Sep 12 00:25:06.819231 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
Sep 12 00:25:06.819237 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Sep 12 00:25:06.819244 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009bd3efff] usable
Sep 12 00:25:06.819250 kernel: BIOS-e820: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
Sep 12 00:25:06.819256 kernel: BIOS-e820: [mem 0x000000009be00000-0x000000009c8ecfff] usable
Sep 12 00:25:06.819263 kernel: BIOS-e820: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
Sep 12 00:25:06.819272 kernel: BIOS-e820: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
Sep 12 00:25:06.819279 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
Sep 12 00:25:06.819286 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009ce90fff] usable
Sep 12 00:25:06.819293 kernel: BIOS-e820: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
Sep 12 00:25:06.819300 kernel: BIOS-e820: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
Sep 12 00:25:06.819307 kernel: BIOS-e820: [mem 0x000000009ce97000-0x000000009cedbfff] usable
Sep 12 00:25:06.819313 kernel: BIOS-e820: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
Sep 12 00:25:06.819320 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
Sep 12 00:25:06.819327 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Sep 12 00:25:06.819334 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 12 00:25:06.819340 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Sep 12 00:25:06.819349 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Sep 12 00:25:06.819356 kernel: NX (Execute Disable) protection: active
Sep 12 00:25:06.819363 kernel: APIC: Static calls initialized
Sep 12 00:25:06.819370 kernel: e820: update [mem 0x9b320018-0x9b329c57] usable ==> usable
Sep 12 00:25:06.819377 kernel: e820: update [mem 0x9b2e3018-0x9b31fe57] usable ==> usable
Sep 12 00:25:06.819383 kernel: extended physical RAM map:
Sep 12 00:25:06.819390 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Sep 12 00:25:06.819397 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
Sep 12 00:25:06.819404 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Sep 12 00:25:06.819411 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
Sep 12 00:25:06.819418 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Sep 12 00:25:06.819426 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
Sep 12 00:25:06.819433 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Sep 12 00:25:06.819440 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b2e3017] usable
Sep 12 00:25:06.819447 kernel: reserve setup_data: [mem 0x000000009b2e3018-0x000000009b31fe57] usable
Sep 12 00:25:06.819457 kernel: reserve setup_data: [mem 0x000000009b31fe58-0x000000009b320017] usable
Sep 12 00:25:06.819464 kernel: reserve setup_data: [mem 0x000000009b320018-0x000000009b329c57] usable
Sep 12 00:25:06.819475 kernel: reserve setup_data: [mem 0x000000009b329c58-0x000000009bd3efff] usable
Sep 12 00:25:06.819484 kernel: reserve setup_data: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
Sep 12 00:25:06.819492 kernel: reserve setup_data: [mem 0x000000009be00000-0x000000009c8ecfff] usable
Sep 12 00:25:06.819501 kernel: reserve setup_data: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
Sep 12 00:25:06.819510 kernel: reserve setup_data: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
Sep 12 00:25:06.819519 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
Sep 12 00:25:06.819528 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009ce90fff] usable
Sep 12 00:25:06.819537 kernel: reserve setup_data: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
Sep 12 00:25:06.819545 kernel: reserve setup_data: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
Sep 12 00:25:06.819557 kernel: reserve setup_data: [mem 0x000000009ce97000-0x000000009cedbfff] usable
Sep 12 00:25:06.819566 kernel: reserve setup_data: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
Sep 12 00:25:06.819574 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
Sep 12 00:25:06.819583 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Sep 12 00:25:06.819592 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 12 00:25:06.819601 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Sep 12 00:25:06.819609 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Sep 12 00:25:06.819618 kernel: efi: EFI v2.7 by EDK II
Sep 12 00:25:06.819626 kernel: efi: SMBIOS=0x9c988000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9b9e4198 RNG=0x9cb73018
Sep 12 00:25:06.819633 kernel: random: crng init done
Sep 12 00:25:06.819641 kernel: efi: Remove mem151: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Sep 12 00:25:06.819648 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Sep 12 00:25:06.819657 kernel: secureboot: Secure boot disabled
Sep 12 00:25:06.819664 kernel: SMBIOS 2.8 present.
Sep 12 00:25:06.819671 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Sep 12 00:25:06.819678 kernel: DMI: Memory slots populated: 1/1
Sep 12 00:25:06.819685 kernel: Hypervisor detected: KVM
Sep 12 00:25:06.819692 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 12 00:25:06.819699 kernel: kvm-clock: using sched offset of 3507564250 cycles
Sep 12 00:25:06.819706 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 12 00:25:06.819713 kernel: tsc: Detected 2794.750 MHz processor
Sep 12 00:25:06.819721 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 12 00:25:06.819728 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 12 00:25:06.819737 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x400000000
Sep 12 00:25:06.819744 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Sep 12 00:25:06.819751 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 12 00:25:06.819758 kernel: Using GB pages for direct mapping
Sep 12 00:25:06.819765 kernel: ACPI: Early table checksum verification disabled
Sep 12 00:25:06.819773 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS )
Sep 12 00:25:06.819780 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Sep 12 00:25:06.819787 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 00:25:06.819794 kernel: ACPI: DSDT 0x000000009CB7A000 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 00:25:06.819803 kernel: ACPI: FACS 0x000000009CBDD000 000040
Sep 12 00:25:06.819810 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 00:25:06.819818 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 00:25:06.819825 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 00:25:06.819832 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 00:25:06.819839 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Sep 12 00:25:06.819847 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3]
Sep 12 00:25:06.819854 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cb7a000-0x9cb7c1b9]
Sep 12 00:25:06.819863 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f]
Sep 12 00:25:06.819870 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f]
Sep 12 00:25:06.819877 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037]
Sep 12 00:25:06.819884 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b]
Sep 12 00:25:06.819891 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027]
Sep 12 00:25:06.819898 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037]
Sep 12 00:25:06.819906 kernel: No NUMA configuration found
Sep 12 00:25:06.819913 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cedbfff]
Sep 12 00:25:06.819920 kernel: NODE_DATA(0) allocated [mem 0x9ce36dc0-0x9ce3dfff]
Sep 12 00:25:06.819927 kernel: Zone ranges:
Sep 12 00:25:06.819937 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 12 00:25:06.819944 kernel: DMA32 [mem 0x0000000001000000-0x000000009cedbfff]
Sep 12 00:25:06.819951 kernel: Normal empty
Sep 12 00:25:06.819958 kernel: Device empty
Sep 12 00:25:06.819965 kernel: Movable zone start for each node
Sep 12 00:25:06.819972 kernel: Early memory node ranges
Sep 12 00:25:06.819979 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Sep 12 00:25:06.819986 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
Sep 12 00:25:06.819993 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
Sep 12 00:25:06.820002 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
Sep 12 00:25:06.820009 kernel: node 0: [mem 0x0000000000900000-0x000000009bd3efff]
Sep 12 00:25:06.820016 kernel: node 0: [mem 0x000000009be00000-0x000000009c8ecfff]
Sep 12 00:25:06.820024 kernel: node 0: [mem 0x000000009cbff000-0x000000009ce90fff]
Sep 12 00:25:06.820031 kernel: node 0: [mem 0x000000009ce97000-0x000000009cedbfff]
Sep 12 00:25:06.820038 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cedbfff]
Sep 12 00:25:06.820045 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 12 00:25:06.820052 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Sep 12 00:25:06.820068 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
Sep 12 00:25:06.820076 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 12 00:25:06.820083 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
Sep 12 00:25:06.820090 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Sep 12 00:25:06.820100 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Sep 12 00:25:06.820107 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Sep 12 00:25:06.820115 kernel: On node 0, zone DMA32: 12580 pages in unavailable ranges
Sep 12 00:25:06.820136 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 12 00:25:06.820143 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 12 00:25:06.820153 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 12 00:25:06.820161 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 12 00:25:06.820175 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 12 00:25:06.820182 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 12 00:25:06.820190 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 12 00:25:06.820197 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 12 00:25:06.820205 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 12 00:25:06.820212 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 12 00:25:06.820219 kernel: TSC deadline timer available
Sep 12 00:25:06.820227 kernel: CPU topo: Max. logical packages: 1
Sep 12 00:25:06.820237 kernel: CPU topo: Max. logical dies: 1
Sep 12 00:25:06.820244 kernel: CPU topo: Max. dies per package: 1
Sep 12 00:25:06.820252 kernel: CPU topo: Max. threads per core: 1
Sep 12 00:25:06.820259 kernel: CPU topo: Num. cores per package: 4
Sep 12 00:25:06.820266 kernel: CPU topo: Num. threads per package: 4
Sep 12 00:25:06.820273 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Sep 12 00:25:06.820281 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 12 00:25:06.820288 kernel: kvm-guest: KVM setup pv remote TLB flush
Sep 12 00:25:06.820296 kernel: kvm-guest: setup PV sched yield
Sep 12 00:25:06.820305 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices
Sep 12 00:25:06.820313 kernel: Booting paravirtualized kernel on KVM
Sep 12 00:25:06.820320 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 12 00:25:06.820328 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Sep 12 00:25:06.820335 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288
Sep 12 00:25:06.820343 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152
Sep 12 00:25:06.820350 kernel: pcpu-alloc: [0] 0 1 2 3
Sep 12 00:25:06.820358 kernel: kvm-guest: PV spinlocks enabled
Sep 12 00:25:06.820365 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 12 00:25:06.820376 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=04dc7b04a37dcb996cc4d6074b142179401d5685abf61ddcbaff7d77d0988990
Sep 12 00:25:06.820384 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 00:25:06.820391 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 12 00:25:06.820399 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 00:25:06.820406 kernel: Fallback order for Node 0: 0
Sep 12 00:25:06.820414 kernel: Built 1 zonelists, mobility grouping on. Total pages: 641450
Sep 12 00:25:06.820421 kernel: Policy zone: DMA32
Sep 12 00:25:06.820428 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 00:25:06.820438 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 12 00:25:06.820445 kernel: ftrace: allocating 40120 entries in 157 pages
Sep 12 00:25:06.820452 kernel: ftrace: allocated 157 pages with 5 groups
Sep 12 00:25:06.820460 kernel: Dynamic Preempt: voluntary
Sep 12 00:25:06.820467 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 00:25:06.820475 kernel: rcu: RCU event tracing is enabled.
Sep 12 00:25:06.820483 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 12 00:25:06.820490 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 00:25:06.820498 kernel: Rude variant of Tasks RCU enabled.
Sep 12 00:25:06.820507 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 00:25:06.820515 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 00:25:06.820522 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 12 00:25:06.820530 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 12 00:25:06.820537 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 12 00:25:06.820545 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 12 00:25:06.820552 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Sep 12 00:25:06.820560 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 00:25:06.820567 kernel: Console: colour dummy device 80x25
Sep 12 00:25:06.820577 kernel: printk: legacy console [ttyS0] enabled
Sep 12 00:25:06.820584 kernel: ACPI: Core revision 20240827
Sep 12 00:25:06.820592 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 12 00:25:06.820599 kernel: APIC: Switch to symmetric I/O mode setup
Sep 12 00:25:06.820607 kernel: x2apic enabled
Sep 12 00:25:06.820614 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 12 00:25:06.820621 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Sep 12 00:25:06.820629 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Sep 12 00:25:06.820636 kernel: kvm-guest: setup PV IPIs
Sep 12 00:25:06.820646 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 12 00:25:06.820654 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848e100549, max_idle_ns: 440795215505 ns
Sep 12 00:25:06.820663 kernel: Calibrating delay loop (skipped) preset value.. 5589.50 BogoMIPS (lpj=2794750)
Sep 12 00:25:06.820672 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 12 00:25:06.820679 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 12 00:25:06.820687 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 12 00:25:06.820697 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 12 00:25:06.820705 kernel: Spectre V2 : Mitigation: Retpolines
Sep 12 00:25:06.820714 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 12 00:25:06.820724 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 12 00:25:06.820731 kernel: active return thunk: retbleed_return_thunk
Sep 12 00:25:06.820739 kernel: RETBleed: Mitigation: untrained return thunk
Sep 12 00:25:06.820746 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 12 00:25:06.820754 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 12 00:25:06.820761 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Sep 12 00:25:06.820769 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Sep 12 00:25:06.820777 kernel: active return thunk: srso_return_thunk
Sep 12 00:25:06.820784 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Sep 12 00:25:06.820795 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 12 00:25:06.820802 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 12 00:25:06.820810 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 12 00:25:06.820817 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 12 00:25:06.820825 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 12 00:25:06.820832 kernel: Freeing SMP alternatives memory: 32K
Sep 12 00:25:06.820839 kernel: pid_max: default: 32768 minimum: 301
Sep 12 00:25:06.820847 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 00:25:06.820856 kernel: landlock: Up and running.
Sep 12 00:25:06.820864 kernel: SELinux: Initializing.
Sep 12 00:25:06.820871 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 00:25:06.820879 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 00:25:06.820887 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 12 00:25:06.820894 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 12 00:25:06.820902 kernel: ... version: 0
Sep 12 00:25:06.820909 kernel: ... bit width: 48
Sep 12 00:25:06.820916 kernel: ... generic registers: 6
Sep 12 00:25:06.820926 kernel: ... value mask: 0000ffffffffffff
Sep 12 00:25:06.820933 kernel: ... max period: 00007fffffffffff
Sep 12 00:25:06.820940 kernel: ... fixed-purpose events: 0
Sep 12 00:25:06.820948 kernel: ... event mask: 000000000000003f
Sep 12 00:25:06.820955 kernel: signal: max sigframe size: 1776
Sep 12 00:25:06.820963 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 00:25:06.820970 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 00:25:06.820978 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 12 00:25:06.820986 kernel: smp: Bringing up secondary CPUs ...
Sep 12 00:25:06.820993 kernel: smpboot: x86: Booting SMP configuration:
Sep 12 00:25:06.821002 kernel: .... node #0, CPUs: #1 #2 #3
Sep 12 00:25:06.821010 kernel: smp: Brought up 1 node, 4 CPUs
Sep 12 00:25:06.821017 kernel: smpboot: Total of 4 processors activated (22358.00 BogoMIPS)
Sep 12 00:25:06.821025 kernel: Memory: 2424720K/2565800K available (14336K kernel code, 2432K rwdata, 9960K rodata, 53836K init, 1080K bss, 135148K reserved, 0K cma-reserved)
Sep 12 00:25:06.821033 kernel: devtmpfs: initialized
Sep 12 00:25:06.821040 kernel: x86/mm: Memory block size: 128MB
Sep 12 00:25:06.821048 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
Sep 12 00:25:06.821055 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
Sep 12 00:25:06.821063 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
Sep 12 00:25:06.821073 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes)
Sep 12 00:25:06.821080 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9ce95000-0x9ce96fff] (8192 bytes)
Sep 12 00:25:06.821088 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes)
Sep 12 00:25:06.821095 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 00:25:06.821103 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 12 00:25:06.821110 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 00:25:06.821130 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 00:25:06.821138 kernel: audit: initializing netlink subsys (disabled)
Sep 12 00:25:06.821159 kernel: audit: type=2000 audit(1757636705.377:1): state=initialized audit_enabled=0 res=1
Sep 12 00:25:06.821172 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 00:25:06.821180 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 12 00:25:06.821187 kernel: cpuidle: using governor menu
Sep 12 00:25:06.821195 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 00:25:06.821202 kernel: dca service started, version 1.12.1
Sep 12 00:25:06.821210 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Sep 12 00:25:06.821218 kernel: PCI: Using configuration type 1 for base access
Sep 12 00:25:06.821225 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 12 00:25:06.821235 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 00:25:06.821242 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 00:25:06.821250 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 00:25:06.821257 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 00:25:06.821265 kernel: ACPI: Added _OSI(Module Device)
Sep 12 00:25:06.821272 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 00:25:06.821280 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 00:25:06.821287 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 00:25:06.821295 kernel: ACPI: Interpreter enabled
Sep 12 00:25:06.821304 kernel: ACPI: PM: (supports S0 S3 S5)
Sep 12 00:25:06.821311 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 12 00:25:06.821319 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 12 00:25:06.821326 kernel: PCI: Using E820 reservations for host bridge windows
Sep 12 00:25:06.821334 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 12 00:25:06.821341 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 12 00:25:06.821509 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 00:25:06.821631 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 12 00:25:06.821749 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 12 00:25:06.821759 kernel: PCI host bridge to bus 0000:00
Sep 12 00:25:06.821876 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 12 00:25:06.821980 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 12 00:25:06.822087 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 12 00:25:06.822222 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window]
Sep 12 00:25:06.822327 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Sep 12 00:25:06.822434 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window]
Sep 12 00:25:06.822539 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 12 00:25:06.822697 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Sep 12 00:25:06.822838 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
Sep 12 00:25:06.822953 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref]
Sep 12 00:25:06.823067 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff]
Sep 12 00:25:06.823225 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Sep 12 00:25:06.823342 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 12 00:25:06.823474 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 12 00:25:06.823590 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f]
Sep 12 00:25:06.823706 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff]
Sep 12 00:25:06.823820 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref]
Sep 12 00:25:06.823946 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Sep 12 00:25:06.824065 kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f]
Sep 12 00:25:06.824205 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff]
Sep 12 00:25:06.824323 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref]
Sep 12 00:25:06.824477 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 12 00:25:06.824673 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff]
Sep 12 00:25:06.824791 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff]
Sep 12 00:25:06.824907 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref]
Sep 12 00:25:06.825025 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]
Sep 12 00:25:06.825163 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Sep 12 00:25:06.825288 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 12 00:25:06.825410 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Sep 12 00:25:06.825525 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df]
Sep 12 00:25:06.825659 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff]
Sep 12 00:25:06.825848 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Sep 12 00:25:06.826002 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf]
Sep 12 00:25:06.826014 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 12 00:25:06.826022 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 12 00:25:06.826029 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 12 00:25:06.826037 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 12 00:25:06.826044 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 12 00:25:06.826052 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 12 00:25:06.826059 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 12 00:25:06.826070 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 12 00:25:06.826078 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 12 00:25:06.826085 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 12 00:25:06.826103 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 12 00:25:06.826110 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 12 00:25:06.826132 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 12 00:25:06.826150 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 12 00:25:06.826158 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 12 00:25:06.826165 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 12 00:25:06.826184 kernel: iommu: Default domain type: Translated
Sep 12 00:25:06.826192 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 12 00:25:06.826199 kernel: efivars: Registered efivars operations
Sep 12 00:25:06.826207 kernel: PCI: Using ACPI for IRQ routing
Sep 12 00:25:06.826214 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 12 00:25:06.826222 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff]
Sep 12 00:25:06.826229 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff]
Sep 12 00:25:06.826236 kernel: e820: reserve RAM buffer [mem 0x9b2e3018-0x9bffffff]
Sep 12 00:25:06.826244 kernel: e820: reserve RAM buffer [mem 0x9b320018-0x9bffffff]
Sep 12 00:25:06.826253 kernel: e820: reserve RAM buffer [mem 0x9bd3f000-0x9bffffff]
Sep 12 00:25:06.826261 kernel: e820: reserve RAM buffer [mem 0x9c8ed000-0x9fffffff]
Sep 12 00:25:06.826268 kernel: e820: reserve RAM buffer [mem 0x9ce91000-0x9fffffff]
Sep 12 00:25:06.826276 kernel: e820: reserve RAM buffer [mem 0x9cedc000-0x9fffffff]
Sep 12 00:25:06.826396 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 12 00:25:06.826519 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 12 00:25:06.826654 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 12 00:25:06.826667 kernel: vgaarb: loaded
Sep 12 00:25:06.826675 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 12 00:25:06.826682 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 12 00:25:06.826690 kernel: clocksource: Switched to clocksource kvm-clock
Sep 12 00:25:06.826697 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 00:25:06.826705 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 00:25:06.826713 kernel: pnp: PnP ACPI init
Sep 12 00:25:06.826848 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved
Sep 12 00:25:06.826862 kernel: pnp: PnP ACPI: found 6 devices
Sep 12 00:25:06.826872 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 12 00:25:06.826880 kernel: NET: Registered PF_INET protocol family
Sep 12 00:25:06.826888 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 00:25:06.826896 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 12 00:25:06.826904 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 00:25:06.826911 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 12 00:25:06.826919 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 12 00:25:06.826927 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 12 00:25:06.826937 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 00:25:06.826945 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 00:25:06.826953 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 00:25:06.826960 kernel: NET: Registered PF_XDP protocol family
Sep 12 00:25:06.827093 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window
Sep 12 00:25:06.827241 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned
Sep 12 00:25:06.827348 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 12 00:25:06.827466 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 12 00:25:06.827585 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 12 00:25:06.827717 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window]
Sep 12 00:25:06.827835 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window]
Sep 12 00:25:06.827942 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window]
Sep 12 00:25:06.827953 kernel: PCI: CLS 0 bytes, default 64
Sep 12 00:25:06.827961 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848e100549, max_idle_ns: 440795215505 ns
Sep 12 00:25:06.827969 kernel: Initialise system trusted keyrings
Sep 12 00:25:06.827980 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 12 00:25:06.827988 kernel: Key type asymmetric registered
Sep 12 00:25:06.827996 kernel: Asymmetric key parser 'x509' registered
Sep 12 00:25:06.828004 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 12 00:25:06.828012 kernel: io scheduler mq-deadline registered
Sep 12 00:25:06.828019 kernel: io scheduler kyber registered
Sep 12 00:25:06.828027 kernel: io scheduler bfq registered
Sep 12 00:25:06.828035 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 12 00:25:06.828045 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 12 00:25:06.828054 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Sep 12 00:25:06.828062 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Sep 12 00:25:06.828069 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 12 00:25:06.828079 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 12 00:25:06.828087 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 12 00:25:06.828095 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 12 00:25:06.828103 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 12 00:25:06.828111 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Sep 12 00:25:06.828285 kernel: rtc_cmos 00:04: RTC can
wake from S4 Sep 12 00:25:06.828449 kernel: rtc_cmos 00:04: registered as rtc0 Sep 12 00:25:06.828561 kernel: rtc_cmos 00:04: setting system clock to 2025-09-12T00:25:06 UTC (1757636706) Sep 12 00:25:06.828669 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram Sep 12 00:25:06.828680 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Sep 12 00:25:06.828688 kernel: efifb: probing for efifb Sep 12 00:25:06.828696 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k Sep 12 00:25:06.828707 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Sep 12 00:25:06.828715 kernel: efifb: scrolling: redraw Sep 12 00:25:06.828723 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Sep 12 00:25:06.828731 kernel: Console: switching to colour frame buffer device 160x50 Sep 12 00:25:06.828738 kernel: fb0: EFI VGA frame buffer device Sep 12 00:25:06.828746 kernel: pstore: Using crash dump compression: deflate Sep 12 00:25:06.828754 kernel: pstore: Registered efi_pstore as persistent store backend Sep 12 00:25:06.828762 kernel: NET: Registered PF_INET6 protocol family Sep 12 00:25:06.828770 kernel: Segment Routing with IPv6 Sep 12 00:25:06.828778 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 00:25:06.828787 kernel: NET: Registered PF_PACKET protocol family Sep 12 00:25:06.828795 kernel: Key type dns_resolver registered Sep 12 00:25:06.828803 kernel: IPI shorthand broadcast: enabled Sep 12 00:25:06.828811 kernel: sched_clock: Marking stable (2720002890, 154708779)->(2893160345, -18448676) Sep 12 00:25:06.828819 kernel: registered taskstats version 1 Sep 12 00:25:06.828826 kernel: Loading compiled-in X.509 certificates Sep 12 00:25:06.828834 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.46-flatcar: 652e453facea91af3a07ba1d2bcc346a615f1cf9' Sep 12 00:25:06.828843 kernel: Demotion targets for Node 0: null Sep 12 00:25:06.828850 kernel: Key type .fscrypt registered Sep 12 00:25:06.828860 kernel: Key type 
fscrypt-provisioning registered Sep 12 00:25:06.828868 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 12 00:25:06.828875 kernel: ima: Allocated hash algorithm: sha1 Sep 12 00:25:06.828883 kernel: ima: No architecture policies found Sep 12 00:25:06.828891 kernel: clk: Disabling unused clocks Sep 12 00:25:06.828898 kernel: Warning: unable to open an initial console. Sep 12 00:25:06.828907 kernel: Freeing unused kernel image (initmem) memory: 53836K Sep 12 00:25:06.828914 kernel: Write protecting the kernel read-only data: 24576k Sep 12 00:25:06.828924 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Sep 12 00:25:06.828932 kernel: Run /init as init process Sep 12 00:25:06.828940 kernel: with arguments: Sep 12 00:25:06.828948 kernel: /init Sep 12 00:25:06.828955 kernel: with environment: Sep 12 00:25:06.828963 kernel: HOME=/ Sep 12 00:25:06.828970 kernel: TERM=linux Sep 12 00:25:06.828978 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 00:25:06.828986 systemd[1]: Successfully made /usr/ read-only. Sep 12 00:25:06.828999 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 00:25:06.829008 systemd[1]: Detected virtualization kvm. Sep 12 00:25:06.829016 systemd[1]: Detected architecture x86-64. Sep 12 00:25:06.829024 systemd[1]: Running in initrd. Sep 12 00:25:06.829033 systemd[1]: No hostname configured, using default hostname. Sep 12 00:25:06.829041 systemd[1]: Hostname set to . Sep 12 00:25:06.829049 systemd[1]: Initializing machine ID from VM UUID. Sep 12 00:25:06.829059 systemd[1]: Queued start job for default target initrd.target. 
Sep 12 00:25:06.829068 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 00:25:06.829076 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 00:25:06.829085 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 00:25:06.829094 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 00:25:06.829102 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 00:25:06.829111 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 00:25:06.829142 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 00:25:06.829151 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 00:25:06.829159 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 00:25:06.829174 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 00:25:06.829183 systemd[1]: Reached target paths.target - Path Units. Sep 12 00:25:06.829191 systemd[1]: Reached target slices.target - Slice Units. Sep 12 00:25:06.829200 systemd[1]: Reached target swap.target - Swaps. Sep 12 00:25:06.829208 systemd[1]: Reached target timers.target - Timer Units. Sep 12 00:25:06.829216 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 00:25:06.829227 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 00:25:06.829235 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 00:25:06.829244 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. 
Sep 12 00:25:06.829261 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 00:25:06.829277 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 00:25:06.829286 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 00:25:06.829295 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 00:25:06.829303 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 00:25:06.829314 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 00:25:06.829322 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 12 00:25:06.829331 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 00:25:06.829339 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 00:25:06.829348 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 00:25:06.829356 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 00:25:06.829365 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 00:25:06.829373 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 00:25:06.829384 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 00:25:06.829393 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 00:25:06.829401 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 00:25:06.829430 systemd-journald[219]: Collecting audit messages is disabled. Sep 12 00:25:06.829452 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 00:25:06.829461 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Sep 12 00:25:06.829470 systemd-journald[219]: Journal started Sep 12 00:25:06.829491 systemd-journald[219]: Runtime Journal (/run/log/journal/77007f5b0fc6478eb04ea4e6c72f191b) is 6M, max 48.5M, 42.4M free. Sep 12 00:25:06.827039 systemd-modules-load[221]: Inserted module 'overlay' Sep 12 00:25:06.835285 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 00:25:06.835667 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 00:25:06.843882 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 00:25:06.846315 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 00:25:06.857143 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 12 00:25:06.858391 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 00:25:06.862155 kernel: Bridge firewalling registered Sep 12 00:25:06.861475 systemd-modules-load[221]: Inserted module 'br_netfilter' Sep 12 00:25:06.862575 systemd-tmpfiles[241]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 00:25:06.862851 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 00:25:06.864765 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 00:25:06.867665 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 00:25:06.871890 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 00:25:06.874565 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 12 00:25:06.878189 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 00:25:06.880763 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Sep 12 00:25:06.903798 dracut-cmdline[258]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=04dc7b04a37dcb996cc4d6074b142179401d5685abf61ddcbaff7d77d0988990 Sep 12 00:25:06.922001 systemd-resolved[261]: Positive Trust Anchors: Sep 12 00:25:06.922017 systemd-resolved[261]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 00:25:06.922046 systemd-resolved[261]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 00:25:06.924521 systemd-resolved[261]: Defaulting to hostname 'linux'. Sep 12 00:25:06.925525 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 00:25:06.931517 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 00:25:07.017157 kernel: SCSI subsystem initialized Sep 12 00:25:07.026298 kernel: Loading iSCSI transport class v2.0-870. Sep 12 00:25:07.036146 kernel: iscsi: registered transport (tcp) Sep 12 00:25:07.058172 kernel: iscsi: registered transport (qla4xxx) Sep 12 00:25:07.058254 kernel: QLogic iSCSI HBA Driver Sep 12 00:25:07.078654 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
Sep 12 00:25:07.097476 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 00:25:07.101145 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 00:25:07.150180 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 00:25:07.153551 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 12 00:25:07.214146 kernel: raid6: avx2x4 gen() 29605 MB/s Sep 12 00:25:07.231143 kernel: raid6: avx2x2 gen() 30291 MB/s Sep 12 00:25:07.248192 kernel: raid6: avx2x1 gen() 25372 MB/s Sep 12 00:25:07.248213 kernel: raid6: using algorithm avx2x2 gen() 30291 MB/s Sep 12 00:25:07.266218 kernel: raid6: .... xor() 19670 MB/s, rmw enabled Sep 12 00:25:07.266242 kernel: raid6: using avx2x2 recovery algorithm Sep 12 00:25:07.287141 kernel: xor: automatically using best checksumming function avx Sep 12 00:25:07.452148 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 00:25:07.460929 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 00:25:07.463610 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 00:25:07.498100 systemd-udevd[472]: Using default interface naming scheme 'v255'. Sep 12 00:25:07.504364 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 00:25:07.506346 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 00:25:07.526952 dracut-pre-trigger[474]: rd.md=0: removing MD RAID activation Sep 12 00:25:07.556168 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 00:25:07.557490 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 00:25:07.627362 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 00:25:07.632383 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Sep 12 00:25:07.663164 kernel: cryptd: max_cpu_qlen set to 1000 Sep 12 00:25:07.670143 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Sep 12 00:25:07.678657 kernel: AES CTR mode by8 optimization enabled Sep 12 00:25:07.678674 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 12 00:25:07.685036 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 12 00:25:07.685053 kernel: GPT:9289727 != 19775487 Sep 12 00:25:07.685066 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 12 00:25:07.685079 kernel: GPT:9289727 != 19775487 Sep 12 00:25:07.685091 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 12 00:25:07.685104 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 00:25:07.696189 kernel: libata version 3.00 loaded. Sep 12 00:25:07.704173 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 00:25:07.704300 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 00:25:07.708820 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 00:25:07.712484 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 12 00:25:07.727138 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 12 00:25:07.727176 kernel: ahci 0000:00:1f.2: version 3.0 Sep 12 00:25:07.727370 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 12 00:25:07.731136 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 12 00:25:07.731315 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 12 00:25:07.731484 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 12 00:25:07.745174 kernel: scsi host0: ahci Sep 12 00:25:07.745417 kernel: scsi host1: ahci Sep 12 00:25:07.746414 kernel: scsi host2: ahci Sep 12 00:25:07.747437 kernel: scsi host3: ahci Sep 12 00:25:07.750207 kernel: scsi host4: ahci Sep 12 00:25:07.750350 kernel: scsi host5: ahci Sep 12 00:25:07.750490 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 1 Sep 12 00:25:07.750502 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 1 Sep 12 00:25:07.750512 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 1 Sep 12 00:25:07.751554 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 1 Sep 12 00:25:07.751573 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 1 Sep 12 00:25:07.751610 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 12 00:25:07.755365 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 1 Sep 12 00:25:07.757007 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 00:25:07.768506 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 12 00:25:07.777829 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. 
Sep 12 00:25:07.780415 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 12 00:25:07.804558 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 12 00:25:07.808714 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 00:25:07.810947 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 00:25:07.810998 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 00:25:07.814163 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 00:25:07.833715 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 00:25:07.836162 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 00:25:07.851360 disk-uuid[635]: Primary Header is updated. Sep 12 00:25:07.851360 disk-uuid[635]: Secondary Entries is updated. Sep 12 00:25:07.851360 disk-uuid[635]: Secondary Header is updated. Sep 12 00:25:07.855136 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 00:25:07.860150 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 00:25:07.872465 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 12 00:25:08.061147 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 12 00:25:08.061205 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 12 00:25:08.061217 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 12 00:25:08.061227 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 12 00:25:08.062152 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 12 00:25:08.063150 kernel: ata3.00: LPM support broken, forcing max_power Sep 12 00:25:08.063173 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 12 00:25:08.063716 kernel: ata3.00: applying bridge limits Sep 12 00:25:08.065159 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 12 00:25:08.065174 kernel: ata3.00: LPM support broken, forcing max_power Sep 12 00:25:08.065566 kernel: ata3.00: configured for UDMA/100 Sep 12 00:25:08.068166 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 12 00:25:08.111625 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 12 00:25:08.111818 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 12 00:25:08.128168 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 12 00:25:08.474113 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 00:25:08.476605 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 00:25:08.476689 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 00:25:08.478942 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 00:25:08.483484 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 00:25:08.507561 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 00:25:08.861731 disk-uuid[636]: The operation has completed successfully. Sep 12 00:25:08.863050 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 12 00:25:08.886590 systemd[1]: disk-uuid.service: Deactivated successfully. 
Sep 12 00:25:08.886700 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 00:25:08.925626 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 00:25:08.950999 sh[670]: Success Sep 12 00:25:08.968146 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 00:25:08.968177 kernel: device-mapper: uevent: version 1.0.3 Sep 12 00:25:08.968190 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 00:25:08.978179 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 12 00:25:09.008465 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 00:25:09.010404 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 00:25:09.025924 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 00:25:09.032904 kernel: BTRFS: device fsid e375903e-484e-4702-81f7-5fa3109f1a1c devid 1 transid 36 /dev/mapper/usr (253:0) scanned by mount (682) Sep 12 00:25:09.032931 kernel: BTRFS info (device dm-0): first mount of filesystem e375903e-484e-4702-81f7-5fa3109f1a1c Sep 12 00:25:09.032941 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 12 00:25:09.038754 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 00:25:09.038807 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 00:25:09.039711 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 00:25:09.041796 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 00:25:09.044036 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 00:25:09.046514 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Sep 12 00:25:09.048485 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 00:25:09.072497 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (715) Sep 12 00:25:09.072520 kernel: BTRFS info (device vda6): first mount of filesystem 8d27ac3b-8168-4900-a00e-1d8b6b830700 Sep 12 00:25:09.072532 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 00:25:09.076150 kernel: BTRFS info (device vda6): turning on async discard Sep 12 00:25:09.076199 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 00:25:09.081146 kernel: BTRFS info (device vda6): last unmount of filesystem 8d27ac3b-8168-4900-a00e-1d8b6b830700 Sep 12 00:25:09.081597 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 00:25:09.083654 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 00:25:09.164500 ignition[756]: Ignition 2.21.0 Sep 12 00:25:09.164514 ignition[756]: Stage: fetch-offline Sep 12 00:25:09.164553 ignition[756]: no configs at "/usr/lib/ignition/base.d" Sep 12 00:25:09.164562 ignition[756]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 00:25:09.164672 ignition[756]: parsed url from cmdline: "" Sep 12 00:25:09.164677 ignition[756]: no config URL provided Sep 12 00:25:09.164682 ignition[756]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 00:25:09.164691 ignition[756]: no config at "/usr/lib/ignition/user.ign" Sep 12 00:25:09.164716 ignition[756]: op(1): [started] loading QEMU firmware config module Sep 12 00:25:09.164721 ignition[756]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 12 00:25:09.178258 ignition[756]: op(1): [finished] loading QEMU firmware config module Sep 12 00:25:09.187197 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Sep 12 00:25:09.190755 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 00:25:09.221386 ignition[756]: parsing config with SHA512: 1ef8e9afc1d7e58418ba1d5d87f0924e62803db5bdf8cc2416750f997e9f5a993ca4ae4beadb8fa288408681086906bf2fbc289e4f286322343318da0d82b54d Sep 12 00:25:09.224571 unknown[756]: fetched base config from "system" Sep 12 00:25:09.224584 unknown[756]: fetched user config from "qemu" Sep 12 00:25:09.224894 ignition[756]: fetch-offline: fetch-offline passed Sep 12 00:25:09.224944 ignition[756]: Ignition finished successfully Sep 12 00:25:09.228424 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 00:25:09.237744 systemd-networkd[860]: lo: Link UP Sep 12 00:25:09.237755 systemd-networkd[860]: lo: Gained carrier Sep 12 00:25:09.239283 systemd-networkd[860]: Enumeration completed Sep 12 00:25:09.239381 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 00:25:09.239625 systemd-networkd[860]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 00:25:09.239630 systemd-networkd[860]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 00:25:09.240520 systemd-networkd[860]: eth0: Link UP Sep 12 00:25:09.240704 systemd-networkd[860]: eth0: Gained carrier Sep 12 00:25:09.240712 systemd-networkd[860]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 00:25:09.241612 systemd[1]: Reached target network.target - Network. Sep 12 00:25:09.245716 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 12 00:25:09.248433 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Sep 12 00:25:09.262195 systemd-networkd[860]: eth0: DHCPv4 address 10.0.0.151/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 12 00:25:09.284935 ignition[864]: Ignition 2.21.0 Sep 12 00:25:09.284947 ignition[864]: Stage: kargs Sep 12 00:25:09.285076 ignition[864]: no configs at "/usr/lib/ignition/base.d" Sep 12 00:25:09.285086 ignition[864]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 00:25:09.289482 ignition[864]: kargs: kargs passed Sep 12 00:25:09.289544 ignition[864]: Ignition finished successfully Sep 12 00:25:09.295188 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 00:25:09.297967 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 00:25:09.336963 ignition[873]: Ignition 2.21.0 Sep 12 00:25:09.336975 ignition[873]: Stage: disks Sep 12 00:25:09.337097 ignition[873]: no configs at "/usr/lib/ignition/base.d" Sep 12 00:25:09.337116 ignition[873]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 00:25:09.338824 ignition[873]: disks: disks passed Sep 12 00:25:09.338869 ignition[873]: Ignition finished successfully Sep 12 00:25:09.341688 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 00:25:09.343016 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 00:25:09.344836 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 00:25:09.346994 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 00:25:09.349075 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 00:25:09.351004 systemd[1]: Reached target basic.target - Basic System. Sep 12 00:25:09.352990 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 00:25:09.389052 systemd-fsck[883]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 12 00:25:09.397940 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. 
Sep 12 00:25:09.401950 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 00:25:09.508144 kernel: EXT4-fs (vda9): mounted filesystem c7fbf20f-7fc7-47c1-8781-0f8569841f1e r/w with ordered data mode. Quota mode: none. Sep 12 00:25:09.508237 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 00:25:09.508857 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 00:25:09.510226 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 00:25:09.512431 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 00:25:09.513815 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 12 00:25:09.513854 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 00:25:09.513877 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 00:25:09.530028 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 00:25:09.532233 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 00:25:09.536727 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (891) Sep 12 00:25:09.536753 kernel: BTRFS info (device vda6): first mount of filesystem 8d27ac3b-8168-4900-a00e-1d8b6b830700 Sep 12 00:25:09.536763 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 00:25:09.540760 kernel: BTRFS info (device vda6): turning on async discard Sep 12 00:25:09.540786 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 00:25:09.543051 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 12 00:25:09.567669 initrd-setup-root[916]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 00:25:09.572319 initrd-setup-root[923]: cut: /sysroot/etc/group: No such file or directory Sep 12 00:25:09.577354 initrd-setup-root[930]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 00:25:09.581970 initrd-setup-root[937]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 00:25:09.663983 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 00:25:09.667502 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 00:25:09.670042 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 00:25:09.689165 kernel: BTRFS info (device vda6): last unmount of filesystem 8d27ac3b-8168-4900-a00e-1d8b6b830700 Sep 12 00:25:09.700383 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 12 00:25:09.715839 ignition[1006]: INFO : Ignition 2.21.0 Sep 12 00:25:09.715839 ignition[1006]: INFO : Stage: mount Sep 12 00:25:09.715839 ignition[1006]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 00:25:09.715839 ignition[1006]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 00:25:09.719753 ignition[1006]: INFO : mount: mount passed Sep 12 00:25:09.719753 ignition[1006]: INFO : Ignition finished successfully Sep 12 00:25:09.720013 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 00:25:09.722526 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 00:25:10.031985 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 00:25:10.033630 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Sep 12 00:25:10.063153 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1018) Sep 12 00:25:10.065143 kernel: BTRFS info (device vda6): first mount of filesystem 8d27ac3b-8168-4900-a00e-1d8b6b830700 Sep 12 00:25:10.065167 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 00:25:10.068165 kernel: BTRFS info (device vda6): turning on async discard Sep 12 00:25:10.068185 kernel: BTRFS info (device vda6): enabling free space tree Sep 12 00:25:10.070011 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 00:25:10.105392 ignition[1036]: INFO : Ignition 2.21.0 Sep 12 00:25:10.105392 ignition[1036]: INFO : Stage: files Sep 12 00:25:10.107512 ignition[1036]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 00:25:10.107512 ignition[1036]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 00:25:10.107512 ignition[1036]: DEBUG : files: compiled without relabeling support, skipping Sep 12 00:25:10.111659 ignition[1036]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 00:25:10.111659 ignition[1036]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 00:25:10.111659 ignition[1036]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 00:25:10.111659 ignition[1036]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 00:25:10.111659 ignition[1036]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 00:25:10.110147 unknown[1036]: wrote ssh authorized keys file for user: core Sep 12 00:25:10.119558 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 12 00:25:10.119558 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Sep 
12 00:25:10.164502 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 00:25:10.300475 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 12 00:25:10.300475 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 00:25:10.304338 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 12 00:25:10.304338 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 00:25:10.304338 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 00:25:10.304338 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 00:25:10.304338 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 00:25:10.304338 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 00:25:10.304338 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 00:25:10.316778 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 00:25:10.316778 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 00:25:10.316778 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> 
"/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 00:25:10.316778 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 00:25:10.316778 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 00:25:10.316778 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Sep 12 00:25:10.709283 systemd-networkd[860]: eth0: Gained IPv6LL Sep 12 00:25:10.782568 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 00:25:11.259744 ignition[1036]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 00:25:11.259744 ignition[1036]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 00:25:11.263887 ignition[1036]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 00:25:11.266403 ignition[1036]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 00:25:11.266403 ignition[1036]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 00:25:11.266403 ignition[1036]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 12 00:25:11.271147 ignition[1036]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 12 00:25:11.271147 ignition[1036]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at 
"/sysroot/etc/systemd/system/coreos-metadata.service" Sep 12 00:25:11.271147 ignition[1036]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 12 00:25:11.271147 ignition[1036]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Sep 12 00:25:11.289508 ignition[1036]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 12 00:25:11.294550 ignition[1036]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 12 00:25:11.296266 ignition[1036]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Sep 12 00:25:11.296266 ignition[1036]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Sep 12 00:25:11.299038 ignition[1036]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 00:25:11.299038 ignition[1036]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 00:25:11.299038 ignition[1036]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 00:25:11.299038 ignition[1036]: INFO : files: files passed Sep 12 00:25:11.299038 ignition[1036]: INFO : Ignition finished successfully Sep 12 00:25:11.308082 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 00:25:11.310285 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 00:25:11.313216 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 00:25:11.324342 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 00:25:11.324496 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
Sep 12 00:25:11.328503 initrd-setup-root-after-ignition[1065]: grep: /sysroot/oem/oem-release: No such file or directory Sep 12 00:25:11.332259 initrd-setup-root-after-ignition[1067]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 00:25:11.333879 initrd-setup-root-after-ignition[1067]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 00:25:11.335874 initrd-setup-root-after-ignition[1071]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 00:25:11.338700 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 00:25:11.340101 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 00:25:11.343043 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 00:25:11.394273 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 00:25:11.394427 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 00:25:11.397723 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 00:25:11.399637 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 00:25:11.399754 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 00:25:11.401644 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 00:25:11.427910 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 00:25:11.431587 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 00:25:11.454324 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 00:25:11.454480 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 00:25:11.457732 systemd[1]: Stopped target timers.target - Timer Units. 
Sep 12 00:25:11.458854 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 00:25:11.458960 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 00:25:11.463438 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 00:25:11.463564 systemd[1]: Stopped target basic.target - Basic System. Sep 12 00:25:11.466184 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 00:25:11.466992 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 00:25:11.467493 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 00:25:11.470895 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 00:25:11.471373 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 00:25:11.474750 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 00:25:11.475079 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 00:25:11.479769 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 00:25:11.481691 systemd[1]: Stopped target swap.target - Swaps. Sep 12 00:25:11.482680 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 00:25:11.482783 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 00:25:11.485940 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 00:25:11.486403 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 00:25:11.486681 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 00:25:11.490833 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 00:25:11.491088 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 00:25:11.491203 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. 
Sep 12 00:25:11.491885 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 00:25:11.491986 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 00:25:11.496807 systemd[1]: Stopped target paths.target - Path Units. Sep 12 00:25:11.497032 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 00:25:11.504189 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 00:25:11.504367 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 00:25:11.506912 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 00:25:11.507412 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 00:25:11.507506 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 00:25:11.511135 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 00:25:11.511216 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 00:25:11.511972 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 00:25:11.512094 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 00:25:11.513780 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 00:25:11.513880 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 00:25:11.517513 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 00:25:11.518659 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 00:25:11.518766 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 00:25:11.519757 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 00:25:11.524076 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 00:25:11.524209 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. 
Sep 12 00:25:11.525112 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 00:25:11.525229 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 00:25:11.533156 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 00:25:11.541291 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 00:25:11.554691 ignition[1091]: INFO : Ignition 2.21.0 Sep 12 00:25:11.554691 ignition[1091]: INFO : Stage: umount Sep 12 00:25:11.556611 ignition[1091]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 00:25:11.556611 ignition[1091]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 12 00:25:11.558893 ignition[1091]: INFO : umount: umount passed Sep 12 00:25:11.558893 ignition[1091]: INFO : Ignition finished successfully Sep 12 00:25:11.559627 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 00:25:11.559756 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 00:25:11.562722 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 00:25:11.563184 systemd[1]: Stopped target network.target - Network. Sep 12 00:25:11.564498 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 00:25:11.564557 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 00:25:11.566377 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 00:25:11.566420 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 00:25:11.569486 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 00:25:11.569547 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 00:25:11.570920 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 00:25:11.570964 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 00:25:11.573101 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Sep 12 00:25:11.575059 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 00:25:11.584032 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 00:25:11.584199 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 00:25:11.588849 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 00:25:11.589067 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 00:25:11.589268 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 00:25:11.592563 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 00:25:11.593186 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 00:25:11.595916 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 00:25:11.595961 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 00:25:11.597921 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 00:25:11.598864 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 00:25:11.598913 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 00:25:11.606105 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 00:25:11.606181 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 00:25:11.609103 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 00:25:11.609168 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 00:25:11.609389 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 00:25:11.609429 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 00:25:11.614028 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Sep 12 00:25:11.616314 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 00:25:11.616389 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 00:25:11.623801 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 00:25:11.623922 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 00:25:11.636865 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 00:25:11.637040 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 00:25:11.638104 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 00:25:11.638167 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 00:25:11.640254 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 00:25:11.640290 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 00:25:11.640544 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 00:25:11.640588 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 00:25:11.641335 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 00:25:11.641380 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 00:25:11.642005 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 00:25:11.642058 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 00:25:11.652068 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 00:25:11.653883 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 00:25:11.653953 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 00:25:11.656498 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. 
Sep 12 00:25:11.656544 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 00:25:11.659789 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 00:25:11.659841 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 00:25:11.664505 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 12 00:25:11.664564 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 12 00:25:11.664612 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 00:25:11.692330 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 00:25:11.692472 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 00:25:11.743902 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 00:25:11.744136 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 00:25:11.747900 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 00:25:11.749317 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 00:25:11.749421 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 00:25:11.755431 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 00:25:11.789761 systemd[1]: Switching root. Sep 12 00:25:11.838777 systemd-journald[219]: Journal stopped Sep 12 00:25:13.022969 systemd-journald[219]: Received SIGTERM from PID 1 (systemd). 
Sep 12 00:25:13.023042 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 00:25:13.023056 kernel: SELinux: policy capability open_perms=1 Sep 12 00:25:13.023067 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 00:25:13.023078 kernel: SELinux: policy capability always_check_network=0 Sep 12 00:25:13.023089 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 00:25:13.023104 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 00:25:13.023131 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 00:25:13.023142 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 00:25:13.023153 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 00:25:13.023164 kernel: audit: type=1403 audit(1757636712.268:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 00:25:13.023176 systemd[1]: Successfully loaded SELinux policy in 55.177ms. Sep 12 00:25:13.023200 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 12.182ms. Sep 12 00:25:13.023213 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 00:25:13.023225 systemd[1]: Detected virtualization kvm. Sep 12 00:25:13.023240 systemd[1]: Detected architecture x86-64. Sep 12 00:25:13.023252 systemd[1]: Detected first boot. Sep 12 00:25:13.023265 systemd[1]: Initializing machine ID from VM UUID. Sep 12 00:25:13.023277 zram_generator::config[1135]: No configuration found. 
Sep 12 00:25:13.023295 kernel: Guest personality initialized and is inactive Sep 12 00:25:13.023306 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 12 00:25:13.023318 kernel: Initialized host personality Sep 12 00:25:13.023329 kernel: NET: Registered PF_VSOCK protocol family Sep 12 00:25:13.023342 systemd[1]: Populated /etc with preset unit settings. Sep 12 00:25:13.023355 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 00:25:13.023368 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 00:25:13.023379 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 00:25:13.023391 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 00:25:13.023404 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 00:25:13.023417 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 00:25:13.023428 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 00:25:13.023440 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 00:25:13.023459 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 00:25:13.023472 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 00:25:13.023484 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 00:25:13.023496 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 00:25:13.023508 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 00:25:13.023520 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 00:25:13.023533 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
Sep 12 00:25:13.023545 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 00:25:13.023557 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 00:25:13.023572 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 00:25:13.023584 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 12 00:25:13.023596 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 00:25:13.023608 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 00:25:13.023620 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 00:25:13.023632 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 00:25:13.023644 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 00:25:13.023658 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 00:25:13.023670 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 00:25:13.023682 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 00:25:13.023694 systemd[1]: Reached target slices.target - Slice Units. Sep 12 00:25:13.023706 systemd[1]: Reached target swap.target - Swaps. Sep 12 00:25:13.023718 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 00:25:13.023730 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 00:25:13.023742 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 00:25:13.023754 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 00:25:13.023766 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 00:25:13.023780 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Sep 12 00:25:13.023794 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 00:25:13.023806 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 00:25:13.023818 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 00:25:13.023830 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 00:25:13.023842 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 00:25:13.023854 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 00:25:13.023866 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 00:25:13.023880 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 00:25:13.023893 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 00:25:13.023905 systemd[1]: Reached target machines.target - Containers. Sep 12 00:25:13.023917 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 12 00:25:13.023929 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 00:25:13.023941 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 00:25:13.023953 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 00:25:13.023965 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 00:25:13.023977 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 00:25:13.023992 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 00:25:13.024004 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... 
Sep 12 00:25:13.024025 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 00:25:13.024041 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 00:25:13.024053 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 00:25:13.024067 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 00:25:13.024079 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 00:25:13.024092 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 00:25:13.024106 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 00:25:13.024142 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 00:25:13.024155 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 00:25:13.024168 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 00:25:13.024179 kernel: fuse: init (API version 7.41) Sep 12 00:25:13.024191 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 00:25:13.024203 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 12 00:25:13.024217 kernel: loop: module loaded Sep 12 00:25:13.024229 kernel: ACPI: bus type drm_connector registered Sep 12 00:25:13.024240 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 00:25:13.024255 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 00:25:13.024269 systemd[1]: Stopped verity-setup.service. 
Sep 12 00:25:13.024281 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 00:25:13.024293 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 00:25:13.024360 systemd-journald[1210]: Collecting audit messages is disabled. Sep 12 00:25:13.024385 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 00:25:13.024397 systemd-journald[1210]: Journal started Sep 12 00:25:13.024422 systemd-journald[1210]: Runtime Journal (/run/log/journal/77007f5b0fc6478eb04ea4e6c72f191b) is 6M, max 48.5M, 42.4M free. Sep 12 00:25:12.785547 systemd[1]: Queued start job for default target multi-user.target. Sep 12 00:25:12.809990 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 12 00:25:12.810463 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 00:25:13.026143 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 00:25:13.027836 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 00:25:13.028939 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 00:25:13.030183 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 00:25:13.031450 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 00:25:13.032700 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 00:25:13.034253 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 00:25:13.035818 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 00:25:13.036057 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 00:25:13.037518 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 00:25:13.037728 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
Sep 12 00:25:13.039146 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 00:25:13.039353 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 00:25:13.040679 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 00:25:13.040883 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 00:25:13.042356 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 12 00:25:13.042570 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 12 00:25:13.043902 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 00:25:13.044137 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 00:25:13.045606 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 00:25:13.047075 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 00:25:13.048602 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 12 00:25:13.050099 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 12 00:25:13.063115 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 00:25:13.065585 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 12 00:25:13.067611 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 12 00:25:13.068768 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 12 00:25:13.068822 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 00:25:13.070868 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 12 00:25:13.078229 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 12 00:25:13.079570 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 00:25:13.081144 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 12 00:25:13.083154 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 12 00:25:13.084378 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 00:25:13.085244 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 12 00:25:13.086373 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 00:25:13.093075 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 00:25:13.096611 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 12 00:25:13.098867 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 12 00:25:13.101633 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 12 00:25:13.103005 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 12 00:25:13.107729 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 12 00:25:13.110318 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 00:25:13.112669 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 12 00:25:13.113384 kernel: loop0: detected capacity change from 0 to 146240
Sep 12 00:25:13.114056 systemd-journald[1210]: Time spent on flushing to /var/log/journal/77007f5b0fc6478eb04ea4e6c72f191b is 19.294ms for 1079 entries.
Sep 12 00:25:13.114056 systemd-journald[1210]: System Journal (/var/log/journal/77007f5b0fc6478eb04ea4e6c72f191b) is 8M, max 195.6M, 187.6M free.
Sep 12 00:25:13.151677 systemd-journald[1210]: Received client request to flush runtime journal.
Sep 12 00:25:13.151745 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 12 00:25:13.118660 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 12 00:25:13.130192 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 00:25:13.155293 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 12 00:25:13.157583 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 12 00:25:13.163248 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 00:25:13.166936 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 12 00:25:13.170143 kernel: loop1: detected capacity change from 0 to 224512
Sep 12 00:25:13.194644 systemd-tmpfiles[1271]: ACLs are not supported, ignoring.
Sep 12 00:25:13.194981 systemd-tmpfiles[1271]: ACLs are not supported, ignoring.
Sep 12 00:25:13.197145 kernel: loop2: detected capacity change from 0 to 113872
Sep 12 00:25:13.200832 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 00:25:13.228145 kernel: loop3: detected capacity change from 0 to 146240
Sep 12 00:25:13.241150 kernel: loop4: detected capacity change from 0 to 224512
Sep 12 00:25:13.251153 kernel: loop5: detected capacity change from 0 to 113872
Sep 12 00:25:13.258764 (sd-merge)[1278]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Sep 12 00:25:13.259338 (sd-merge)[1278]: Merged extensions into '/usr'.
Sep 12 00:25:13.264466 systemd[1]: Reload requested from client PID 1254 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 12 00:25:13.264479 systemd[1]: Reloading...
Sep 12 00:25:13.323148 zram_generator::config[1304]: No configuration found.
Sep 12 00:25:13.422429 ldconfig[1249]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 12 00:25:13.426549 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 12 00:25:13.507545 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 12 00:25:13.507705 systemd[1]: Reloading finished in 242 ms.
Sep 12 00:25:13.540243 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 12 00:25:13.541821 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 12 00:25:13.557473 systemd[1]: Starting ensure-sysext.service...
Sep 12 00:25:13.559317 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 00:25:13.573852 systemd[1]: Reload requested from client PID 1341 ('systemctl') (unit ensure-sysext.service)...
Sep 12 00:25:13.573865 systemd[1]: Reloading...
Sep 12 00:25:13.581662 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 12 00:25:13.581832 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 12 00:25:13.582161 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 12 00:25:13.582425 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 12 00:25:13.583315 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 12 00:25:13.583580 systemd-tmpfiles[1342]: ACLs are not supported, ignoring.
Sep 12 00:25:13.583647 systemd-tmpfiles[1342]: ACLs are not supported, ignoring.
Sep 12 00:25:13.588912 systemd-tmpfiles[1342]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 00:25:13.588925 systemd-tmpfiles[1342]: Skipping /boot
Sep 12 00:25:13.601017 systemd-tmpfiles[1342]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 00:25:13.601030 systemd-tmpfiles[1342]: Skipping /boot
Sep 12 00:25:13.628152 zram_generator::config[1369]: No configuration found.
Sep 12 00:25:13.726876 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 12 00:25:13.838766 systemd[1]: Reloading finished in 264 ms.
Sep 12 00:25:13.863241 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 12 00:25:13.887152 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 00:25:13.898184 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 00:25:13.901411 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 12 00:25:13.913034 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 12 00:25:13.916913 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 00:25:13.921017 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 00:25:13.924583 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 12 00:25:13.929051 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:25:13.929231 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 00:25:13.931867 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 00:25:13.935261 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 00:25:13.938305 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 00:25:13.939542 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 00:25:13.939729 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 00:25:13.942943 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 12 00:25:13.944043 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:25:13.945245 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 00:25:13.945460 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 00:25:13.947695 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 00:25:13.947917 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 00:25:13.952365 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 00:25:13.958206 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 00:25:13.962768 systemd-udevd[1415]: Using default interface naming scheme 'v255'.
Sep 12 00:25:13.966737 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 12 00:25:13.971038 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 12 00:25:13.975105 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:25:13.975442 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 00:25:13.978908 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 00:25:13.981212 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 00:25:13.984013 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 00:25:13.985220 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 00:25:13.985471 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 00:25:13.988302 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 12 00:25:13.989490 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:25:13.993505 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:25:13.993766 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 00:25:13.993818 augenrules[1446]: No rules
Sep 12 00:25:13.997329 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 00:25:13.998450 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 00:25:13.998561 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 00:25:13.998691 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 00:25:13.999734 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 00:25:13.999986 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 00:25:14.001866 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 12 00:25:14.003760 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 00:25:14.003966 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 00:25:14.005476 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 00:25:14.007194 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 00:25:14.007409 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 00:25:14.009028 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 00:25:14.009362 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 00:25:14.010883 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 12 00:25:14.023205 systemd[1]: Finished ensure-sysext.service.
Sep 12 00:25:14.024403 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 00:25:14.024621 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 00:25:14.028018 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 12 00:25:14.046344 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 00:25:14.047415 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 00:25:14.047471 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 00:25:14.050360 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 12 00:25:14.051452 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 12 00:25:14.100073 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 12 00:25:14.144948 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 12 00:25:14.147548 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 12 00:25:14.170141 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Sep 12 00:25:14.170606 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 12 00:25:14.175143 kernel: mousedev: PS/2 mouse device common for all mice
Sep 12 00:25:14.181138 kernel: ACPI: button: Power Button [PWRF]
Sep 12 00:25:14.202135 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device
Sep 12 00:25:14.204342 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Sep 12 00:25:14.204501 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 12 00:25:14.221506 systemd-networkd[1493]: lo: Link UP
Sep 12 00:25:14.221520 systemd-networkd[1493]: lo: Gained carrier
Sep 12 00:25:14.223170 systemd-networkd[1493]: Enumeration completed
Sep 12 00:25:14.223264 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 00:25:14.223731 systemd-networkd[1493]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 00:25:14.223744 systemd-networkd[1493]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 00:25:14.224415 systemd-networkd[1493]: eth0: Link UP
Sep 12 00:25:14.224613 systemd-networkd[1493]: eth0: Gained carrier
Sep 12 00:25:14.224634 systemd-networkd[1493]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 00:25:14.227233 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 12 00:25:14.230288 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 12 00:25:14.235183 systemd-networkd[1493]: eth0: DHCPv4 address 10.0.0.151/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 12 00:25:14.247398 systemd-resolved[1411]: Positive Trust Anchors:
Sep 12 00:25:14.247415 systemd-resolved[1411]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 00:25:14.247446 systemd-resolved[1411]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 00:25:14.250887 systemd-resolved[1411]: Defaulting to hostname 'linux'.
Sep 12 00:25:14.252761 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 00:25:14.254636 systemd[1]: Reached target network.target - Network.
Sep 12 00:25:14.255553 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 00:25:14.263404 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 12 00:25:14.266710 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 12 00:25:15.904818 systemd-resolved[1411]: Clock change detected. Flushing caches.
Sep 12 00:25:15.904898 systemd-timesyncd[1494]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Sep 12 00:25:15.904945 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 00:25:15.904946 systemd-timesyncd[1494]: Initial clock synchronization to Fri 2025-09-12 00:25:15.904780 UTC.
Sep 12 00:25:15.906097 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 12 00:25:15.907491 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 12 00:25:15.908836 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 12 00:25:15.909963 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 12 00:25:15.911185 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 12 00:25:15.911215 systemd[1]: Reached target paths.target - Path Units.
Sep 12 00:25:15.912137 systemd[1]: Reached target time-set.target - System Time Set.
Sep 12 00:25:15.913316 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 12 00:25:15.914470 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 12 00:25:15.915724 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 00:25:15.917658 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 12 00:25:15.920120 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 12 00:25:15.925657 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 12 00:25:15.928362 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 12 00:25:15.929627 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 12 00:25:15.943554 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 12 00:25:15.944993 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 12 00:25:15.947045 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 12 00:25:15.949024 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 00:25:15.950371 systemd[1]: Reached target basic.target - Basic System.
Sep 12 00:25:15.951348 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 12 00:25:15.951370 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 12 00:25:15.952539 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 12 00:25:15.954883 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 12 00:25:15.956884 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 12 00:25:15.962201 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 12 00:25:15.965679 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 12 00:25:15.967740 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 12 00:25:15.968976 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 12 00:25:15.970073 jq[1532]: false
Sep 12 00:25:15.971937 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 12 00:25:15.974852 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 12 00:25:15.979857 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 12 00:25:15.984215 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 12 00:25:15.989366 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 12 00:25:15.991414 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 12 00:25:15.991878 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 12 00:25:15.994109 systemd[1]: Starting update-engine.service - Update Engine...
Sep 12 00:25:15.996937 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 12 00:25:16.003360 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 12 00:25:16.004871 google_oslogin_nss_cache[1534]: oslogin_cache_refresh[1534]: Refreshing passwd entry cache
Sep 12 00:25:16.004879 oslogin_cache_refresh[1534]: Refreshing passwd entry cache
Sep 12 00:25:16.004952 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 12 00:25:16.005749 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 12 00:25:16.006077 systemd[1]: motdgen.service: Deactivated successfully.
Sep 12 00:25:16.006310 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 12 00:25:16.010760 extend-filesystems[1533]: Found /dev/vda6
Sep 12 00:25:16.009210 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 12 00:25:16.018961 jq[1551]: true
Sep 12 00:25:16.019051 google_oslogin_nss_cache[1534]: oslogin_cache_refresh[1534]: Failure getting users, quitting
Sep 12 00:25:16.019051 google_oslogin_nss_cache[1534]: oslogin_cache_refresh[1534]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 12 00:25:16.019051 google_oslogin_nss_cache[1534]: oslogin_cache_refresh[1534]: Refreshing group entry cache
Sep 12 00:25:16.016864 oslogin_cache_refresh[1534]: Failure getting users, quitting
Sep 12 00:25:16.009448 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 12 00:25:16.016884 oslogin_cache_refresh[1534]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 12 00:25:16.016934 oslogin_cache_refresh[1534]: Refreshing group entry cache
Sep 12 00:25:16.021002 extend-filesystems[1533]: Found /dev/vda9
Sep 12 00:25:16.024823 extend-filesystems[1533]: Checking size of /dev/vda9
Sep 12 00:25:16.025199 (ntainerd)[1555]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 12 00:25:16.036527 google_oslogin_nss_cache[1534]: oslogin_cache_refresh[1534]: Failure getting groups, quitting
Sep 12 00:25:16.036527 google_oslogin_nss_cache[1534]: oslogin_cache_refresh[1534]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 12 00:25:16.036327 oslogin_cache_refresh[1534]: Failure getting groups, quitting
Sep 12 00:25:16.036361 oslogin_cache_refresh[1534]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 12 00:25:16.038830 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Sep 12 00:25:16.040815 jq[1556]: true
Sep 12 00:25:16.039085 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Sep 12 00:25:16.047776 extend-filesystems[1533]: Resized partition /dev/vda9
Sep 12 00:25:16.057822 extend-filesystems[1575]: resize2fs 1.47.2 (1-Jan-2025)
Sep 12 00:25:16.050949 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 00:25:16.064725 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Sep 12 00:25:16.067125 tar[1553]: linux-amd64/LICENSE
Sep 12 00:25:16.067125 tar[1553]: linux-amd64/helm
Sep 12 00:25:16.069335 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 00:25:16.070082 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 00:25:16.071096 update_engine[1547]: I20250912 00:25:16.071026 1547 main.cc:92] Flatcar Update Engine starting
Sep 12 00:25:16.076018 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 00:25:16.083606 kernel: kvm_amd: TSC scaling supported
Sep 12 00:25:16.083639 kernel: kvm_amd: Nested Virtualization enabled
Sep 12 00:25:16.083652 kernel: kvm_amd: Nested Paging enabled
Sep 12 00:25:16.083664 kernel: kvm_amd: LBR virtualization supported
Sep 12 00:25:16.083690 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Sep 12 00:25:16.084185 kernel: kvm_amd: Virtual GIF supported
Sep 12 00:25:16.085669 dbus-daemon[1530]: [system] SELinux support is enabled
Sep 12 00:25:16.085864 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 12 00:25:16.091069 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 12 00:25:16.124812 update_engine[1547]: I20250912 00:25:16.093233 1547 update_check_scheduler.cc:74] Next update check in 5m37s
Sep 12 00:25:16.091095 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 12 00:25:16.092623 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 12 00:25:16.092642 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 12 00:25:16.096074 systemd[1]: Started update-engine.service - Update Engine.
Sep 12 00:25:16.101600 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 12 00:25:16.128816 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Sep 12 00:25:16.151731 extend-filesystems[1575]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Sep 12 00:25:16.151731 extend-filesystems[1575]: old_desc_blocks = 1, new_desc_blocks = 1
Sep 12 00:25:16.151731 extend-filesystems[1575]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Sep 12 00:25:16.150087 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 12 00:25:16.152221 extend-filesystems[1533]: Resized filesystem in /dev/vda9
Sep 12 00:25:16.150354 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 12 00:25:16.168682 bash[1595]: Updated "/home/core/.ssh/authorized_keys"
Sep 12 00:25:16.172720 kernel: EDAC MC: Ver: 3.0.0
Sep 12 00:25:16.225805 locksmithd[1589]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 12 00:25:16.239393 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 12 00:25:16.246791 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 00:25:16.260201 containerd[1555]: time="2025-09-12T00:25:16Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 12 00:25:16.264081 containerd[1555]: time="2025-09-12T00:25:16.262817968Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
Sep 12 00:25:16.264858 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 12 00:25:16.267747 systemd-logind[1546]: Watching system buttons on /dev/input/event2 (Power Button)
Sep 12 00:25:16.267771 systemd-logind[1546]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 12 00:25:16.268229 systemd-logind[1546]: New seat seat0.
Sep 12 00:25:16.270512 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 12 00:25:16.277727 containerd[1555]: time="2025-09-12T00:25:16.277655919Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.416µs"
Sep 12 00:25:16.277727 containerd[1555]: time="2025-09-12T00:25:16.277692087Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 12 00:25:16.277727 containerd[1555]: time="2025-09-12T00:25:16.277723336Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 12 00:25:16.277931 containerd[1555]: time="2025-09-12T00:25:16.277904205Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 12 00:25:16.277931 containerd[1555]: time="2025-09-12T00:25:16.277923611Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 12 00:25:16.277993 containerd[1555]: time="2025-09-12T00:25:16.277947616Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 12 00:25:16.278027 containerd[1555]: time="2025-09-12T00:25:16.278009943Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 12 00:25:16.278027 containerd[1555]: time="2025-09-12T00:25:16.278023198Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 00:25:16.278306 containerd[1555]: time="2025-09-12T00:25:16.278277795Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 00:25:16.278306 containerd[1555]: time="2025-09-12T00:25:16.278295248Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 12 00:25:16.278352 containerd[1555]: time="2025-09-12T00:25:16.278306599Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 12 00:25:16.278352 containerd[1555]: time="2025-09-12T00:25:16.278315386Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 12 00:25:16.278415 containerd[1555]: time="2025-09-12T00:25:16.278397951Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 12 00:25:16.278645 containerd[1555]: time="2025-09-12T00:25:16.278616571Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 12 00:25:16.278668 containerd[1555]: time="2025-09-12T00:25:16.278650013Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 12 00:25:16.278689 containerd[1555]: time="2025-09-12T00:25:16.278665833Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 12 00:25:16.279122 containerd[1555]: time="2025-09-12T00:25:16.279098685Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 12 00:25:16.279487 containerd[1555]: time="2025-09-12T00:25:16.279453420Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 12 00:25:16.279594 containerd[1555]: time="2025-09-12T00:25:16.279566251Z" level=info msg="metadata content store policy set" policy=shared
Sep 12 00:25:16.284716 containerd[1555]: time="2025-09-12T00:25:16.284676132Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 12 00:25:16.284760 containerd[1555]: time="2025-09-12T00:25:16.284735253Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 12 00:25:16.284760 containerd[1555]: time="2025-09-12T00:25:16.284751684Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 12 00:25:16.284810 containerd[1555]: time="2025-09-12T00:25:16.284764177Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 12 00:25:16.284810 containerd[1555]: time="2025-09-12T00:25:16.284777202Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 12 00:25:16.284810 containerd[1555]: time="2025-09-12T00:25:16.284787461Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 12 00:25:16.284810 containerd[1555]: time="2025-09-12T00:25:16.284800596Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 12 00:25:16.284879 containerd[1555]: time="2025-09-12T00:25:16.284812728Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 12 00:25:16.284879 containerd[1555]: time="2025-09-12T00:25:16.284824100Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 12 00:25:16.284879 containerd[1555]: time="2025-09-12T00:25:16.284834119Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 12 00:25:16.284879 containerd[1555]: time="2025-09-12T00:25:16.284842955Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 12 00:25:16.284879 containerd[1555]: time="2025-09-12T00:25:16.284855238Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 12 00:25:16.284971 containerd[1555]: time="2025-09-12T00:25:16.284960976Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 12 00:25:16.284992 containerd[1555]: time="2025-09-12T00:25:16.284979221Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 12 00:25:16.285012 containerd[1555]: time="2025-09-12T00:25:16.284992165Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 12 00:25:16.285012 containerd[1555]: time="2025-09-12T00:25:16.285008756Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 12 00:25:16.285053 containerd[1555]: time="2025-09-12T00:25:16.285020087Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 12 00:25:16.285053 containerd[1555]: time="2025-09-12T00:25:16.285031409Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 12 00:25:16.285053 containerd[1555]: time="2025-09-12T00:25:16.285042559Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 12 00:25:16.285053 containerd[1555]: time="2025-09-12T00:25:16.285052718Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 12 00:25:16.285132 containerd[1555]: time="2025-09-12T00:25:16.285074379Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 12 00:25:16.285132 containerd[1555]: time="2025-09-12T00:25:16.285086061Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 12 00:25:16.285132 containerd[1555]: time="2025-09-12T00:25:16.285096100Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 12 00:25:16.285185 containerd[1555]: time="2025-09-12T00:25:16.285153888Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 12 00:25:16.285185 containerd[1555]: time="2025-09-12T00:25:16.285166862Z" level=info msg="Start snapshots syncer"
Sep 12 00:25:16.285222 containerd[1555]: time="2025-09-12T00:25:16.285189675Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 12 00:25:16.285444 containerd[1555]: time="2025-09-12T00:25:16.285398747Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 12 00:25:16.285536 containerd[1555]: time="2025-09-12T00:25:16.285447980Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 12 00:25:16.285536 containerd[1555]: time="2025-09-12T00:25:16.285517831Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 12 00:25:16.285639 containerd[1555]: time="2025-09-12T00:25:16.285617568Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 12 00:25:16.285660 containerd[1555]: time="2025-09-12T00:25:16.285639769Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 12 00:25:16.285660 containerd[1555]: time="2025-09-12T00:25:16.285655148Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 12 00:25:16.285718 containerd[1555]: time="2025-09-12T00:25:16.285664646Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 12 00:25:16.285718 containerd[1555]: time="2025-09-12T00:25:16.285676879Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 12 00:25:16.285718 containerd[1555]: time="2025-09-12T00:25:16.285687459Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 12 00:25:16.285805 containerd[1555]: time="2025-09-12T00:25:16.285785072Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 12 00:25:16.285827 containerd[1555]: time="2025-09-12T00:25:16.285810119Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 12 00:25:16.285867 containerd[1555]: time="2025-09-12T00:25:16.285848901Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 12 00:25:16.285867 containerd[1555]: time="2025-09-12T00:25:16.285863739Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 12 00:25:16.285916 containerd[1555]: time="2025-09-12T00:25:16.285903864Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 12 00:25:16.285937 containerd[1555]: time="2025-09-12T00:25:16.285917360Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 12 00:25:16.285937 containerd[1555]: time="2025-09-12T00:25:16.285927409Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 12 00:25:16.285977 containerd[1555]: time="2025-09-12T00:25:16.285938459Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 12 00:25:16.286055 containerd[1555]: time="2025-09-12T00:25:16.286035040Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 12 00:25:16.286077 containerd[1555]: time="2025-09-12T00:25:16.286054928Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 12 00:25:16.286077 containerd[1555]: time="2025-09-12T00:25:16.286065247Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 12 00:25:16.286118 containerd[1555]: time="2025-09-12T00:25:16.286081788Z" level=info msg="runtime interface created"
Sep 12 00:25:16.286118 containerd[1555]: time="2025-09-12T00:25:16.286087489Z" level=info msg="created NRI interface"
Sep 12 00:25:16.286118 containerd[1555]: time="2025-09-12T00:25:16.286095424Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 12 00:25:16.286118 containerd[1555]: time="2025-09-12T00:25:16.286105953Z" level=info msg="Connect containerd service"
Sep 12 00:25:16.286187 containerd[1555]: time="2025-09-12T00:25:16.286125420Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 12 00:25:16.288357 containerd[1555]: time="2025-09-12T00:25:16.288322399Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 12 00:25:16.350051 sshd_keygen[1578]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 12 00:25:16.376354 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 12 00:25:16.380607 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 12 00:25:16.381591 containerd[1555]: time="2025-09-12T00:25:16.381368120Z" level=info msg="Start subscribing containerd event"
Sep 12 00:25:16.381591 containerd[1555]: time="2025-09-12T00:25:16.381530364Z" level=info msg="Start recovering state"
Sep 12 00:25:16.381775 containerd[1555]: time="2025-09-12T00:25:16.381747551Z" level=info msg="Start event monitor"
Sep 12 00:25:16.381934 containerd[1555]: time="2025-09-12T00:25:16.381911539Z" level=info msg="Start cni network conf syncer for default"
Sep 12 00:25:16.381965 containerd[1555]: time="2025-09-12T00:25:16.381937167Z" level=info msg="Start streaming server"
Sep 12 00:25:16.381965 containerd[1555]: time="2025-09-12T00:25:16.381947015Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 12 00:25:16.381965 containerd[1555]: time="2025-09-12T00:25:16.381955651Z" level=info msg="runtime interface starting up..."
Sep 12 00:25:16.381965 containerd[1555]: time="2025-09-12T00:25:16.381961953Z" level=info msg="starting plugins..."
Sep 12 00:25:16.382067 containerd[1555]: time="2025-09-12T00:25:16.381978334Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 12 00:25:16.382094 containerd[1555]: time="2025-09-12T00:25:16.381890740Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 12 00:25:16.382150 containerd[1555]: time="2025-09-12T00:25:16.382125159Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 12 00:25:16.382322 systemd[1]: Started containerd.service - containerd container runtime.
Sep 12 00:25:16.382607 containerd[1555]: time="2025-09-12T00:25:16.382579732Z" level=info msg="containerd successfully booted in 0.122943s"
Sep 12 00:25:16.406415 systemd[1]: issuegen.service: Deactivated successfully.
Sep 12 00:25:16.406750 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 12 00:25:16.409519 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 12 00:25:16.438791 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 12 00:25:16.441774 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 12 00:25:16.443950 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Sep 12 00:25:16.445206 systemd[1]: Reached target getty.target - Login Prompts.
Sep 12 00:25:16.605839 tar[1553]: linux-amd64/README.md
Sep 12 00:25:16.635969 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 12 00:25:17.785924 systemd-networkd[1493]: eth0: Gained IPv6LL
Sep 12 00:25:17.788784 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 12 00:25:17.790568 systemd[1]: Reached target network-online.target - Network is Online.
Sep 12 00:25:17.793102 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
Sep 12 00:25:17.795302 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 00:25:17.814229 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 12 00:25:17.832370 systemd[1]: coreos-metadata.service: Deactivated successfully.
Sep 12 00:25:17.832954 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
Sep 12 00:25:17.834538 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 12 00:25:17.836987 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 12 00:25:18.501983 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:25:18.503578 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 12 00:25:18.504851 systemd[1]: Startup finished in 2.777s (kernel) + 5.637s (initrd) + 4.653s (userspace) = 13.069s.
Sep 12 00:25:18.512015 (kubelet)[1675]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 00:25:18.906936 kubelet[1675]: E0912 00:25:18.906830 1675 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 00:25:18.910972 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 00:25:18.911178 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 00:25:18.911563 systemd[1]: kubelet.service: Consumed 961ms CPU time, 263.3M memory peak.
Sep 12 00:25:20.965620 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 12 00:25:20.966925 systemd[1]: Started sshd@0-10.0.0.151:22-10.0.0.1:42582.service - OpenSSH per-connection server daemon (10.0.0.1:42582).
Sep 12 00:25:21.038586 sshd[1688]: Accepted publickey for core from 10.0.0.1 port 42582 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:25:21.040409 sshd-session[1688]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:25:21.046625 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 12 00:25:21.047724 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 12 00:25:21.053712 systemd-logind[1546]: New session 1 of user core.
Sep 12 00:25:21.076347 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 12 00:25:21.079190 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 12 00:25:21.099908 (systemd)[1692]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 12 00:25:21.102066 systemd-logind[1546]: New session c1 of user core.
Sep 12 00:25:21.244107 systemd[1692]: Queued start job for default target default.target.
Sep 12 00:25:21.250856 systemd[1692]: Created slice app.slice - User Application Slice.
Sep 12 00:25:21.250879 systemd[1692]: Reached target paths.target - Paths.
Sep 12 00:25:21.250914 systemd[1692]: Reached target timers.target - Timers.
Sep 12 00:25:21.252255 systemd[1692]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 12 00:25:21.262203 systemd[1692]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 12 00:25:21.262258 systemd[1692]: Reached target sockets.target - Sockets.
Sep 12 00:25:21.262292 systemd[1692]: Reached target basic.target - Basic System.
Sep 12 00:25:21.262328 systemd[1692]: Reached target default.target - Main User Target.
Sep 12 00:25:21.262358 systemd[1692]: Startup finished in 154ms.
Sep 12 00:25:21.262734 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 12 00:25:21.264322 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 12 00:25:21.330591 systemd[1]: Started sshd@1-10.0.0.151:22-10.0.0.1:42594.service - OpenSSH per-connection server daemon (10.0.0.1:42594).
Sep 12 00:25:21.389623 sshd[1703]: Accepted publickey for core from 10.0.0.1 port 42594 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:25:21.390901 sshd-session[1703]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:25:21.394978 systemd-logind[1546]: New session 2 of user core.
Sep 12 00:25:21.405818 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 12 00:25:21.457316 sshd[1705]: Connection closed by 10.0.0.1 port 42594
Sep 12 00:25:21.457683 sshd-session[1703]: pam_unix(sshd:session): session closed for user core
Sep 12 00:25:21.466274 systemd[1]: sshd@1-10.0.0.151:22-10.0.0.1:42594.service: Deactivated successfully.
Sep 12 00:25:21.468007 systemd[1]: session-2.scope: Deactivated successfully.
Sep 12 00:25:21.468820 systemd-logind[1546]: Session 2 logged out. Waiting for processes to exit.
Sep 12 00:25:21.471448 systemd[1]: Started sshd@2-10.0.0.151:22-10.0.0.1:42596.service - OpenSSH per-connection server daemon (10.0.0.1:42596).
Sep 12 00:25:21.472308 systemd-logind[1546]: Removed session 2.
Sep 12 00:25:21.529101 sshd[1711]: Accepted publickey for core from 10.0.0.1 port 42596 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:25:21.530299 sshd-session[1711]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:25:21.534430 systemd-logind[1546]: New session 3 of user core.
Sep 12 00:25:21.543824 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 12 00:25:21.591190 sshd[1713]: Connection closed by 10.0.0.1 port 42596
Sep 12 00:25:21.591470 sshd-session[1711]: pam_unix(sshd:session): session closed for user core
Sep 12 00:25:21.607988 systemd[1]: sshd@2-10.0.0.151:22-10.0.0.1:42596.service: Deactivated successfully.
Sep 12 00:25:21.609531 systemd[1]: session-3.scope: Deactivated successfully.
Sep 12 00:25:21.610188 systemd-logind[1546]: Session 3 logged out. Waiting for processes to exit.
Sep 12 00:25:21.612568 systemd[1]: Started sshd@3-10.0.0.151:22-10.0.0.1:42606.service - OpenSSH per-connection server daemon (10.0.0.1:42606).
Sep 12 00:25:21.613298 systemd-logind[1546]: Removed session 3.
Sep 12 00:25:21.664575 sshd[1719]: Accepted publickey for core from 10.0.0.1 port 42606 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:25:21.665819 sshd-session[1719]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:25:21.673465 systemd-logind[1546]: New session 4 of user core.
Sep 12 00:25:21.682813 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 12 00:25:21.733996 sshd[1721]: Connection closed by 10.0.0.1 port 42606
Sep 12 00:25:21.734208 sshd-session[1719]: pam_unix(sshd:session): session closed for user core
Sep 12 00:25:21.742138 systemd[1]: sshd@3-10.0.0.151:22-10.0.0.1:42606.service: Deactivated successfully.
Sep 12 00:25:21.743756 systemd[1]: session-4.scope: Deactivated successfully.
Sep 12 00:25:21.744417 systemd-logind[1546]: Session 4 logged out. Waiting for processes to exit.
Sep 12 00:25:21.746873 systemd[1]: Started sshd@4-10.0.0.151:22-10.0.0.1:42622.service - OpenSSH per-connection server daemon (10.0.0.1:42622).
Sep 12 00:25:21.747425 systemd-logind[1546]: Removed session 4.
Sep 12 00:25:21.804687 sshd[1727]: Accepted publickey for core from 10.0.0.1 port 42622 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:25:21.806047 sshd-session[1727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:25:21.810370 systemd-logind[1546]: New session 5 of user core.
Sep 12 00:25:21.819833 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 12 00:25:21.877450 sudo[1730]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 12 00:25:21.877779 sudo[1730]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 00:25:21.894308 sudo[1730]: pam_unix(sudo:session): session closed for user root
Sep 12 00:25:21.895900 sshd[1729]: Connection closed by 10.0.0.1 port 42622
Sep 12 00:25:21.896043 sshd-session[1727]: pam_unix(sshd:session): session closed for user core
Sep 12 00:25:21.911096 systemd[1]: sshd@4-10.0.0.151:22-10.0.0.1:42622.service: Deactivated successfully.
Sep 12 00:25:21.912569 systemd[1]: session-5.scope: Deactivated successfully.
Sep 12 00:25:21.913346 systemd-logind[1546]: Session 5 logged out. Waiting for processes to exit.
Sep 12 00:25:21.916459 systemd[1]: Started sshd@5-10.0.0.151:22-10.0.0.1:42630.service - OpenSSH per-connection server daemon (10.0.0.1:42630).
Sep 12 00:25:21.917221 systemd-logind[1546]: Removed session 5.
Sep 12 00:25:21.966223 sshd[1736]: Accepted publickey for core from 10.0.0.1 port 42630 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:25:21.967818 sshd-session[1736]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:25:21.972989 systemd-logind[1546]: New session 6 of user core.
Sep 12 00:25:21.990932 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 12 00:25:22.044843 sudo[1740]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 12 00:25:22.045130 sudo[1740]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 00:25:22.150500 sudo[1740]: pam_unix(sudo:session): session closed for user root
Sep 12 00:25:22.158955 sudo[1739]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 12 00:25:22.159373 sudo[1739]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 00:25:22.188787 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 00:25:22.254032 augenrules[1762]: No rules
Sep 12 00:25:22.256846 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 00:25:22.257219 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 00:25:22.258601 sudo[1739]: pam_unix(sudo:session): session closed for user root
Sep 12 00:25:22.261025 sshd[1738]: Connection closed by 10.0.0.1 port 42630
Sep 12 00:25:22.263104 sshd-session[1736]: pam_unix(sshd:session): session closed for user core
Sep 12 00:25:22.279376 systemd[1]: sshd@5-10.0.0.151:22-10.0.0.1:42630.service: Deactivated successfully.
Sep 12 00:25:22.284259 systemd[1]: session-6.scope: Deactivated successfully.
Sep 12 00:25:22.285431 systemd-logind[1546]: Session 6 logged out. Waiting for processes to exit.
Sep 12 00:25:22.289515 systemd[1]: Started sshd@6-10.0.0.151:22-10.0.0.1:42638.service - OpenSSH per-connection server daemon (10.0.0.1:42638).
Sep 12 00:25:22.290368 systemd-logind[1546]: Removed session 6.
Sep 12 00:25:22.361380 sshd[1771]: Accepted publickey for core from 10.0.0.1 port 42638 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:25:22.363442 sshd-session[1771]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:25:22.373346 systemd-logind[1546]: New session 7 of user core.
Sep 12 00:25:22.387002 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 12 00:25:22.445625 sudo[1774]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 12 00:25:22.445953 sudo[1774]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 00:25:22.912842 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 12 00:25:22.933115 (dockerd)[1794]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 12 00:25:23.412289 dockerd[1794]: time="2025-09-12T00:25:23.412139264Z" level=info msg="Starting up"
Sep 12 00:25:23.413096 dockerd[1794]: time="2025-09-12T00:25:23.413074327Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 12 00:25:23.807599 dockerd[1794]: time="2025-09-12T00:25:23.807484446Z" level=info msg="Loading containers: start."
Sep 12 00:25:23.816738 kernel: Initializing XFRM netlink socket
Sep 12 00:25:24.049550 systemd-networkd[1493]: docker0: Link UP
Sep 12 00:25:24.055200 dockerd[1794]: time="2025-09-12T00:25:24.055165087Z" level=info msg="Loading containers: done."
Sep 12 00:25:24.075195 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3217202202-merged.mount: Deactivated successfully.
Sep 12 00:25:24.079078 dockerd[1794]: time="2025-09-12T00:25:24.079035935Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 12 00:25:24.079148 dockerd[1794]: time="2025-09-12T00:25:24.079119001Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
Sep 12 00:25:24.079236 dockerd[1794]: time="2025-09-12T00:25:24.079216343Z" level=info msg="Initializing buildkit"
Sep 12 00:25:24.107610 dockerd[1794]: time="2025-09-12T00:25:24.107577861Z" level=info msg="Completed buildkit initialization"
Sep 12 00:25:24.112529 dockerd[1794]: time="2025-09-12T00:25:24.112499339Z" level=info msg="Daemon has completed initialization"
Sep 12 00:25:24.112607 dockerd[1794]: time="2025-09-12T00:25:24.112562187Z" level=info msg="API listen on /run/docker.sock"
Sep 12 00:25:24.112709 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 12 00:25:25.278945 containerd[1555]: time="2025-09-12T00:25:25.278904799Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\""
Sep 12 00:25:25.969075 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2280238740.mount: Deactivated successfully.
Sep 12 00:25:27.171057 containerd[1555]: time="2025-09-12T00:25:27.170991992Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:27.171679 containerd[1555]: time="2025-09-12T00:25:27.171650506Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=28837916"
Sep 12 00:25:27.172889 containerd[1555]: time="2025-09-12T00:25:27.172854253Z" level=info msg="ImageCreate event name:\"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:27.176720 containerd[1555]: time="2025-09-12T00:25:27.175885727Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:27.178485 containerd[1555]: time="2025-09-12T00:25:27.178437171Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"28834515\" in 1.899493059s"
Sep 12 00:25:27.178485 containerd[1555]: time="2025-09-12T00:25:27.178482897Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\""
Sep 12 00:25:27.179091 containerd[1555]: time="2025-09-12T00:25:27.179061472Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\""
Sep 12 00:25:28.601337 containerd[1555]: time="2025-09-12T00:25:28.601278458Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:28.602203 containerd[1555]: time="2025-09-12T00:25:28.602183395Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=24787027"
Sep 12 00:25:28.603406 containerd[1555]: time="2025-09-12T00:25:28.603360753Z" level=info msg="ImageCreate event name:\"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:28.605798 containerd[1555]: time="2025-09-12T00:25:28.605768507Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:28.606620 containerd[1555]: time="2025-09-12T00:25:28.606582153Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"26421706\" in 1.427487329s"
Sep 12 00:25:28.606656 containerd[1555]: time="2025-09-12T00:25:28.606618140Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\""
Sep 12 00:25:28.607068 containerd[1555]: time="2025-09-12T00:25:28.607041975Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\""
Sep 12 00:25:29.161588 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 12 00:25:29.163661 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 00:25:29.393453 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:25:29.398978 (kubelet)[2073]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 00:25:29.637456 kubelet[2073]: E0912 00:25:29.637337 2073 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 00:25:29.643421 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 00:25:29.643603 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 00:25:29.644104 systemd[1]: kubelet.service: Consumed 234ms CPU time, 110.9M memory peak.
Sep 12 00:25:30.202758 containerd[1555]: time="2025-09-12T00:25:30.202714110Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:30.203576 containerd[1555]: time="2025-09-12T00:25:30.203546451Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=19176289"
Sep 12 00:25:30.204583 containerd[1555]: time="2025-09-12T00:25:30.204534433Z" level=info msg="ImageCreate event name:\"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:30.206787 containerd[1555]: time="2025-09-12T00:25:30.206755167Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:30.209387 containerd[1555]: time="2025-09-12T00:25:30.208815500Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"20810986\" in 1.601746304s"
Sep 12 00:25:30.209387 containerd[1555]: time="2025-09-12T00:25:30.208848742Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\""
Sep 12 00:25:30.209517 containerd[1555]: time="2025-09-12T00:25:30.209492269Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\""
Sep 12 00:25:31.358578 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount108056926.mount: Deactivated successfully.
Sep 12 00:25:32.360343 containerd[1555]: time="2025-09-12T00:25:32.360262993Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:32.361101 containerd[1555]: time="2025-09-12T00:25:32.361070798Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=30924206"
Sep 12 00:25:32.362388 containerd[1555]: time="2025-09-12T00:25:32.362337112Z" level=info msg="ImageCreate event name:\"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:32.364139 containerd[1555]: time="2025-09-12T00:25:32.364104206Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:32.364644 containerd[1555]: time="2025-09-12T00:25:32.364609593Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"30923225\" in 2.15508845s"
Sep 12 00:25:32.364644 containerd[1555]: time="2025-09-12T00:25:32.364639680Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\""
Sep 12 00:25:32.365174 containerd[1555]: time="2025-09-12T00:25:32.365068654Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 12 00:25:32.916923 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4150654440.mount: Deactivated successfully.
Sep 12 00:25:34.033868 containerd[1555]: time="2025-09-12T00:25:34.033808711Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:34.034584 containerd[1555]: time="2025-09-12T00:25:34.034519434Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241"
Sep 12 00:25:34.035614 containerd[1555]: time="2025-09-12T00:25:34.035572639Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:34.038033 containerd[1555]: time="2025-09-12T00:25:34.037973911Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:34.038879 containerd[1555]: time="2025-09-12T00:25:34.038850214Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.673742046s"
Sep 12 00:25:34.038879 containerd[1555]: time="2025-09-12T00:25:34.038879249Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 12 00:25:34.039380 containerd[1555]: time="2025-09-12T00:25:34.039347236Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 12 00:25:34.577552 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3363370484.mount: Deactivated successfully.
Sep 12 00:25:34.582493 containerd[1555]: time="2025-09-12T00:25:34.582448521Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 00:25:34.583124 containerd[1555]: time="2025-09-12T00:25:34.583103569Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Sep 12 00:25:34.584151 containerd[1555]: time="2025-09-12T00:25:34.584129793Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 00:25:34.586011 containerd[1555]: time="2025-09-12T00:25:34.585981245Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 00:25:34.586485 containerd[1555]: time="2025-09-12T00:25:34.586452949Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 547.079083ms"
Sep 12 00:25:34.586485 containerd[1555]: time="2025-09-12T00:25:34.586480330Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 12 00:25:34.586977 containerd[1555]: time="2025-09-12T00:25:34.586947166Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 12 00:25:35.121246 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2475603302.mount: Deactivated successfully.
Sep 12 00:25:37.688213 containerd[1555]: time="2025-09-12T00:25:37.688147966Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:37.688912 containerd[1555]: time="2025-09-12T00:25:37.688892032Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056"
Sep 12 00:25:37.690130 containerd[1555]: time="2025-09-12T00:25:37.690106268Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:37.692758 containerd[1555]: time="2025-09-12T00:25:37.692682919Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 00:25:37.693955 containerd[1555]: time="2025-09-12T00:25:37.693917474Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.106930854s"
Sep 12 00:25:37.694008 containerd[1555]: time="2025-09-12T00:25:37.693959633Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\""
Sep 12 00:25:39.761589 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 12 00:25:39.763259 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 00:25:39.968347 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:25:39.983139 (kubelet)[2234]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 00:25:40.049211 kubelet[2234]: E0912 00:25:40.049038 2234 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 00:25:40.054285 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 00:25:40.054597 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 00:25:40.055077 systemd[1]: kubelet.service: Consumed 238ms CPU time, 109.9M memory peak.
Sep 12 00:25:40.301219 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:25:40.301439 systemd[1]: kubelet.service: Consumed 238ms CPU time, 109.9M memory peak.
Sep 12 00:25:40.303623 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 00:25:40.331506 systemd[1]: Reload requested from client PID 2250 ('systemctl') (unit session-7.scope)...
Sep 12 00:25:40.331520 systemd[1]: Reloading...
Sep 12 00:25:40.414723 zram_generator::config[2292]: No configuration found.
Sep 12 00:25:40.987969 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 12 00:25:41.102914 systemd[1]: Reloading finished in 771 ms.
Sep 12 00:25:41.176341 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 12 00:25:41.176453 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 12 00:25:41.176801 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:25:41.176854 systemd[1]: kubelet.service: Consumed 142ms CPU time, 98.2M memory peak.
Sep 12 00:25:41.178541 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 00:25:41.335558 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 00:25:41.348026 (kubelet)[2340]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 00:25:41.385985 kubelet[2340]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 00:25:41.385985 kubelet[2340]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 12 00:25:41.385985 kubelet[2340]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 00:25:41.386339 kubelet[2340]: I0912 00:25:41.386028 2340 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 00:25:41.569751 kubelet[2340]: I0912 00:25:41.569714 2340 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 12 00:25:41.569751 kubelet[2340]: I0912 00:25:41.569740 2340 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 00:25:41.570018 kubelet[2340]: I0912 00:25:41.569994 2340 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 12 00:25:41.600271 kubelet[2340]: E0912 00:25:41.600176 2340 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.151:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.151:6443: connect: connection refused" logger="UnhandledError"
Sep 12 00:25:41.607150 kubelet[2340]: I0912 00:25:41.607123 2340 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 00:25:41.613661 kubelet[2340]: I0912 00:25:41.613628 2340 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 12 00:25:41.618351 kubelet[2340]: I0912 00:25:41.618329 2340 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 00:25:41.620434 kubelet[2340]: I0912 00:25:41.620387 2340 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 00:25:41.620573 kubelet[2340]: I0912 00:25:41.620424 2340 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 00:25:41.620677 kubelet[2340]: I0912 00:25:41.620574 2340 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 00:25:41.620677 kubelet[2340]: I0912 00:25:41.620582 2340 container_manager_linux.go:304] "Creating device plugin manager"
Sep 12 00:25:41.620748 kubelet[2340]: I0912 00:25:41.620725 2340 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 00:25:41.623711 kubelet[2340]: I0912 00:25:41.623676 2340 kubelet.go:446] "Attempting to sync node with API server"
Sep 12 00:25:41.623758 kubelet[2340]: I0912 00:25:41.623714 2340 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 00:25:41.623758 kubelet[2340]: I0912 00:25:41.623734 2340 kubelet.go:352] "Adding apiserver pod source"
Sep 12 00:25:41.623758 kubelet[2340]: I0912 00:25:41.623743 2340 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 00:25:41.627659 kubelet[2340]: W0912 00:25:41.625993 2340 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.151:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.151:6443: connect: connection refused
Sep 12 00:25:41.627659 kubelet[2340]: E0912 00:25:41.626058 2340 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.151:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.151:6443: connect: connection refused" logger="UnhandledError"
Sep 12 00:25:41.627659 kubelet[2340]: W0912 00:25:41.626572 2340 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.151:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.151:6443: connect: connection refused
Sep 12 00:25:41.627659 kubelet[2340]: E0912 00:25:41.626739 2340 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.151:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.151:6443: connect: connection refused" logger="UnhandledError"
Sep 12 00:25:41.628059 kubelet[2340]: I0912 00:25:41.627978 2340 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Sep 12 00:25:41.628430 kubelet[2340]: I0912 00:25:41.628397 2340 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 12 00:25:41.628922 kubelet[2340]: W0912 00:25:41.628889 2340 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 12 00:25:41.633043 kubelet[2340]: I0912 00:25:41.633015 2340 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 12 00:25:41.633089 kubelet[2340]: I0912 00:25:41.633052 2340 server.go:1287] "Started kubelet"
Sep 12 00:25:41.634132 kubelet[2340]: I0912 00:25:41.633803 2340 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 00:25:41.634132 kubelet[2340]: I0912 00:25:41.634093 2340 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 00:25:41.634398 kubelet[2340]: I0912 00:25:41.634378 2340 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 00:25:41.634611 kubelet[2340]: I0912 00:25:41.634596 2340 server.go:479] "Adding debug handlers to kubelet server"
Sep 12 00:25:41.634763 kubelet[2340]: I0912 00:25:41.634748 2340 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 00:25:41.635765 kubelet[2340]: I0912 00:25:41.635745 2340 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 00:25:41.637526 kubelet[2340]: E0912 00:25:41.637498 2340 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 12 00:25:41.637588 kubelet[2340]: I0912 00:25:41.637530 2340 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 12 00:25:41.637675 kubelet[2340]: I0912 00:25:41.637653 2340 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 12 00:25:41.637859 kubelet[2340]: I0912 00:25:41.637845 2340 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 00:25:41.638034 kubelet[2340]: E0912 00:25:41.638011 2340 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.151:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.151:6443: connect: connection refused" interval="200ms"
Sep 12 00:25:41.638136 kubelet[2340]: W0912 00:25:41.638109 2340 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.151:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.151:6443: connect: connection refused
Sep 12 00:25:41.638178 kubelet[2340]: E0912 00:25:41.638143 2340 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.151:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.151:6443: connect: connection refused" logger="UnhandledError"
Sep 12 00:25:41.639099 kubelet[2340]: I0912 00:25:41.639077 2340 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 00:25:41.639644 kubelet[2340]: E0912 00:25:41.638744 2340 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.151:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.151:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1864614ddd363a55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-12 00:25:41.633030741 +0000 UTC m=+0.280538331,LastTimestamp:2025-09-12 00:25:41.633030741 +0000 UTC m=+0.280538331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 12 00:25:41.641235 kubelet[2340]: E0912 00:25:41.641215 2340 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 12 00:25:41.641445 kubelet[2340]: I0912 00:25:41.641249 2340 factory.go:221] Registration of the containerd container factory successfully
Sep 12 00:25:41.641445 kubelet[2340]: I0912 00:25:41.641401 2340 factory.go:221] Registration of the systemd container factory successfully
Sep 12 00:25:41.652527 kubelet[2340]: I0912 00:25:41.652505 2340 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 12 00:25:41.652527 kubelet[2340]: I0912 00:25:41.652519 2340 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 12 00:25:41.652527 kubelet[2340]: I0912 00:25:41.652533 2340 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 00:25:41.654152 kubelet[2340]: I0912 00:25:41.654120 2340 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 12 00:25:41.655260 kubelet[2340]: I0912 00:25:41.655239 2340 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 12 00:25:41.655260 kubelet[2340]: I0912 00:25:41.655258 2340 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 12 00:25:41.655314 kubelet[2340]: I0912 00:25:41.655281 2340 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 12 00:25:41.655314 kubelet[2340]: I0912 00:25:41.655289 2340 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 12 00:25:41.655367 kubelet[2340]: E0912 00:25:41.655332 2340 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 12 00:25:41.660082 kubelet[2340]: W0912 00:25:41.659577 2340 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.151:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.151:6443: connect: connection refused
Sep 12 00:25:41.660082 kubelet[2340]: E0912 00:25:41.659630 2340 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.151:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.151:6443: connect: connection refused" logger="UnhandledError"
Sep 12 00:25:41.738571 kubelet[2340]: E0912 00:25:41.738532 2340 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 12 00:25:41.755777 kubelet[2340]: E0912 00:25:41.755729 2340 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 12 00:25:41.839065 kubelet[2340]: E0912 00:25:41.839035 2340 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 12 00:25:41.839238 kubelet[2340]: E0912 00:25:41.839212 2340 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.151:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.151:6443: connect: connection refused" interval="400ms"
Sep 12 00:25:41.940061 kubelet[2340]: E0912 00:25:41.939981 2340 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 12 00:25:41.956170 kubelet[2340]: E0912 00:25:41.956152 2340 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 12 00:25:42.040596 kubelet[2340]: E0912 00:25:42.040576 2340 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 12 00:25:42.063268 kubelet[2340]: I0912 00:25:42.063236 2340 policy_none.go:49] "None policy: Start"
Sep 12 00:25:42.063268 kubelet[2340]: I0912 00:25:42.063254 2340 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 12 00:25:42.063268 kubelet[2340]: I0912 00:25:42.063265 2340 state_mem.go:35] "Initializing new in-memory state store"
Sep 12 00:25:42.074067 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 12 00:25:42.087752 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 12 00:25:42.112318 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 12 00:25:42.113968 kubelet[2340]: I0912 00:25:42.113520 2340 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 12 00:25:42.113968 kubelet[2340]: I0912 00:25:42.113713 2340 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 12 00:25:42.113968 kubelet[2340]: I0912 00:25:42.113722 2340 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 12 00:25:42.114059 kubelet[2340]: I0912 00:25:42.114009 2340 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 12 00:25:42.114684 kubelet[2340]: E0912 00:25:42.114656 2340 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 12 00:25:42.114748 kubelet[2340]: E0912 00:25:42.114716 2340 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Sep 12 00:25:42.215546 kubelet[2340]: I0912 00:25:42.215472 2340 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Sep 12 00:25:42.215875 kubelet[2340]: E0912 00:25:42.215840 2340 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.151:6443/api/v1/nodes\": dial tcp 10.0.0.151:6443: connect: connection refused" node="localhost"
Sep 12 00:25:42.240395 kubelet[2340]: E0912 00:25:42.240364 2340 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.151:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.151:6443: connect: connection refused" interval="800ms"
Sep 12 00:25:42.365418 systemd[1]: Created slice kubepods-burstable-podbd5c44267d391565958c4f27244eea48.slice - libcontainer container kubepods-burstable-podbd5c44267d391565958c4f27244eea48.slice.
Sep 12 00:25:42.383236 kubelet[2340]: E0912 00:25:42.383197 2340 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:25:42.387111 systemd[1]: Created slice kubepods-burstable-pod1403266a9792debaa127cd8df7a81c3c.slice - libcontainer container kubepods-burstable-pod1403266a9792debaa127cd8df7a81c3c.slice. Sep 12 00:25:42.389195 kubelet[2340]: E0912 00:25:42.389177 2340 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:25:42.391140 systemd[1]: Created slice kubepods-burstable-pod72a30db4fc25e4da65a3b99eba43be94.slice - libcontainer container kubepods-burstable-pod72a30db4fc25e4da65a3b99eba43be94.slice. Sep 12 00:25:42.392771 kubelet[2340]: E0912 00:25:42.392747 2340 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:25:42.417568 kubelet[2340]: I0912 00:25:42.417550 2340 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 00:25:42.417921 kubelet[2340]: E0912 00:25:42.417887 2340 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.151:6443/api/v1/nodes\": dial tcp 10.0.0.151:6443: connect: connection refused" node="localhost" Sep 12 00:25:42.442295 kubelet[2340]: I0912 00:25:42.442262 2340 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bd5c44267d391565958c4f27244eea48-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"bd5c44267d391565958c4f27244eea48\") " pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:42.442344 kubelet[2340]: I0912 00:25:42.442289 2340 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:42.442344 kubelet[2340]: I0912 00:25:42.442318 2340 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:42.442432 kubelet[2340]: I0912 00:25:42.442351 2340 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:42.442432 kubelet[2340]: I0912 00:25:42.442407 2340 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:42.442492 kubelet[2340]: I0912 00:25:42.442431 2340 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/72a30db4fc25e4da65a3b99eba43be94-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"72a30db4fc25e4da65a3b99eba43be94\") " pod="kube-system/kube-scheduler-localhost" Sep 12 00:25:42.442492 kubelet[2340]: I0912 00:25:42.442451 2340 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bd5c44267d391565958c4f27244eea48-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"bd5c44267d391565958c4f27244eea48\") " pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:42.442492 kubelet[2340]: I0912 00:25:42.442470 2340 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bd5c44267d391565958c4f27244eea48-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"bd5c44267d391565958c4f27244eea48\") " pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:42.442492 kubelet[2340]: I0912 00:25:42.442489 2340 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:42.684648 containerd[1555]: time="2025-09-12T00:25:42.684596316Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:bd5c44267d391565958c4f27244eea48,Namespace:kube-system,Attempt:0,}" Sep 12 00:25:42.690058 containerd[1555]: time="2025-09-12T00:25:42.690031467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:1403266a9792debaa127cd8df7a81c3c,Namespace:kube-system,Attempt:0,}" Sep 12 00:25:42.693525 containerd[1555]: time="2025-09-12T00:25:42.693494039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:72a30db4fc25e4da65a3b99eba43be94,Namespace:kube-system,Attempt:0,}" Sep 12 00:25:42.768436 kubelet[2340]: E0912 00:25:42.768345 2340 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.151:6443/api/v1/namespaces/default/events\": dial tcp 
10.0.0.151:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1864614ddd363a55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-12 00:25:41.633030741 +0000 UTC m=+0.280538331,LastTimestamp:2025-09-12 00:25:41.633030741 +0000 UTC m=+0.280538331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 12 00:25:42.819195 kubelet[2340]: I0912 00:25:42.819157 2340 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 00:25:42.819401 kubelet[2340]: E0912 00:25:42.819370 2340 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.151:6443/api/v1/nodes\": dial tcp 10.0.0.151:6443: connect: connection refused" node="localhost" Sep 12 00:25:42.925947 kubelet[2340]: W0912 00:25:42.925853 2340 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.151:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.151:6443: connect: connection refused Sep 12 00:25:42.926238 kubelet[2340]: W0912 00:25:42.926160 2340 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.151:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.151:6443: connect: connection refused Sep 12 00:25:42.926238 kubelet[2340]: E0912 00:25:42.926214 2340 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://10.0.0.151:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.151:6443: connect: connection refused" logger="UnhandledError" Sep 12 00:25:42.926399 kubelet[2340]: E0912 00:25:42.926369 2340 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.151:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.151:6443: connect: connection refused" logger="UnhandledError" Sep 12 00:25:42.936534 containerd[1555]: time="2025-09-12T00:25:42.936395469Z" level=info msg="connecting to shim 27cb32c133ec782d2693a483bcf52c7332b191fe97b41ea5e5375aedae1a0eba" address="unix:///run/containerd/s/440e1310c667a05039b06418666a849d1c7279cb0dc5ffaaac9ec6fe096d422c" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:25:42.937782 containerd[1555]: time="2025-09-12T00:25:42.937752343Z" level=info msg="connecting to shim 43e092902b2a764eb97c0b3e98c57d399dc79f6e45a4da873a8e88b11ed99bea" address="unix:///run/containerd/s/690c7d4aa1b132f8a22282b4261f047b50c6f282562a94d0d1d442c7ef783d16" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:25:42.938495 containerd[1555]: time="2025-09-12T00:25:42.938416699Z" level=info msg="connecting to shim 04aa630fb17cbe61f97112945c5ae95180cfe4bae432c74967dce2be2a26f75e" address="unix:///run/containerd/s/8d028b76353b2c76dc2b06dce20574435751b5d9758b3054424dcd8aa8b7da3c" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:25:42.971926 systemd[1]: Started cri-containerd-27cb32c133ec782d2693a483bcf52c7332b191fe97b41ea5e5375aedae1a0eba.scope - libcontainer container 27cb32c133ec782d2693a483bcf52c7332b191fe97b41ea5e5375aedae1a0eba. Sep 12 00:25:42.977080 systemd[1]: Started cri-containerd-43e092902b2a764eb97c0b3e98c57d399dc79f6e45a4da873a8e88b11ed99bea.scope - libcontainer container 43e092902b2a764eb97c0b3e98c57d399dc79f6e45a4da873a8e88b11ed99bea. 
Sep 12 00:25:42.981776 systemd[1]: Started cri-containerd-04aa630fb17cbe61f97112945c5ae95180cfe4bae432c74967dce2be2a26f75e.scope - libcontainer container 04aa630fb17cbe61f97112945c5ae95180cfe4bae432c74967dce2be2a26f75e. Sep 12 00:25:43.054964 kubelet[2340]: E0912 00:25:43.040796 2340 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.151:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.151:6443: connect: connection refused" interval="1.6s" Sep 12 00:25:43.054964 kubelet[2340]: W0912 00:25:43.054200 2340 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.151:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.151:6443: connect: connection refused Sep 12 00:25:43.054964 kubelet[2340]: E0912 00:25:43.054291 2340 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.151:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.151:6443: connect: connection refused" logger="UnhandledError" Sep 12 00:25:43.055120 containerd[1555]: time="2025-09-12T00:25:43.043469323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:bd5c44267d391565958c4f27244eea48,Namespace:kube-system,Attempt:0,} returns sandbox id \"04aa630fb17cbe61f97112945c5ae95180cfe4bae432c74967dce2be2a26f75e\"" Sep 12 00:25:43.055120 containerd[1555]: time="2025-09-12T00:25:43.045976393Z" level=info msg="CreateContainer within sandbox \"04aa630fb17cbe61f97112945c5ae95180cfe4bae432c74967dce2be2a26f75e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 00:25:43.061249 containerd[1555]: time="2025-09-12T00:25:43.061219995Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:72a30db4fc25e4da65a3b99eba43be94,Namespace:kube-system,Attempt:0,} returns sandbox id \"43e092902b2a764eb97c0b3e98c57d399dc79f6e45a4da873a8e88b11ed99bea\"" Sep 12 00:25:43.064143 containerd[1555]: time="2025-09-12T00:25:43.064110204Z" level=info msg="CreateContainer within sandbox \"43e092902b2a764eb97c0b3e98c57d399dc79f6e45a4da873a8e88b11ed99bea\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 00:25:43.208818 kubelet[2340]: W0912 00:25:43.208330 2340 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.151:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.151:6443: connect: connection refused Sep 12 00:25:43.208818 kubelet[2340]: E0912 00:25:43.208397 2340 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.151:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.151:6443: connect: connection refused" logger="UnhandledError" Sep 12 00:25:43.221768 containerd[1555]: time="2025-09-12T00:25:43.221737776Z" level=info msg="Container f6d3405ca4e5e22233736b1d6562d596402fbd355f08268ba045be808eb3a35c: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:25:43.223030 containerd[1555]: time="2025-09-12T00:25:43.222931855Z" level=info msg="Container 8c01be7201442007188cb9253ffeb3a1b6dfd10aca051886c0ab70fe7585e5ee: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:25:43.232234 containerd[1555]: time="2025-09-12T00:25:43.232174665Z" level=info msg="CreateContainer within sandbox \"04aa630fb17cbe61f97112945c5ae95180cfe4bae432c74967dce2be2a26f75e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f6d3405ca4e5e22233736b1d6562d596402fbd355f08268ba045be808eb3a35c\"" Sep 12 00:25:43.233523 containerd[1555]: 
time="2025-09-12T00:25:43.233488549Z" level=info msg="StartContainer for \"f6d3405ca4e5e22233736b1d6562d596402fbd355f08268ba045be808eb3a35c\"" Sep 12 00:25:43.234912 containerd[1555]: time="2025-09-12T00:25:43.234869838Z" level=info msg="connecting to shim f6d3405ca4e5e22233736b1d6562d596402fbd355f08268ba045be808eb3a35c" address="unix:///run/containerd/s/8d028b76353b2c76dc2b06dce20574435751b5d9758b3054424dcd8aa8b7da3c" protocol=ttrpc version=3 Sep 12 00:25:43.235580 containerd[1555]: time="2025-09-12T00:25:43.235545956Z" level=info msg="CreateContainer within sandbox \"43e092902b2a764eb97c0b3e98c57d399dc79f6e45a4da873a8e88b11ed99bea\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8c01be7201442007188cb9253ffeb3a1b6dfd10aca051886c0ab70fe7585e5ee\"" Sep 12 00:25:43.236553 containerd[1555]: time="2025-09-12T00:25:43.236514813Z" level=info msg="StartContainer for \"8c01be7201442007188cb9253ffeb3a1b6dfd10aca051886c0ab70fe7585e5ee\"" Sep 12 00:25:43.236761 containerd[1555]: time="2025-09-12T00:25:43.236741979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:1403266a9792debaa127cd8df7a81c3c,Namespace:kube-system,Attempt:0,} returns sandbox id \"27cb32c133ec782d2693a483bcf52c7332b191fe97b41ea5e5375aedae1a0eba\"" Sep 12 00:25:43.237459 containerd[1555]: time="2025-09-12T00:25:43.237429317Z" level=info msg="connecting to shim 8c01be7201442007188cb9253ffeb3a1b6dfd10aca051886c0ab70fe7585e5ee" address="unix:///run/containerd/s/690c7d4aa1b132f8a22282b4261f047b50c6f282562a94d0d1d442c7ef783d16" protocol=ttrpc version=3 Sep 12 00:25:43.239783 containerd[1555]: time="2025-09-12T00:25:43.239752584Z" level=info msg="CreateContainer within sandbox \"27cb32c133ec782d2693a483bcf52c7332b191fe97b41ea5e5375aedae1a0eba\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 00:25:43.249168 containerd[1555]: time="2025-09-12T00:25:43.248583291Z" level=info msg="Container 
7e640deb8afaeb752a54c3390cc5932c5417ffcdbae750c2f9235ad5626f4144: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:25:43.254911 containerd[1555]: time="2025-09-12T00:25:43.254881420Z" level=info msg="CreateContainer within sandbox \"27cb32c133ec782d2693a483bcf52c7332b191fe97b41ea5e5375aedae1a0eba\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"7e640deb8afaeb752a54c3390cc5932c5417ffcdbae750c2f9235ad5626f4144\"" Sep 12 00:25:43.255739 containerd[1555]: time="2025-09-12T00:25:43.255259699Z" level=info msg="StartContainer for \"7e640deb8afaeb752a54c3390cc5932c5417ffcdbae750c2f9235ad5626f4144\"" Sep 12 00:25:43.256177 containerd[1555]: time="2025-09-12T00:25:43.256142905Z" level=info msg="connecting to shim 7e640deb8afaeb752a54c3390cc5932c5417ffcdbae750c2f9235ad5626f4144" address="unix:///run/containerd/s/440e1310c667a05039b06418666a849d1c7279cb0dc5ffaaac9ec6fe096d422c" protocol=ttrpc version=3 Sep 12 00:25:43.263928 systemd[1]: Started cri-containerd-f6d3405ca4e5e22233736b1d6562d596402fbd355f08268ba045be808eb3a35c.scope - libcontainer container f6d3405ca4e5e22233736b1d6562d596402fbd355f08268ba045be808eb3a35c. Sep 12 00:25:43.272936 systemd[1]: Started cri-containerd-8c01be7201442007188cb9253ffeb3a1b6dfd10aca051886c0ab70fe7585e5ee.scope - libcontainer container 8c01be7201442007188cb9253ffeb3a1b6dfd10aca051886c0ab70fe7585e5ee. Sep 12 00:25:43.283815 systemd[1]: Started cri-containerd-7e640deb8afaeb752a54c3390cc5932c5417ffcdbae750c2f9235ad5626f4144.scope - libcontainer container 7e640deb8afaeb752a54c3390cc5932c5417ffcdbae750c2f9235ad5626f4144. 
Sep 12 00:25:43.345683 containerd[1555]: time="2025-09-12T00:25:43.345584789Z" level=info msg="StartContainer for \"7e640deb8afaeb752a54c3390cc5932c5417ffcdbae750c2f9235ad5626f4144\" returns successfully" Sep 12 00:25:43.349508 containerd[1555]: time="2025-09-12T00:25:43.349477378Z" level=info msg="StartContainer for \"f6d3405ca4e5e22233736b1d6562d596402fbd355f08268ba045be808eb3a35c\" returns successfully" Sep 12 00:25:43.358773 containerd[1555]: time="2025-09-12T00:25:43.358731770Z" level=info msg="StartContainer for \"8c01be7201442007188cb9253ffeb3a1b6dfd10aca051886c0ab70fe7585e5ee\" returns successfully" Sep 12 00:25:43.622237 kubelet[2340]: I0912 00:25:43.622194 2340 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 00:25:43.706213 kubelet[2340]: E0912 00:25:43.706163 2340 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:25:43.707421 kubelet[2340]: E0912 00:25:43.707396 2340 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:25:43.710024 kubelet[2340]: E0912 00:25:43.709879 2340 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:25:44.712043 kubelet[2340]: E0912 00:25:44.712001 2340 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:25:44.712435 kubelet[2340]: E0912 00:25:44.712350 2340 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Sep 12 00:25:44.898738 kubelet[2340]: E0912 00:25:44.898683 2340 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes 
\"localhost\" not found" node="localhost" Sep 12 00:25:45.092589 kubelet[2340]: I0912 00:25:45.092538 2340 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 00:25:45.092589 kubelet[2340]: E0912 00:25:45.092579 2340 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 12 00:25:45.106109 kubelet[2340]: E0912 00:25:45.106057 2340 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 12 00:25:45.138027 kubelet[2340]: I0912 00:25:45.137973 2340 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 00:25:45.141602 kubelet[2340]: E0912 00:25:45.141574 2340 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Sep 12 00:25:45.141602 kubelet[2340]: I0912 00:25:45.141599 2340 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:45.143170 kubelet[2340]: E0912 00:25:45.143138 2340 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:45.143170 kubelet[2340]: I0912 00:25:45.143167 2340 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:45.144460 kubelet[2340]: E0912 00:25:45.144440 2340 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:45.628984 kubelet[2340]: I0912 00:25:45.628950 2340 apiserver.go:52] "Watching apiserver" Sep 12 
00:25:45.638299 kubelet[2340]: I0912 00:25:45.638258 2340 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 00:25:45.711938 kubelet[2340]: I0912 00:25:45.711908 2340 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:45.713445 kubelet[2340]: E0912 00:25:45.713415 2340 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:46.776713 systemd[1]: Reload requested from client PID 2609 ('systemctl') (unit session-7.scope)... Sep 12 00:25:46.776725 systemd[1]: Reloading... Sep 12 00:25:46.859733 zram_generator::config[2664]: No configuration found. Sep 12 00:25:46.934631 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 12 00:25:47.062782 systemd[1]: Reloading finished in 285 ms. Sep 12 00:25:47.086260 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 00:25:47.094008 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 00:25:47.094319 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 00:25:47.094368 systemd[1]: kubelet.service: Consumed 732ms CPU time, 133.7M memory peak. Sep 12 00:25:47.096099 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 00:25:47.294106 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 00:25:47.303044 (kubelet)[2697]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 00:25:47.551715 kubelet[2697]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 00:25:47.551715 kubelet[2697]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 00:25:47.551715 kubelet[2697]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 00:25:47.552250 kubelet[2697]: I0912 00:25:47.551808 2697 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 00:25:47.558350 kubelet[2697]: I0912 00:25:47.558313 2697 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 00:25:47.558350 kubelet[2697]: I0912 00:25:47.558338 2697 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 00:25:47.558628 kubelet[2697]: I0912 00:25:47.558606 2697 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 00:25:47.559880 kubelet[2697]: I0912 00:25:47.559852 2697 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 12 00:25:47.563630 kubelet[2697]: I0912 00:25:47.563610 2697 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 00:25:47.569733 kubelet[2697]: I0912 00:25:47.569637 2697 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 00:25:47.574460 kubelet[2697]: I0912 00:25:47.574434 2697 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 12 00:25:47.574734 kubelet[2697]: I0912 00:25:47.574689 2697 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 00:25:47.574900 kubelet[2697]: I0912 00:25:47.574731 2697 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManag
erPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 00:25:47.575004 kubelet[2697]: I0912 00:25:47.574906 2697 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 00:25:47.575004 kubelet[2697]: I0912 00:25:47.574916 2697 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 00:25:47.575004 kubelet[2697]: I0912 00:25:47.574972 2697 state_mem.go:36] "Initialized new in-memory state store" Sep 12 00:25:47.575159 kubelet[2697]: I0912 00:25:47.575126 2697 kubelet.go:446] "Attempting to sync node with API server" Sep 12 00:25:47.575159 kubelet[2697]: I0912 00:25:47.575153 2697 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 00:25:47.575320 kubelet[2697]: I0912 00:25:47.575181 2697 kubelet.go:352] "Adding apiserver pod source" Sep 12 00:25:47.575320 kubelet[2697]: I0912 00:25:47.575193 2697 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 00:25:47.576057 kubelet[2697]: I0912 00:25:47.576035 2697 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Sep 12 00:25:47.576358 kubelet[2697]: I0912 00:25:47.576336 2697 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 00:25:47.576803 kubelet[2697]: I0912 00:25:47.576787 2697 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 00:25:47.576846 kubelet[2697]: I0912 00:25:47.576825 2697 server.go:1287] "Started kubelet" Sep 12 00:25:47.577114 kubelet[2697]: I0912 00:25:47.577006 2697 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 00:25:47.581718 kubelet[2697]: I0912 00:25:47.579541 
2697 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 00:25:47.584812 kubelet[2697]: I0912 00:25:47.577112 2697 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 00:25:47.585122 kubelet[2697]: I0912 00:25:47.585100 2697 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 00:25:47.585647 kubelet[2697]: I0912 00:25:47.585625 2697 server.go:479] "Adding debug handlers to kubelet server" Sep 12 00:25:47.588399 kubelet[2697]: I0912 00:25:47.588361 2697 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 00:25:47.588544 kubelet[2697]: I0912 00:25:47.588523 2697 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 00:25:47.590885 kubelet[2697]: I0912 00:25:47.590862 2697 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 00:25:47.591025 kubelet[2697]: I0912 00:25:47.591007 2697 reconciler.go:26] "Reconciler: start to sync state" Sep 12 00:25:47.591756 kubelet[2697]: I0912 00:25:47.591735 2697 factory.go:221] Registration of the systemd container factory successfully Sep 12 00:25:47.591875 kubelet[2697]: I0912 00:25:47.591845 2697 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 00:25:47.595423 kubelet[2697]: I0912 00:25:47.595399 2697 factory.go:221] Registration of the containerd container factory successfully Sep 12 00:25:47.596340 kubelet[2697]: E0912 00:25:47.596188 2697 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 00:25:47.598853 kubelet[2697]: I0912 00:25:47.598718 2697 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 00:25:47.599975 kubelet[2697]: I0912 00:25:47.599961 2697 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 00:25:47.600078 kubelet[2697]: I0912 00:25:47.600067 2697 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 00:25:47.600148 kubelet[2697]: I0912 00:25:47.600138 2697 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 12 00:25:47.600194 kubelet[2697]: I0912 00:25:47.600186 2697 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 00:25:47.600295 kubelet[2697]: E0912 00:25:47.600270 2697 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 00:25:47.627826 kubelet[2697]: I0912 00:25:47.627779 2697 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 00:25:47.627826 kubelet[2697]: I0912 00:25:47.627795 2697 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 00:25:47.627826 kubelet[2697]: I0912 00:25:47.627811 2697 state_mem.go:36] "Initialized new in-memory state store" Sep 12 00:25:47.628007 kubelet[2697]: I0912 00:25:47.627957 2697 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 00:25:47.628007 kubelet[2697]: I0912 00:25:47.627968 2697 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 00:25:47.628007 kubelet[2697]: I0912 00:25:47.627987 2697 policy_none.go:49] "None policy: Start" Sep 12 00:25:47.628007 kubelet[2697]: I0912 00:25:47.627999 2697 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 00:25:47.628091 kubelet[2697]: I0912 00:25:47.628011 2697 state_mem.go:35] "Initializing new in-memory state 
store" Sep 12 00:25:47.628114 kubelet[2697]: I0912 00:25:47.628095 2697 state_mem.go:75] "Updated machine memory state" Sep 12 00:25:47.634870 kubelet[2697]: I0912 00:25:47.634847 2697 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 00:25:47.635158 kubelet[2697]: I0912 00:25:47.635135 2697 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 00:25:47.635318 kubelet[2697]: I0912 00:25:47.635153 2697 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 00:25:47.635498 kubelet[2697]: I0912 00:25:47.635414 2697 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 00:25:47.637826 kubelet[2697]: E0912 00:25:47.637806 2697 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 12 00:25:47.700867 kubelet[2697]: I0912 00:25:47.700817 2697 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Sep 12 00:25:47.700867 kubelet[2697]: I0912 00:25:47.700839 2697 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:47.701009 kubelet[2697]: I0912 00:25:47.700982 2697 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:47.738823 kubelet[2697]: I0912 00:25:47.738784 2697 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Sep 12 00:25:47.744090 kubelet[2697]: I0912 00:25:47.744052 2697 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Sep 12 00:25:47.744236 kubelet[2697]: I0912 00:25:47.744115 2697 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Sep 12 00:25:47.792354 kubelet[2697]: I0912 00:25:47.792322 2697 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bd5c44267d391565958c4f27244eea48-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"bd5c44267d391565958c4f27244eea48\") " pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:47.792354 kubelet[2697]: I0912 00:25:47.792353 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:47.792354 kubelet[2697]: I0912 00:25:47.792372 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bd5c44267d391565958c4f27244eea48-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"bd5c44267d391565958c4f27244eea48\") " pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:47.792515 kubelet[2697]: I0912 00:25:47.792387 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bd5c44267d391565958c4f27244eea48-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"bd5c44267d391565958c4f27244eea48\") " pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:47.792515 kubelet[2697]: I0912 00:25:47.792403 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:47.792515 kubelet[2697]: I0912 00:25:47.792419 2697 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:47.792515 kubelet[2697]: I0912 00:25:47.792434 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:47.792515 kubelet[2697]: I0912 00:25:47.792489 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1403266a9792debaa127cd8df7a81c3c-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"1403266a9792debaa127cd8df7a81c3c\") " pod="kube-system/kube-controller-manager-localhost" Sep 12 00:25:47.792750 kubelet[2697]: I0912 00:25:47.792504 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/72a30db4fc25e4da65a3b99eba43be94-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"72a30db4fc25e4da65a3b99eba43be94\") " pod="kube-system/kube-scheduler-localhost" Sep 12 00:25:48.576440 kubelet[2697]: I0912 00:25:48.576395 2697 apiserver.go:52] "Watching apiserver" Sep 12 00:25:48.591532 kubelet[2697]: I0912 00:25:48.591507 2697 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 00:25:48.613061 kubelet[2697]: I0912 00:25:48.613033 2697 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:48.695407 kubelet[2697]: E0912 
00:25:48.695363 2697 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 12 00:25:48.719888 kubelet[2697]: I0912 00:25:48.719715 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.719684845 podStartE2EDuration="1.719684845s" podCreationTimestamp="2025-09-12 00:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:25:48.719643618 +0000 UTC m=+1.412909967" watchObservedRunningTime="2025-09-12 00:25:48.719684845 +0000 UTC m=+1.412951184" Sep 12 00:25:48.719888 kubelet[2697]: I0912 00:25:48.719819 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.719815029 podStartE2EDuration="1.719815029s" podCreationTimestamp="2025-09-12 00:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:25:48.712442917 +0000 UTC m=+1.405709266" watchObservedRunningTime="2025-09-12 00:25:48.719815029 +0000 UTC m=+1.413081378" Sep 12 00:25:48.730018 kubelet[2697]: I0912 00:25:48.729954 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.7299375989999999 podStartE2EDuration="1.729937599s" podCreationTimestamp="2025-09-12 00:25:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:25:48.729292069 +0000 UTC m=+1.422558418" watchObservedRunningTime="2025-09-12 00:25:48.729937599 +0000 UTC m=+1.423203948" Sep 12 00:25:51.830066 kubelet[2697]: I0912 00:25:51.830034 2697 kuberuntime_manager.go:1702] "Updating runtime config through cri with 
podcidr" CIDR="192.168.0.0/24" Sep 12 00:25:51.830506 containerd[1555]: time="2025-09-12T00:25:51.830409448Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 00:25:51.830826 kubelet[2697]: I0912 00:25:51.830565 2697 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 00:25:52.706763 systemd[1]: Created slice kubepods-besteffort-pod05ba2f2a_2096_42b1_a86b_675ea5b2cb87.slice - libcontainer container kubepods-besteffort-pod05ba2f2a_2096_42b1_a86b_675ea5b2cb87.slice. Sep 12 00:25:52.723751 kubelet[2697]: I0912 00:25:52.723681 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjqpn\" (UniqueName: \"kubernetes.io/projected/05ba2f2a-2096-42b1-a86b-675ea5b2cb87-kube-api-access-cjqpn\") pod \"kube-proxy-t6lgz\" (UID: \"05ba2f2a-2096-42b1-a86b-675ea5b2cb87\") " pod="kube-system/kube-proxy-t6lgz" Sep 12 00:25:52.723887 kubelet[2697]: I0912 00:25:52.723761 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/05ba2f2a-2096-42b1-a86b-675ea5b2cb87-kube-proxy\") pod \"kube-proxy-t6lgz\" (UID: \"05ba2f2a-2096-42b1-a86b-675ea5b2cb87\") " pod="kube-system/kube-proxy-t6lgz" Sep 12 00:25:52.723887 kubelet[2697]: I0912 00:25:52.723792 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/05ba2f2a-2096-42b1-a86b-675ea5b2cb87-xtables-lock\") pod \"kube-proxy-t6lgz\" (UID: \"05ba2f2a-2096-42b1-a86b-675ea5b2cb87\") " pod="kube-system/kube-proxy-t6lgz" Sep 12 00:25:52.723887 kubelet[2697]: I0912 00:25:52.723808 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/05ba2f2a-2096-42b1-a86b-675ea5b2cb87-lib-modules\") 
pod \"kube-proxy-t6lgz\" (UID: \"05ba2f2a-2096-42b1-a86b-675ea5b2cb87\") " pod="kube-system/kube-proxy-t6lgz" Sep 12 00:25:52.918652 systemd[1]: Created slice kubepods-besteffort-pod37c292c9_46ba_40a4_8a54_9013b4dfffbd.slice - libcontainer container kubepods-besteffort-pod37c292c9_46ba_40a4_8a54_9013b4dfffbd.slice. Sep 12 00:25:52.924846 kubelet[2697]: I0912 00:25:52.924808 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/37c292c9-46ba-40a4-8a54-9013b4dfffbd-var-lib-calico\") pod \"tigera-operator-755d956888-k4hzw\" (UID: \"37c292c9-46ba-40a4-8a54-9013b4dfffbd\") " pod="tigera-operator/tigera-operator-755d956888-k4hzw" Sep 12 00:25:52.924846 kubelet[2697]: I0912 00:25:52.924840 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf5t7\" (UniqueName: \"kubernetes.io/projected/37c292c9-46ba-40a4-8a54-9013b4dfffbd-kube-api-access-tf5t7\") pod \"tigera-operator-755d956888-k4hzw\" (UID: \"37c292c9-46ba-40a4-8a54-9013b4dfffbd\") " pod="tigera-operator/tigera-operator-755d956888-k4hzw" Sep 12 00:25:53.018223 containerd[1555]: time="2025-09-12T00:25:53.018193491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-t6lgz,Uid:05ba2f2a-2096-42b1-a86b-675ea5b2cb87,Namespace:kube-system,Attempt:0,}" Sep 12 00:25:53.046401 containerd[1555]: time="2025-09-12T00:25:53.046359619Z" level=info msg="connecting to shim 9f66849d7b04a86cee886d938e864f6801c9b2c55492a86725ae402d27c7f401" address="unix:///run/containerd/s/30a6b3f74a2e82ff5a60e65cbe25c135421fbc2e72a40fcaae640138da795c86" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:25:53.078855 systemd[1]: Started cri-containerd-9f66849d7b04a86cee886d938e864f6801c9b2c55492a86725ae402d27c7f401.scope - libcontainer container 9f66849d7b04a86cee886d938e864f6801c9b2c55492a86725ae402d27c7f401. 
Sep 12 00:25:53.101413 containerd[1555]: time="2025-09-12T00:25:53.101379091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-t6lgz,Uid:05ba2f2a-2096-42b1-a86b-675ea5b2cb87,Namespace:kube-system,Attempt:0,} returns sandbox id \"9f66849d7b04a86cee886d938e864f6801c9b2c55492a86725ae402d27c7f401\"" Sep 12 00:25:53.104038 containerd[1555]: time="2025-09-12T00:25:53.104018524Z" level=info msg="CreateContainer within sandbox \"9f66849d7b04a86cee886d938e864f6801c9b2c55492a86725ae402d27c7f401\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 00:25:53.114457 containerd[1555]: time="2025-09-12T00:25:53.114420605Z" level=info msg="Container 4957c7bd99d70cc206df0bcd2cd1cef978c92065af9f6a099ad5364c1272c001: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:25:53.121640 containerd[1555]: time="2025-09-12T00:25:53.121608799Z" level=info msg="CreateContainer within sandbox \"9f66849d7b04a86cee886d938e864f6801c9b2c55492a86725ae402d27c7f401\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"4957c7bd99d70cc206df0bcd2cd1cef978c92065af9f6a099ad5364c1272c001\"" Sep 12 00:25:53.122070 containerd[1555]: time="2025-09-12T00:25:53.122047656Z" level=info msg="StartContainer for \"4957c7bd99d70cc206df0bcd2cd1cef978c92065af9f6a099ad5364c1272c001\"" Sep 12 00:25:53.124227 containerd[1555]: time="2025-09-12T00:25:53.124149152Z" level=info msg="connecting to shim 4957c7bd99d70cc206df0bcd2cd1cef978c92065af9f6a099ad5364c1272c001" address="unix:///run/containerd/s/30a6b3f74a2e82ff5a60e65cbe25c135421fbc2e72a40fcaae640138da795c86" protocol=ttrpc version=3 Sep 12 00:25:53.157835 systemd[1]: Started cri-containerd-4957c7bd99d70cc206df0bcd2cd1cef978c92065af9f6a099ad5364c1272c001.scope - libcontainer container 4957c7bd99d70cc206df0bcd2cd1cef978c92065af9f6a099ad5364c1272c001. 
Sep 12 00:25:53.198061 containerd[1555]: time="2025-09-12T00:25:53.198015896Z" level=info msg="StartContainer for \"4957c7bd99d70cc206df0bcd2cd1cef978c92065af9f6a099ad5364c1272c001\" returns successfully" Sep 12 00:25:53.223063 containerd[1555]: time="2025-09-12T00:25:53.223018443Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-k4hzw,Uid:37c292c9-46ba-40a4-8a54-9013b4dfffbd,Namespace:tigera-operator,Attempt:0,}" Sep 12 00:25:53.242894 containerd[1555]: time="2025-09-12T00:25:53.242847105Z" level=info msg="connecting to shim e287b72e8aebea2da4ba034917ffb853d4c28cb29679dfcab725d96864fd5590" address="unix:///run/containerd/s/6284ebaaaf7167cf0cebf9ab22e6d4ffe7901a397aaeb9bab0a0080e7e5ed5af" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:25:53.265881 systemd[1]: Started cri-containerd-e287b72e8aebea2da4ba034917ffb853d4c28cb29679dfcab725d96864fd5590.scope - libcontainer container e287b72e8aebea2da4ba034917ffb853d4c28cb29679dfcab725d96864fd5590. Sep 12 00:25:53.310049 containerd[1555]: time="2025-09-12T00:25:53.309759843Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-k4hzw,Uid:37c292c9-46ba-40a4-8a54-9013b4dfffbd,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e287b72e8aebea2da4ba034917ffb853d4c28cb29679dfcab725d96864fd5590\"" Sep 12 00:25:53.311654 containerd[1555]: time="2025-09-12T00:25:53.311623025Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 00:25:53.835117 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2300117975.mount: Deactivated successfully. Sep 12 00:25:54.848987 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1033818935.mount: Deactivated successfully. 
Sep 12 00:25:56.574997 kubelet[2697]: I0912 00:25:56.574858 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-t6lgz" podStartSLOduration=4.574839068 podStartE2EDuration="4.574839068s" podCreationTimestamp="2025-09-12 00:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:25:53.631826222 +0000 UTC m=+6.325092571" watchObservedRunningTime="2025-09-12 00:25:56.574839068 +0000 UTC m=+9.268105417" Sep 12 00:25:57.458588 containerd[1555]: time="2025-09-12T00:25:57.458531412Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:25:57.459445 containerd[1555]: time="2025-09-12T00:25:57.459397868Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 12 00:25:57.460569 containerd[1555]: time="2025-09-12T00:25:57.460543654Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:25:57.462690 containerd[1555]: time="2025-09-12T00:25:57.462665715Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:25:57.463194 containerd[1555]: time="2025-09-12T00:25:57.463163150Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 4.151514226s" Sep 12 00:25:57.463230 containerd[1555]: time="2025-09-12T00:25:57.463196373Z" 
level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 12 00:25:57.464876 containerd[1555]: time="2025-09-12T00:25:57.464837550Z" level=info msg="CreateContainer within sandbox \"e287b72e8aebea2da4ba034917ffb853d4c28cb29679dfcab725d96864fd5590\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 00:25:57.473263 containerd[1555]: time="2025-09-12T00:25:57.473216444Z" level=info msg="Container 9a43fd049706c5d342177cdb8b268efe6d06023ece8fe8e5a32ad21779f1dd82: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:25:57.479335 containerd[1555]: time="2025-09-12T00:25:57.479307663Z" level=info msg="CreateContainer within sandbox \"e287b72e8aebea2da4ba034917ffb853d4c28cb29679dfcab725d96864fd5590\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"9a43fd049706c5d342177cdb8b268efe6d06023ece8fe8e5a32ad21779f1dd82\"" Sep 12 00:25:57.479733 containerd[1555]: time="2025-09-12T00:25:57.479672386Z" level=info msg="StartContainer for \"9a43fd049706c5d342177cdb8b268efe6d06023ece8fe8e5a32ad21779f1dd82\"" Sep 12 00:25:57.480450 containerd[1555]: time="2025-09-12T00:25:57.480418893Z" level=info msg="connecting to shim 9a43fd049706c5d342177cdb8b268efe6d06023ece8fe8e5a32ad21779f1dd82" address="unix:///run/containerd/s/6284ebaaaf7167cf0cebf9ab22e6d4ffe7901a397aaeb9bab0a0080e7e5ed5af" protocol=ttrpc version=3 Sep 12 00:25:57.534846 systemd[1]: Started cri-containerd-9a43fd049706c5d342177cdb8b268efe6d06023ece8fe8e5a32ad21779f1dd82.scope - libcontainer container 9a43fd049706c5d342177cdb8b268efe6d06023ece8fe8e5a32ad21779f1dd82. 
Sep 12 00:25:57.560925 containerd[1555]: time="2025-09-12T00:25:57.560894051Z" level=info msg="StartContainer for \"9a43fd049706c5d342177cdb8b268efe6d06023ece8fe8e5a32ad21779f1dd82\" returns successfully" Sep 12 00:25:57.645569 kubelet[2697]: I0912 00:25:57.645515 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-k4hzw" podStartSLOduration=1.492902497 podStartE2EDuration="5.645495024s" podCreationTimestamp="2025-09-12 00:25:52 +0000 UTC" firstStartedPulling="2025-09-12 00:25:53.311125977 +0000 UTC m=+6.004392326" lastFinishedPulling="2025-09-12 00:25:57.463718504 +0000 UTC m=+10.156984853" observedRunningTime="2025-09-12 00:25:57.637389559 +0000 UTC m=+10.330655908" watchObservedRunningTime="2025-09-12 00:25:57.645495024 +0000 UTC m=+10.338761373" Sep 12 00:26:01.439910 update_engine[1547]: I20250912 00:26:01.438944 1547 update_attempter.cc:509] Updating boot flags... Sep 12 00:26:03.806741 sudo[1774]: pam_unix(sudo:session): session closed for user root Sep 12 00:26:03.810748 sshd[1773]: Connection closed by 10.0.0.1 port 42638 Sep 12 00:26:03.810019 sshd-session[1771]: pam_unix(sshd:session): session closed for user core Sep 12 00:26:03.820812 systemd-logind[1546]: Session 7 logged out. Waiting for processes to exit. Sep 12 00:26:03.821198 systemd[1]: sshd@6-10.0.0.151:22-10.0.0.1:42638.service: Deactivated successfully. Sep 12 00:26:03.834414 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 00:26:03.836192 systemd[1]: session-7.scope: Consumed 5.923s CPU time, 226.9M memory peak. Sep 12 00:26:03.842184 systemd-logind[1546]: Removed session 7. Sep 12 00:26:07.378047 systemd[1]: Created slice kubepods-besteffort-pod283b2d47_2f2c_450b_a7bd_9a5223962220.slice - libcontainer container kubepods-besteffort-pod283b2d47_2f2c_450b_a7bd_9a5223962220.slice. 
Sep 12 00:26:07.438913 kubelet[2697]: I0912 00:26:07.438816 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/283b2d47-2f2c-450b-a7bd-9a5223962220-tigera-ca-bundle\") pod \"calico-typha-54cbb98bd9-fgjvn\" (UID: \"283b2d47-2f2c-450b-a7bd-9a5223962220\") " pod="calico-system/calico-typha-54cbb98bd9-fgjvn" Sep 12 00:26:07.439623 kubelet[2697]: I0912 00:26:07.439487 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/283b2d47-2f2c-450b-a7bd-9a5223962220-typha-certs\") pod \"calico-typha-54cbb98bd9-fgjvn\" (UID: \"283b2d47-2f2c-450b-a7bd-9a5223962220\") " pod="calico-system/calico-typha-54cbb98bd9-fgjvn" Sep 12 00:26:07.439623 kubelet[2697]: I0912 00:26:07.439576 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9sfs\" (UniqueName: \"kubernetes.io/projected/283b2d47-2f2c-450b-a7bd-9a5223962220-kube-api-access-w9sfs\") pod \"calico-typha-54cbb98bd9-fgjvn\" (UID: \"283b2d47-2f2c-450b-a7bd-9a5223962220\") " pod="calico-system/calico-typha-54cbb98bd9-fgjvn" Sep 12 00:26:07.694031 containerd[1555]: time="2025-09-12T00:26:07.692109759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54cbb98bd9-fgjvn,Uid:283b2d47-2f2c-450b-a7bd-9a5223962220,Namespace:calico-system,Attempt:0,}" Sep 12 00:26:07.768585 systemd[1]: Created slice kubepods-besteffort-pod832f3bf8_ac7d_4148_bf9d_08caf42d92f8.slice - libcontainer container kubepods-besteffort-pod832f3bf8_ac7d_4148_bf9d_08caf42d92f8.slice. 
Sep 12 00:26:07.800419 containerd[1555]: time="2025-09-12T00:26:07.800358726Z" level=info msg="connecting to shim fb3ebe475fc3ae79343565a9d9baef5f4c5a6828963f2c1fada26d4ebf8cd562" address="unix:///run/containerd/s/497765bdceb8c1107740aa02dcf7e6c3ccd0907de13427b9fe9b4718ddc99ff7" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:26:07.844017 kubelet[2697]: I0912 00:26:07.843904 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/832f3bf8-ac7d-4148-bf9d-08caf42d92f8-flexvol-driver-host\") pod \"calico-node-lph6b\" (UID: \"832f3bf8-ac7d-4148-bf9d-08caf42d92f8\") " pod="calico-system/calico-node-lph6b" Sep 12 00:26:07.844253 kubelet[2697]: I0912 00:26:07.844208 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/832f3bf8-ac7d-4148-bf9d-08caf42d92f8-var-run-calico\") pod \"calico-node-lph6b\" (UID: \"832f3bf8-ac7d-4148-bf9d-08caf42d92f8\") " pod="calico-system/calico-node-lph6b" Sep 12 00:26:07.844891 kubelet[2697]: I0912 00:26:07.844840 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/832f3bf8-ac7d-4148-bf9d-08caf42d92f8-policysync\") pod \"calico-node-lph6b\" (UID: \"832f3bf8-ac7d-4148-bf9d-08caf42d92f8\") " pod="calico-system/calico-node-lph6b" Sep 12 00:26:07.845047 kubelet[2697]: I0912 00:26:07.844886 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blwk4\" (UniqueName: \"kubernetes.io/projected/832f3bf8-ac7d-4148-bf9d-08caf42d92f8-kube-api-access-blwk4\") pod \"calico-node-lph6b\" (UID: \"832f3bf8-ac7d-4148-bf9d-08caf42d92f8\") " pod="calico-system/calico-node-lph6b" Sep 12 00:26:07.845095 kubelet[2697]: I0912 00:26:07.845048 2697 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/832f3bf8-ac7d-4148-bf9d-08caf42d92f8-cni-log-dir\") pod \"calico-node-lph6b\" (UID: \"832f3bf8-ac7d-4148-bf9d-08caf42d92f8\") " pod="calico-system/calico-node-lph6b" Sep 12 00:26:07.845515 kubelet[2697]: I0912 00:26:07.845455 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/832f3bf8-ac7d-4148-bf9d-08caf42d92f8-cni-net-dir\") pod \"calico-node-lph6b\" (UID: \"832f3bf8-ac7d-4148-bf9d-08caf42d92f8\") " pod="calico-system/calico-node-lph6b" Sep 12 00:26:07.845582 kubelet[2697]: I0912 00:26:07.845541 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/832f3bf8-ac7d-4148-bf9d-08caf42d92f8-cni-bin-dir\") pod \"calico-node-lph6b\" (UID: \"832f3bf8-ac7d-4148-bf9d-08caf42d92f8\") " pod="calico-system/calico-node-lph6b" Sep 12 00:26:07.845762 kubelet[2697]: I0912 00:26:07.845586 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/832f3bf8-ac7d-4148-bf9d-08caf42d92f8-lib-modules\") pod \"calico-node-lph6b\" (UID: \"832f3bf8-ac7d-4148-bf9d-08caf42d92f8\") " pod="calico-system/calico-node-lph6b" Sep 12 00:26:07.845829 kubelet[2697]: I0912 00:26:07.845774 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/832f3bf8-ac7d-4148-bf9d-08caf42d92f8-node-certs\") pod \"calico-node-lph6b\" (UID: \"832f3bf8-ac7d-4148-bf9d-08caf42d92f8\") " pod="calico-system/calico-node-lph6b" Sep 12 00:26:07.845829 kubelet[2697]: I0912 00:26:07.845797 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/832f3bf8-ac7d-4148-bf9d-08caf42d92f8-tigera-ca-bundle\") pod \"calico-node-lph6b\" (UID: \"832f3bf8-ac7d-4148-bf9d-08caf42d92f8\") " pod="calico-system/calico-node-lph6b" Sep 12 00:26:07.845829 kubelet[2697]: I0912 00:26:07.845821 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/832f3bf8-ac7d-4148-bf9d-08caf42d92f8-var-lib-calico\") pod \"calico-node-lph6b\" (UID: \"832f3bf8-ac7d-4148-bf9d-08caf42d92f8\") " pod="calico-system/calico-node-lph6b" Sep 12 00:26:07.845925 kubelet[2697]: I0912 00:26:07.845843 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/832f3bf8-ac7d-4148-bf9d-08caf42d92f8-xtables-lock\") pod \"calico-node-lph6b\" (UID: \"832f3bf8-ac7d-4148-bf9d-08caf42d92f8\") " pod="calico-system/calico-node-lph6b" Sep 12 00:26:07.848035 systemd[1]: Started cri-containerd-fb3ebe475fc3ae79343565a9d9baef5f4c5a6828963f2c1fada26d4ebf8cd562.scope - libcontainer container fb3ebe475fc3ae79343565a9d9baef5f4c5a6828963f2c1fada26d4ebf8cd562. Sep 12 00:26:07.957850 kubelet[2697]: E0912 00:26:07.953415 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:07.958241 kubelet[2697]: W0912 00:26:07.958132 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:07.958446 kubelet[2697]: E0912 00:26:07.958347 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:07.964842 kubelet[2697]: E0912 00:26:07.964667 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:07.965253 kubelet[2697]: W0912 00:26:07.965012 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:07.965253 kubelet[2697]: E0912 00:26:07.965051 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:07.966729 containerd[1555]: time="2025-09-12T00:26:07.965984766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54cbb98bd9-fgjvn,Uid:283b2d47-2f2c-450b-a7bd-9a5223962220,Namespace:calico-system,Attempt:0,} returns sandbox id \"fb3ebe475fc3ae79343565a9d9baef5f4c5a6828963f2c1fada26d4ebf8cd562\"" Sep 12 00:26:07.970018 containerd[1555]: time="2025-09-12T00:26:07.969972865Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 00:26:07.971049 kubelet[2697]: E0912 00:26:07.971015 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:07.971327 kubelet[2697]: W0912 00:26:07.971231 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:07.971327 kubelet[2697]: E0912 00:26:07.971283 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.032349 kubelet[2697]: E0912 00:26:08.030569 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bkq9f" podUID="87a73b77-6f47-40e3-9295-51af59f40cde" Sep 12 00:26:08.077453 containerd[1555]: time="2025-09-12T00:26:08.077337853Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-lph6b,Uid:832f3bf8-ac7d-4148-bf9d-08caf42d92f8,Namespace:calico-system,Attempt:0,}" Sep 12 00:26:08.133262 kubelet[2697]: E0912 00:26:08.133189 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.133623 kubelet[2697]: W0912 00:26:08.133472 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.133623 kubelet[2697]: E0912 00:26:08.133519 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.136617 kubelet[2697]: E0912 00:26:08.136428 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.136617 kubelet[2697]: W0912 00:26:08.136453 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.136617 kubelet[2697]: E0912 00:26:08.136481 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.137017 kubelet[2697]: E0912 00:26:08.136984 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.137017 kubelet[2697]: W0912 00:26:08.136999 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.137017 kubelet[2697]: E0912 00:26:08.137012 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.137484 kubelet[2697]: E0912 00:26:08.137294 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.137484 kubelet[2697]: W0912 00:26:08.137306 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.137484 kubelet[2697]: E0912 00:26:08.137318 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.138616 kubelet[2697]: E0912 00:26:08.137558 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.138616 kubelet[2697]: W0912 00:26:08.137568 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.138616 kubelet[2697]: E0912 00:26:08.137579 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.138616 kubelet[2697]: E0912 00:26:08.137809 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.138616 kubelet[2697]: W0912 00:26:08.137819 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.138616 kubelet[2697]: E0912 00:26:08.137831 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.138616 kubelet[2697]: E0912 00:26:08.138025 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.138616 kubelet[2697]: W0912 00:26:08.138037 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.138616 kubelet[2697]: E0912 00:26:08.138050 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.138616 kubelet[2697]: E0912 00:26:08.138260 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.138971 kubelet[2697]: W0912 00:26:08.138269 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.138971 kubelet[2697]: E0912 00:26:08.138280 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.138971 kubelet[2697]: E0912 00:26:08.138492 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.138971 kubelet[2697]: W0912 00:26:08.138501 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.138971 kubelet[2697]: E0912 00:26:08.138512 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.138971 kubelet[2697]: E0912 00:26:08.138688 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.138971 kubelet[2697]: W0912 00:26:08.138722 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.138971 kubelet[2697]: E0912 00:26:08.138734 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.138971 kubelet[2697]: E0912 00:26:08.138929 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.138971 kubelet[2697]: W0912 00:26:08.138939 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.139312 kubelet[2697]: E0912 00:26:08.138950 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.139312 kubelet[2697]: E0912 00:26:08.139171 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.139312 kubelet[2697]: W0912 00:26:08.139185 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.139312 kubelet[2697]: E0912 00:26:08.139201 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.139459 kubelet[2697]: E0912 00:26:08.139426 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.139459 kubelet[2697]: W0912 00:26:08.139450 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.139540 kubelet[2697]: E0912 00:26:08.139462 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.139689 kubelet[2697]: E0912 00:26:08.139662 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.139689 kubelet[2697]: W0912 00:26:08.139682 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.144849 kubelet[2697]: E0912 00:26:08.139972 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.144849 kubelet[2697]: E0912 00:26:08.144093 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.152852 kubelet[2697]: W0912 00:26:08.147640 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.152852 kubelet[2697]: E0912 00:26:08.147687 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.152852 kubelet[2697]: E0912 00:26:08.148090 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.152852 kubelet[2697]: W0912 00:26:08.148103 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.152852 kubelet[2697]: E0912 00:26:08.148116 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.152852 kubelet[2697]: E0912 00:26:08.149884 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.152852 kubelet[2697]: W0912 00:26:08.149900 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.152852 kubelet[2697]: E0912 00:26:08.149917 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.152852 kubelet[2697]: E0912 00:26:08.150171 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.152852 kubelet[2697]: W0912 00:26:08.150182 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.153323 kubelet[2697]: E0912 00:26:08.150194 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.153323 kubelet[2697]: E0912 00:26:08.150402 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.153323 kubelet[2697]: W0912 00:26:08.150415 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.153323 kubelet[2697]: E0912 00:26:08.150427 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.153323 kubelet[2697]: E0912 00:26:08.150650 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.153323 kubelet[2697]: W0912 00:26:08.150660 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.153323 kubelet[2697]: E0912 00:26:08.150673 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.155752 kubelet[2697]: E0912 00:26:08.155585 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.155752 kubelet[2697]: W0912 00:26:08.155618 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.155752 kubelet[2697]: E0912 00:26:08.155645 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.155752 kubelet[2697]: I0912 00:26:08.155692 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/87a73b77-6f47-40e3-9295-51af59f40cde-registration-dir\") pod \"csi-node-driver-bkq9f\" (UID: \"87a73b77-6f47-40e3-9295-51af59f40cde\") " pod="calico-system/csi-node-driver-bkq9f" Sep 12 00:26:08.156122 kubelet[2697]: E0912 00:26:08.156087 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.156122 kubelet[2697]: W0912 00:26:08.156110 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.156269 kubelet[2697]: E0912 00:26:08.156146 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.156269 kubelet[2697]: I0912 00:26:08.156170 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/87a73b77-6f47-40e3-9295-51af59f40cde-socket-dir\") pod \"csi-node-driver-bkq9f\" (UID: \"87a73b77-6f47-40e3-9295-51af59f40cde\") " pod="calico-system/csi-node-driver-bkq9f" Sep 12 00:26:08.156636 kubelet[2697]: E0912 00:26:08.156544 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.156636 kubelet[2697]: W0912 00:26:08.156563 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.156636 kubelet[2697]: E0912 00:26:08.156598 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.156636 kubelet[2697]: I0912 00:26:08.156619 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87a73b77-6f47-40e3-9295-51af59f40cde-kubelet-dir\") pod \"csi-node-driver-bkq9f\" (UID: \"87a73b77-6f47-40e3-9295-51af59f40cde\") " pod="calico-system/csi-node-driver-bkq9f" Sep 12 00:26:08.157120 kubelet[2697]: E0912 00:26:08.156972 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.157120 kubelet[2697]: W0912 00:26:08.157019 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.157120 kubelet[2697]: E0912 00:26:08.157043 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.157222 kubelet[2697]: I0912 00:26:08.157139 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p662d\" (UniqueName: \"kubernetes.io/projected/87a73b77-6f47-40e3-9295-51af59f40cde-kube-api-access-p662d\") pod \"csi-node-driver-bkq9f\" (UID: \"87a73b77-6f47-40e3-9295-51af59f40cde\") " pod="calico-system/csi-node-driver-bkq9f" Sep 12 00:26:08.157472 kubelet[2697]: E0912 00:26:08.157429 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.157472 kubelet[2697]: W0912 00:26:08.157461 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.157567 kubelet[2697]: E0912 00:26:08.157496 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.157925 kubelet[2697]: E0912 00:26:08.157905 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.157925 kubelet[2697]: W0912 00:26:08.157919 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.160800 kubelet[2697]: E0912 00:26:08.158484 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.161218 kubelet[2697]: E0912 00:26:08.161163 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.161218 kubelet[2697]: W0912 00:26:08.161190 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.161354 kubelet[2697]: E0912 00:26:08.161336 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.161641 kubelet[2697]: E0912 00:26:08.161617 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.161641 kubelet[2697]: W0912 00:26:08.161635 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.161774 kubelet[2697]: E0912 00:26:08.161654 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.161910 kubelet[2697]: E0912 00:26:08.161888 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.161910 kubelet[2697]: W0912 00:26:08.161904 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.161996 kubelet[2697]: E0912 00:26:08.161932 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.162153 kubelet[2697]: E0912 00:26:08.162129 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.162153 kubelet[2697]: W0912 00:26:08.162146 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.162232 kubelet[2697]: E0912 00:26:08.162174 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.162375 kubelet[2697]: I0912 00:26:08.162208 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/87a73b77-6f47-40e3-9295-51af59f40cde-varrun\") pod \"csi-node-driver-bkq9f\" (UID: \"87a73b77-6f47-40e3-9295-51af59f40cde\") " pod="calico-system/csi-node-driver-bkq9f" Sep 12 00:26:08.162452 kubelet[2697]: E0912 00:26:08.162427 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.162452 kubelet[2697]: W0912 00:26:08.162443 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.162530 kubelet[2697]: E0912 00:26:08.162454 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.162899 containerd[1555]: time="2025-09-12T00:26:08.162811497Z" level=info msg="connecting to shim 8b0276a3c1a67e73e8db6c93b0f542b6b4cfc3b52c0e12d43f961ed69fe35de4" address="unix:///run/containerd/s/f5867db5a17b589d721f9fb79fac0659e3cca05a6549d2c6868904ac498f1b71" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:26:08.163127 kubelet[2697]: E0912 00:26:08.163098 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.163127 kubelet[2697]: W0912 00:26:08.163116 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.163199 kubelet[2697]: E0912 00:26:08.163133 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.163410 kubelet[2697]: E0912 00:26:08.163384 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.163462 kubelet[2697]: W0912 00:26:08.163433 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.163462 kubelet[2697]: E0912 00:26:08.163446 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.164110 kubelet[2697]: E0912 00:26:08.164043 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.164110 kubelet[2697]: W0912 00:26:08.164086 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.164110 kubelet[2697]: E0912 00:26:08.164102 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.164731 kubelet[2697]: E0912 00:26:08.164673 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.164731 kubelet[2697]: W0912 00:26:08.164687 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.164841 kubelet[2697]: E0912 00:26:08.164771 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.229489 systemd[1]: Started cri-containerd-8b0276a3c1a67e73e8db6c93b0f542b6b4cfc3b52c0e12d43f961ed69fe35de4.scope - libcontainer container 8b0276a3c1a67e73e8db6c93b0f542b6b4cfc3b52c0e12d43f961ed69fe35de4. 
Sep 12 00:26:08.265970 kubelet[2697]: E0912 00:26:08.265924 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.274529 kubelet[2697]: W0912 00:26:08.268985 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.274529 kubelet[2697]: E0912 00:26:08.269078 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.274529 kubelet[2697]: E0912 00:26:08.269839 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.274529 kubelet[2697]: W0912 00:26:08.269875 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.274529 kubelet[2697]: E0912 00:26:08.270585 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.274529 kubelet[2697]: E0912 00:26:08.272929 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.274529 kubelet[2697]: W0912 00:26:08.272987 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.274529 kubelet[2697]: E0912 00:26:08.273154 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.274529 kubelet[2697]: E0912 00:26:08.273462 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.274529 kubelet[2697]: W0912 00:26:08.273475 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.274987 containerd[1555]: time="2025-09-12T00:26:08.271033707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-lph6b,Uid:832f3bf8-ac7d-4148-bf9d-08caf42d92f8,Namespace:calico-system,Attempt:0,} returns sandbox id \"8b0276a3c1a67e73e8db6c93b0f542b6b4cfc3b52c0e12d43f961ed69fe35de4\"" Sep 12 00:26:08.275047 kubelet[2697]: E0912 00:26:08.273495 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.275047 kubelet[2697]: E0912 00:26:08.274148 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.275047 kubelet[2697]: W0912 00:26:08.274161 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.275047 kubelet[2697]: E0912 00:26:08.274175 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.275810 kubelet[2697]: E0912 00:26:08.275502 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.275810 kubelet[2697]: W0912 00:26:08.275519 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.275810 kubelet[2697]: E0912 00:26:08.275613 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.277794 kubelet[2697]: E0912 00:26:08.277664 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.277794 kubelet[2697]: W0912 00:26:08.277739 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.277794 kubelet[2697]: E0912 00:26:08.277780 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.278870 kubelet[2697]: E0912 00:26:08.278816 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.278870 kubelet[2697]: W0912 00:26:08.278836 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.279098 kubelet[2697]: E0912 00:26:08.278968 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.279098 kubelet[2697]: E0912 00:26:08.279065 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.279098 kubelet[2697]: W0912 00:26:08.279074 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.279515 kubelet[2697]: E0912 00:26:08.279458 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.280376 kubelet[2697]: E0912 00:26:08.280133 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.280448 kubelet[2697]: W0912 00:26:08.280393 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.281088 kubelet[2697]: E0912 00:26:08.280491 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.283052 kubelet[2697]: E0912 00:26:08.283015 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.283052 kubelet[2697]: W0912 00:26:08.283037 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.283205 kubelet[2697]: E0912 00:26:08.283142 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.284335 kubelet[2697]: E0912 00:26:08.284285 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.284335 kubelet[2697]: W0912 00:26:08.284305 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.284563 kubelet[2697]: E0912 00:26:08.284383 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.284563 kubelet[2697]: E0912 00:26:08.284547 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.284563 kubelet[2697]: W0912 00:26:08.284558 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.284662 kubelet[2697]: E0912 00:26:08.284613 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.287087 kubelet[2697]: E0912 00:26:08.284815 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.287087 kubelet[2697]: W0912 00:26:08.284828 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.287087 kubelet[2697]: E0912 00:26:08.284872 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.287087 kubelet[2697]: E0912 00:26:08.285092 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.287087 kubelet[2697]: W0912 00:26:08.285104 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.287087 kubelet[2697]: E0912 00:26:08.285185 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.287087 kubelet[2697]: E0912 00:26:08.285376 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.287087 kubelet[2697]: W0912 00:26:08.285393 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.287087 kubelet[2697]: E0912 00:26:08.285504 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.287087 kubelet[2697]: E0912 00:26:08.285681 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.288453 kubelet[2697]: W0912 00:26:08.285719 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.288453 kubelet[2697]: E0912 00:26:08.285735 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.288453 kubelet[2697]: E0912 00:26:08.286045 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.288453 kubelet[2697]: W0912 00:26:08.286056 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.288453 kubelet[2697]: E0912 00:26:08.286136 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.288453 kubelet[2697]: E0912 00:26:08.287641 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.288453 kubelet[2697]: W0912 00:26:08.287658 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.288453 kubelet[2697]: E0912 00:26:08.288249 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.290204 kubelet[2697]: E0912 00:26:08.289660 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.290204 kubelet[2697]: W0912 00:26:08.289691 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.290204 kubelet[2697]: E0912 00:26:08.289853 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.290553 kubelet[2697]: E0912 00:26:08.290515 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.290553 kubelet[2697]: W0912 00:26:08.290533 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.292690 kubelet[2697]: E0912 00:26:08.292396 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.294364 kubelet[2697]: E0912 00:26:08.293985 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.294364 kubelet[2697]: W0912 00:26:08.294014 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.294364 kubelet[2697]: E0912 00:26:08.294243 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.296121 kubelet[2697]: E0912 00:26:08.296101 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.296248 kubelet[2697]: W0912 00:26:08.296228 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.296415 kubelet[2697]: E0912 00:26:08.296378 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:08.297282 kubelet[2697]: E0912 00:26:08.297225 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.297618 kubelet[2697]: W0912 00:26:08.297598 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.297830 kubelet[2697]: E0912 00:26:08.297761 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:08.298271 kubelet[2697]: E0912 00:26:08.298246 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:08.298369 kubelet[2697]: W0912 00:26:08.298353 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:08.298456 kubelet[2697]: E0912 00:26:08.298438 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" 
Sep 12 00:26:08.298943 kubelet[2697]: E0912 00:26:08.298910 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input 
Sep 12 00:26:08.298943 kubelet[2697]: W0912 00:26:08.298940 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" 
Sep 12 00:26:08.299037 kubelet[2697]: E0912 00:26:08.298966 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" 
Sep 12 00:26:09.589818 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1732892656.mount: Deactivated successfully. 
Sep 12 00:26:09.602313 kubelet[2697]: E0912 00:26:09.602226 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bkq9f" podUID="87a73b77-6f47-40e3-9295-51af59f40cde" 
Sep 12 00:26:11.292368 containerd[1555]: time="2025-09-12T00:26:11.292288261Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" 
Sep 12 00:26:11.293491 containerd[1555]: time="2025-09-12T00:26:11.293436264Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" 
Sep 12 00:26:11.295437 containerd[1555]: time="2025-09-12T00:26:11.295378104Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" 
Sep 12 00:26:11.301459 containerd[1555]: time="2025-09-12T00:26:11.301361732Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" 
Sep 12 00:26:11.302010 containerd[1555]: time="2025-09-12T00:26:11.301802222Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 3.33159235s" 
Sep 12 00:26:11.302010 containerd[1555]: time="2025-09-12T00:26:11.301932468Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" 
Sep 12 00:26:11.306958 containerd[1555]: time="2025-09-12T00:26:11.306769004Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" 
Sep 12 00:26:11.323734 containerd[1555]: time="2025-09-12T00:26:11.323215985Z" level=info msg="CreateContainer within sandbox \"fb3ebe475fc3ae79343565a9d9baef5f4c5a6828963f2c1fada26d4ebf8cd562\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" 
Sep 12 00:26:11.348545 containerd[1555]: time="2025-09-12T00:26:11.348467009Z" level=info msg="Container 7434a007c9a52cca5f0d9a4dd7e65e2418c47fe319db1a2c09673dee8b9b2fc4: CDI devices from CRI Config.CDIDevices: []" 
Sep 12 00:26:11.352536 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount742496437.mount: Deactivated successfully. 
Sep 12 00:26:11.366403 containerd[1555]: time="2025-09-12T00:26:11.365682598Z" level=info msg="CreateContainer within sandbox \"fb3ebe475fc3ae79343565a9d9baef5f4c5a6828963f2c1fada26d4ebf8cd562\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"7434a007c9a52cca5f0d9a4dd7e65e2418c47fe319db1a2c09673dee8b9b2fc4\"" 
Sep 12 00:26:11.366606 containerd[1555]: time="2025-09-12T00:26:11.366434154Z" level=info msg="StartContainer for \"7434a007c9a52cca5f0d9a4dd7e65e2418c47fe319db1a2c09673dee8b9b2fc4\"" 
Sep 12 00:26:11.367965 containerd[1555]: time="2025-09-12T00:26:11.367898394Z" level=info msg="connecting to shim 7434a007c9a52cca5f0d9a4dd7e65e2418c47fe319db1a2c09673dee8b9b2fc4" address="unix:///run/containerd/s/497765bdceb8c1107740aa02dcf7e6c3ccd0907de13427b9fe9b4718ddc99ff7" protocol=ttrpc version=3 
Sep 12 00:26:11.411089 systemd[1]: Started cri-containerd-7434a007c9a52cca5f0d9a4dd7e65e2418c47fe319db1a2c09673dee8b9b2fc4.scope - libcontainer container 7434a007c9a52cca5f0d9a4dd7e65e2418c47fe319db1a2c09673dee8b9b2fc4. 
Sep 12 00:26:11.490348 containerd[1555]: time="2025-09-12T00:26:11.490284403Z" level=info msg="StartContainer for \"7434a007c9a52cca5f0d9a4dd7e65e2418c47fe319db1a2c09673dee8b9b2fc4\" returns successfully" 
Sep 12 00:26:11.602399 kubelet[2697]: E0912 00:26:11.601603 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bkq9f" podUID="87a73b77-6f47-40e3-9295-51af59f40cde" 
Sep 12 00:26:11.686966 kubelet[2697]: E0912 00:26:11.686898 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input 
Sep 12 00:26:11.686966 kubelet[2697]: W0912 00:26:11.686928 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" 
Sep 12 00:26:11.687325 kubelet[2697]: E0912 00:26:11.687195 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" 
Sep 12 00:26:11.687537 kubelet[2697]: E0912 00:26:11.687526 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input 
Sep 12 00:26:11.687614 kubelet[2697]: W0912 00:26:11.687603 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" 
Sep 12 00:26:11.688800 kubelet[2697]: E0912 00:26:11.688762 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.690996 kubelet[2697]: E0912 00:26:11.690942 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.690996 kubelet[2697]: W0912 00:26:11.690966 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.691815 kubelet[2697]: E0912 00:26:11.691764 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.693720 kubelet[2697]: E0912 00:26:11.692350 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.693720 kubelet[2697]: W0912 00:26:11.692364 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.693720 kubelet[2697]: E0912 00:26:11.692378 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.694909 kubelet[2697]: E0912 00:26:11.694891 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.695062 kubelet[2697]: W0912 00:26:11.694993 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.695062 kubelet[2697]: E0912 00:26:11.695019 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.695441 kubelet[2697]: E0912 00:26:11.695377 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.695441 kubelet[2697]: W0912 00:26:11.695389 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.695441 kubelet[2697]: E0912 00:26:11.695400 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.695843 kubelet[2697]: E0912 00:26:11.695773 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.695843 kubelet[2697]: W0912 00:26:11.695784 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.695843 kubelet[2697]: E0912 00:26:11.695797 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.696275 kubelet[2697]: E0912 00:26:11.696191 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.696275 kubelet[2697]: W0912 00:26:11.696204 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.696275 kubelet[2697]: E0912 00:26:11.696215 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.697164 kubelet[2697]: E0912 00:26:11.697084 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.697164 kubelet[2697]: W0912 00:26:11.697105 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.697164 kubelet[2697]: E0912 00:26:11.697117 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.697543 kubelet[2697]: E0912 00:26:11.697480 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.697543 kubelet[2697]: W0912 00:26:11.697492 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.697543 kubelet[2697]: E0912 00:26:11.697503 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.699981 kubelet[2697]: E0912 00:26:11.699951 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.700227 kubelet[2697]: W0912 00:26:11.700128 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.700227 kubelet[2697]: E0912 00:26:11.700158 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.700676 kubelet[2697]: E0912 00:26:11.700581 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.700676 kubelet[2697]: W0912 00:26:11.700593 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.700676 kubelet[2697]: E0912 00:26:11.700608 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.701070 kubelet[2697]: E0912 00:26:11.701007 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.701070 kubelet[2697]: W0912 00:26:11.701018 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.701070 kubelet[2697]: E0912 00:26:11.701029 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.702884 kubelet[2697]: E0912 00:26:11.702763 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.702884 kubelet[2697]: W0912 00:26:11.702788 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.702884 kubelet[2697]: E0912 00:26:11.702821 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.704043 kubelet[2697]: E0912 00:26:11.703929 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.704043 kubelet[2697]: W0912 00:26:11.703948 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.704043 kubelet[2697]: E0912 00:26:11.703963 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.720391 kubelet[2697]: E0912 00:26:11.720260 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.720391 kubelet[2697]: W0912 00:26:11.720304 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.720391 kubelet[2697]: E0912 00:26:11.720337 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.720967 kubelet[2697]: E0912 00:26:11.720840 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.720967 kubelet[2697]: W0912 00:26:11.720851 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.720967 kubelet[2697]: E0912 00:26:11.720865 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.722642 kubelet[2697]: E0912 00:26:11.721184 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.722642 kubelet[2697]: W0912 00:26:11.721204 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.722642 kubelet[2697]: E0912 00:26:11.721217 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.723205 kubelet[2697]: E0912 00:26:11.723176 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.723332 kubelet[2697]: W0912 00:26:11.723301 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.723468 kubelet[2697]: E0912 00:26:11.723425 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.724037 kubelet[2697]: E0912 00:26:11.723991 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.724107 kubelet[2697]: W0912 00:26:11.724033 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.724107 kubelet[2697]: E0912 00:26:11.724082 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.724462 kubelet[2697]: E0912 00:26:11.724437 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.725313 kubelet[2697]: W0912 00:26:11.724940 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.725313 kubelet[2697]: E0912 00:26:11.725029 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.725840 kubelet[2697]: E0912 00:26:11.725765 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.726411 kubelet[2697]: W0912 00:26:11.726391 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.726679 kubelet[2697]: E0912 00:26:11.726619 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.728822 kubelet[2697]: E0912 00:26:11.728384 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.728822 kubelet[2697]: W0912 00:26:11.728400 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.729613 kubelet[2697]: E0912 00:26:11.728458 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.731745 kubelet[2697]: E0912 00:26:11.731030 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.731745 kubelet[2697]: W0912 00:26:11.731073 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.733985 kubelet[2697]: E0912 00:26:11.733806 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.733985 kubelet[2697]: W0912 00:26:11.733833 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.734869 kubelet[2697]: E0912 00:26:11.734845 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.735005 kubelet[2697]: E0912 00:26:11.734964 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.745917 kubelet[2697]: E0912 00:26:11.745873 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.746111 kubelet[2697]: W0912 00:26:11.746088 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.746215 kubelet[2697]: E0912 00:26:11.746197 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.746716 kubelet[2697]: E0912 00:26:11.746682 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.746850 kubelet[2697]: W0912 00:26:11.746825 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.747299 kubelet[2697]: E0912 00:26:11.746910 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.748827 kubelet[2697]: E0912 00:26:11.748808 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.748933 kubelet[2697]: W0912 00:26:11.748911 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.749156 kubelet[2697]: E0912 00:26:11.749137 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.755819 kubelet[2697]: E0912 00:26:11.749268 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.756070 kubelet[2697]: W0912 00:26:11.756037 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.756713 kubelet[2697]: E0912 00:26:11.756655 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.757965 kubelet[2697]: E0912 00:26:11.757901 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.758022 kubelet[2697]: W0912 00:26:11.757965 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.758022 kubelet[2697]: E0912 00:26:11.758001 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.760735 kubelet[2697]: E0912 00:26:11.758423 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.760735 kubelet[2697]: W0912 00:26:11.758441 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.760735 kubelet[2697]: E0912 00:26:11.758498 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:11.762548 kubelet[2697]: E0912 00:26:11.762507 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.762548 kubelet[2697]: W0912 00:26:11.762539 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.763456 kubelet[2697]: E0912 00:26:11.763400 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:11.767945 kubelet[2697]: E0912 00:26:11.764952 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:11.767945 kubelet[2697]: W0912 00:26:11.767946 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:11.768181 kubelet[2697]: E0912 00:26:11.767985 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.680559 kubelet[2697]: I0912 00:26:12.680496 2697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 00:26:12.711943 kubelet[2697]: E0912 00:26:12.711480 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.711943 kubelet[2697]: W0912 00:26:12.711519 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.711943 kubelet[2697]: E0912 00:26:12.711546 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.712961 kubelet[2697]: E0912 00:26:12.712301 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.712961 kubelet[2697]: W0912 00:26:12.712319 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.712961 kubelet[2697]: E0912 00:26:12.712334 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.713448 kubelet[2697]: E0912 00:26:12.713195 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.713448 kubelet[2697]: W0912 00:26:12.713213 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.713448 kubelet[2697]: E0912 00:26:12.713227 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.720120 kubelet[2697]: E0912 00:26:12.713628 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.720120 kubelet[2697]: W0912 00:26:12.716704 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.720120 kubelet[2697]: E0912 00:26:12.719120 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.720120 kubelet[2697]: E0912 00:26:12.719637 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.720120 kubelet[2697]: W0912 00:26:12.719664 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.720120 kubelet[2697]: E0912 00:26:12.719678 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.721320 kubelet[2697]: E0912 00:26:12.720544 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.721320 kubelet[2697]: W0912 00:26:12.720555 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.721320 kubelet[2697]: E0912 00:26:12.720568 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.721856 kubelet[2697]: E0912 00:26:12.721685 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.721856 kubelet[2697]: W0912 00:26:12.721766 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.721856 kubelet[2697]: E0912 00:26:12.721799 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.722482 kubelet[2697]: E0912 00:26:12.722289 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.722482 kubelet[2697]: W0912 00:26:12.722301 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.722482 kubelet[2697]: E0912 00:26:12.722314 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.722795 kubelet[2697]: E0912 00:26:12.722632 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.722795 kubelet[2697]: W0912 00:26:12.722662 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.722795 kubelet[2697]: E0912 00:26:12.722675 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.723222 kubelet[2697]: E0912 00:26:12.722951 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.723222 kubelet[2697]: W0912 00:26:12.722975 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.723222 kubelet[2697]: E0912 00:26:12.722988 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.723353 kubelet[2697]: E0912 00:26:12.723235 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.723353 kubelet[2697]: W0912 00:26:12.723247 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.723353 kubelet[2697]: E0912 00:26:12.723262 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.723542 kubelet[2697]: E0912 00:26:12.723517 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.723542 kubelet[2697]: W0912 00:26:12.723528 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.723542 kubelet[2697]: E0912 00:26:12.723541 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.724123 kubelet[2697]: E0912 00:26:12.723980 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.724123 kubelet[2697]: W0912 00:26:12.723998 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.724123 kubelet[2697]: E0912 00:26:12.724010 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.724466 kubelet[2697]: E0912 00:26:12.724265 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.724466 kubelet[2697]: W0912 00:26:12.724276 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.724466 kubelet[2697]: E0912 00:26:12.724289 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.724588 kubelet[2697]: E0912 00:26:12.724546 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.724588 kubelet[2697]: W0912 00:26:12.724558 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.724588 kubelet[2697]: E0912 00:26:12.724570 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.731539 kubelet[2697]: E0912 00:26:12.730206 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.731539 kubelet[2697]: W0912 00:26:12.730239 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.731539 kubelet[2697]: E0912 00:26:12.730261 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.731539 kubelet[2697]: E0912 00:26:12.730603 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.731539 kubelet[2697]: W0912 00:26:12.730659 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.731539 kubelet[2697]: E0912 00:26:12.730685 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.731539 kubelet[2697]: E0912 00:26:12.731244 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.731539 kubelet[2697]: W0912 00:26:12.731274 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.731539 kubelet[2697]: E0912 00:26:12.731313 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.733427 kubelet[2697]: E0912 00:26:12.732861 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.733427 kubelet[2697]: W0912 00:26:12.732878 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.734039 kubelet[2697]: E0912 00:26:12.733873 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.734039 kubelet[2697]: E0912 00:26:12.734028 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.734039 kubelet[2697]: W0912 00:26:12.734038 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.734337 kubelet[2697]: E0912 00:26:12.734108 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.734337 kubelet[2697]: E0912 00:26:12.734272 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.734337 kubelet[2697]: W0912 00:26:12.734284 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.734580 kubelet[2697]: E0912 00:26:12.734379 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.736503 kubelet[2697]: E0912 00:26:12.736431 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.736589 kubelet[2697]: W0912 00:26:12.736501 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.736589 kubelet[2697]: E0912 00:26:12.736551 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.737541 kubelet[2697]: E0912 00:26:12.737295 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.737541 kubelet[2697]: W0912 00:26:12.737329 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.737541 kubelet[2697]: E0912 00:26:12.737409 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.737818 kubelet[2697]: E0912 00:26:12.737780 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.737818 kubelet[2697]: W0912 00:26:12.737796 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.738075 kubelet[2697]: E0912 00:26:12.738048 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.738075 kubelet[2697]: W0912 00:26:12.738073 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.738146 kubelet[2697]: E0912 00:26:12.738094 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.738289 kubelet[2697]: E0912 00:26:12.738186 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.739771 kubelet[2697]: E0912 00:26:12.738371 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.739771 kubelet[2697]: W0912 00:26:12.738384 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.739771 kubelet[2697]: E0912 00:26:12.738398 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.739771 kubelet[2697]: E0912 00:26:12.738664 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.739771 kubelet[2697]: W0912 00:26:12.738675 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.739771 kubelet[2697]: E0912 00:26:12.738687 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.739771 kubelet[2697]: E0912 00:26:12.739483 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.739771 kubelet[2697]: W0912 00:26:12.739492 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.739771 kubelet[2697]: E0912 00:26:12.739503 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.740090 kubelet[2697]: E0912 00:26:12.739811 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.740090 kubelet[2697]: W0912 00:26:12.739833 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.740090 kubelet[2697]: E0912 00:26:12.739865 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.740192 kubelet[2697]: E0912 00:26:12.740127 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.740192 kubelet[2697]: W0912 00:26:12.740155 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.740264 kubelet[2697]: E0912 00:26:12.740193 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.740739 kubelet[2697]: E0912 00:26:12.740629 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.740739 kubelet[2697]: W0912 00:26:12.740677 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.740739 kubelet[2697]: E0912 00:26:12.740754 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.741494 kubelet[2697]: E0912 00:26:12.741348 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.741494 kubelet[2697]: W0912 00:26:12.741362 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.741494 kubelet[2697]: E0912 00:26:12.741449 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 00:26:12.741744 kubelet[2697]: E0912 00:26:12.741670 2697 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 00:26:12.741744 kubelet[2697]: W0912 00:26:12.741693 2697 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 00:26:12.741744 kubelet[2697]: E0912 00:26:12.741721 2697 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 00:26:12.889139 containerd[1555]: time="2025-09-12T00:26:12.889050618Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:12.890534 containerd[1555]: time="2025-09-12T00:26:12.890452920Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 00:26:12.892284 containerd[1555]: time="2025-09-12T00:26:12.892220851Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:12.895193 containerd[1555]: time="2025-09-12T00:26:12.895134812Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:12.896091 containerd[1555]: time="2025-09-12T00:26:12.896012446Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.589191293s" Sep 12 00:26:12.896091 containerd[1555]: time="2025-09-12T00:26:12.896076025Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 00:26:12.899163 containerd[1555]: time="2025-09-12T00:26:12.899123868Z" level=info msg="CreateContainer within sandbox \"8b0276a3c1a67e73e8db6c93b0f542b6b4cfc3b52c0e12d43f961ed69fe35de4\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 00:26:12.914427 containerd[1555]: time="2025-09-12T00:26:12.914364726Z" level=info msg="Container 0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:12.931591 containerd[1555]: time="2025-09-12T00:26:12.931398210Z" level=info msg="CreateContainer within sandbox \"8b0276a3c1a67e73e8db6c93b0f542b6b4cfc3b52c0e12d43f961ed69fe35de4\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d\"" Sep 12 00:26:12.933821 containerd[1555]: time="2025-09-12T00:26:12.932895011Z" level=info msg="StartContainer for \"0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d\"" Sep 12 00:26:12.935679 containerd[1555]: time="2025-09-12T00:26:12.935629433Z" level=info msg="connecting to shim 0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d" address="unix:///run/containerd/s/f5867db5a17b589d721f9fb79fac0659e3cca05a6549d2c6868904ac498f1b71" protocol=ttrpc version=3 Sep 12 00:26:12.987784 systemd[1]: Started cri-containerd-0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d.scope - libcontainer container 0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d. Sep 12 00:26:13.073769 containerd[1555]: time="2025-09-12T00:26:13.071408181Z" level=info msg="StartContainer for \"0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d\" returns successfully" Sep 12 00:26:13.086296 systemd[1]: cri-containerd-0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d.scope: Deactivated successfully. Sep 12 00:26:13.086752 systemd[1]: cri-containerd-0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d.scope: Consumed 53ms CPU time, 6.3M memory peak, 4.6M written to disk. 
Sep 12 00:26:13.090101 containerd[1555]: time="2025-09-12T00:26:13.090049885Z" level=info msg="received exit event container_id:\"0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d\" id:\"0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d\" pid:3429 exited_at:{seconds:1757636773 nanos:89107821}" Sep 12 00:26:13.090305 containerd[1555]: time="2025-09-12T00:26:13.090275721Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d\" id:\"0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d\" pid:3429 exited_at:{seconds:1757636773 nanos:89107821}" Sep 12 00:26:13.136559 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0cd7f7319bb4865c6967777ffee0e992e648068d8edcb075281a87874025972d-rootfs.mount: Deactivated successfully. Sep 12 00:26:13.601155 kubelet[2697]: E0912 00:26:13.601055 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bkq9f" podUID="87a73b77-6f47-40e3-9295-51af59f40cde" Sep 12 00:26:13.692683 containerd[1555]: time="2025-09-12T00:26:13.692608097Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 00:26:13.715716 kubelet[2697]: I0912 00:26:13.715609 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-54cbb98bd9-fgjvn" podStartSLOduration=3.379460441 podStartE2EDuration="6.715571032s" podCreationTimestamp="2025-09-12 00:26:07 +0000 UTC" firstStartedPulling="2025-09-12 00:26:07.968623317 +0000 UTC m=+20.661889666" lastFinishedPulling="2025-09-12 00:26:11.304733908 +0000 UTC m=+23.998000257" observedRunningTime="2025-09-12 00:26:11.718537465 +0000 UTC m=+24.411803814" watchObservedRunningTime="2025-09-12 00:26:13.715571032 +0000 UTC 
m=+26.408837381" Sep 12 00:26:15.603454 kubelet[2697]: E0912 00:26:15.603370 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bkq9f" podUID="87a73b77-6f47-40e3-9295-51af59f40cde" Sep 12 00:26:17.601295 kubelet[2697]: E0912 00:26:17.601206 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bkq9f" podUID="87a73b77-6f47-40e3-9295-51af59f40cde" Sep 12 00:26:19.607064 kubelet[2697]: E0912 00:26:19.606959 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bkq9f" podUID="87a73b77-6f47-40e3-9295-51af59f40cde" Sep 12 00:26:20.348411 containerd[1555]: time="2025-09-12T00:26:20.348313335Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:20.350101 containerd[1555]: time="2025-09-12T00:26:20.350020826Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 12 00:26:20.353289 containerd[1555]: time="2025-09-12T00:26:20.353225763Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:20.356872 containerd[1555]: time="2025-09-12T00:26:20.356803851Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:20.357609 containerd[1555]: time="2025-09-12T00:26:20.357531951Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 6.664844363s" Sep 12 00:26:20.357609 containerd[1555]: time="2025-09-12T00:26:20.357585541Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 12 00:26:20.362022 containerd[1555]: time="2025-09-12T00:26:20.361939318Z" level=info msg="CreateContainer within sandbox \"8b0276a3c1a67e73e8db6c93b0f542b6b4cfc3b52c0e12d43f961ed69fe35de4\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 00:26:20.382574 containerd[1555]: time="2025-09-12T00:26:20.381034598Z" level=info msg="Container 25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:20.397129 containerd[1555]: time="2025-09-12T00:26:20.397051127Z" level=info msg="CreateContainer within sandbox \"8b0276a3c1a67e73e8db6c93b0f542b6b4cfc3b52c0e12d43f961ed69fe35de4\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729\"" Sep 12 00:26:20.397896 containerd[1555]: time="2025-09-12T00:26:20.397845570Z" level=info msg="StartContainer for \"25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729\"" Sep 12 00:26:20.399757 containerd[1555]: time="2025-09-12T00:26:20.399721418Z" level=info msg="connecting to shim 
25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729" address="unix:///run/containerd/s/f5867db5a17b589d721f9fb79fac0659e3cca05a6549d2c6868904ac498f1b71" protocol=ttrpc version=3 Sep 12 00:26:20.440045 systemd[1]: Started cri-containerd-25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729.scope - libcontainer container 25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729. Sep 12 00:26:20.518351 containerd[1555]: time="2025-09-12T00:26:20.518277564Z" level=info msg="StartContainer for \"25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729\" returns successfully" Sep 12 00:26:21.603734 kubelet[2697]: E0912 00:26:21.603306 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bkq9f" podUID="87a73b77-6f47-40e3-9295-51af59f40cde" Sep 12 00:26:22.073310 containerd[1555]: time="2025-09-12T00:26:22.073235134Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 00:26:22.077754 systemd[1]: cri-containerd-25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729.scope: Deactivated successfully. Sep 12 00:26:22.078735 systemd[1]: cri-containerd-25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729.scope: Consumed 877ms CPU time, 186M memory peak, 2.8M read from disk, 171.3M written to disk. 
Sep 12 00:26:22.079368 containerd[1555]: time="2025-09-12T00:26:22.079308861Z" level=info msg="TaskExit event in podsandbox handler container_id:\"25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729\" id:\"25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729\" pid:3487 exited_at:{seconds:1757636782 nanos:78690809}" Sep 12 00:26:22.079457 containerd[1555]: time="2025-09-12T00:26:22.079406916Z" level=info msg="received exit event container_id:\"25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729\" id:\"25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729\" pid:3487 exited_at:{seconds:1757636782 nanos:78690809}" Sep 12 00:26:22.114499 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-25a8e8a5685cce42d9937abbee0f0a69b80021665784b155dbdad477443f7729-rootfs.mount: Deactivated successfully. Sep 12 00:26:22.147861 kubelet[2697]: I0912 00:26:22.147796 2697 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 00:26:22.309045 systemd[1]: Created slice kubepods-burstable-pod0a998cb9_35da_4ff2_b590_14bd772b287b.slice - libcontainer container kubepods-burstable-pod0a998cb9_35da_4ff2_b590_14bd772b287b.slice. Sep 12 00:26:22.321927 systemd[1]: Created slice kubepods-besteffort-podf77b2b1f_f595_49ee_b3e7_e8d7bdc928ef.slice - libcontainer container kubepods-besteffort-podf77b2b1f_f595_49ee_b3e7_e8d7bdc928ef.slice. Sep 12 00:26:22.334075 systemd[1]: Created slice kubepods-burstable-pod39f8040c_9434_4de5_b6ff_a6491073dd00.slice - libcontainer container kubepods-burstable-pod39f8040c_9434_4de5_b6ff_a6491073dd00.slice. Sep 12 00:26:22.352133 systemd[1]: Created slice kubepods-besteffort-pod12633e0e_0279_423e_8561_06b36e192c10.slice - libcontainer container kubepods-besteffort-pod12633e0e_0279_423e_8561_06b36e192c10.slice. 
Sep 12 00:26:22.363169 kubelet[2697]: I0912 00:26:22.363101 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39f8040c-9434-4de5-b6ff-a6491073dd00-config-volume\") pod \"coredns-668d6bf9bc-z6x5p\" (UID: \"39f8040c-9434-4de5-b6ff-a6491073dd00\") " pod="kube-system/coredns-668d6bf9bc-z6x5p" Sep 12 00:26:22.363169 kubelet[2697]: I0912 00:26:22.363157 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/02e48ce0-416b-41b3-bb94-8abf9878b10e-calico-apiserver-certs\") pod \"calico-apiserver-545b8468d6-d6cqw\" (UID: \"02e48ce0-416b-41b3-bb94-8abf9878b10e\") " pod="calico-apiserver/calico-apiserver-545b8468d6-d6cqw" Sep 12 00:26:22.363432 kubelet[2697]: I0912 00:26:22.363189 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/aa486295-e36e-4803-a89c-4976a4c65fc6-whisker-backend-key-pair\") pod \"whisker-854686554d-gbczl\" (UID: \"aa486295-e36e-4803-a89c-4976a4c65fc6\") " pod="calico-system/whisker-854686554d-gbczl" Sep 12 00:26:22.363432 kubelet[2697]: I0912 00:26:22.363238 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12633e0e-0279-423e-8561-06b36e192c10-config\") pod \"goldmane-54d579b49d-kcjr9\" (UID: \"12633e0e-0279-423e-8561-06b36e192c10\") " pod="calico-system/goldmane-54d579b49d-kcjr9" Sep 12 00:26:22.363432 kubelet[2697]: I0912 00:26:22.363308 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r629g\" (UniqueName: \"kubernetes.io/projected/0a998cb9-35da-4ff2-b590-14bd772b287b-kube-api-access-r629g\") pod \"coredns-668d6bf9bc-8qdzn\" (UID: 
\"0a998cb9-35da-4ff2-b590-14bd772b287b\") " pod="kube-system/coredns-668d6bf9bc-8qdzn" Sep 12 00:26:22.363432 kubelet[2697]: I0912 00:26:22.363386 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4hvb\" (UniqueName: \"kubernetes.io/projected/02e48ce0-416b-41b3-bb94-8abf9878b10e-kube-api-access-d4hvb\") pod \"calico-apiserver-545b8468d6-d6cqw\" (UID: \"02e48ce0-416b-41b3-bb94-8abf9878b10e\") " pod="calico-apiserver/calico-apiserver-545b8468d6-d6cqw" Sep 12 00:26:22.363432 kubelet[2697]: I0912 00:26:22.363406 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12633e0e-0279-423e-8561-06b36e192c10-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-kcjr9\" (UID: \"12633e0e-0279-423e-8561-06b36e192c10\") " pod="calico-system/goldmane-54d579b49d-kcjr9" Sep 12 00:26:22.363729 kubelet[2697]: I0912 00:26:22.363434 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa486295-e36e-4803-a89c-4976a4c65fc6-whisker-ca-bundle\") pod \"whisker-854686554d-gbczl\" (UID: \"aa486295-e36e-4803-a89c-4976a4c65fc6\") " pod="calico-system/whisker-854686554d-gbczl" Sep 12 00:26:22.363729 kubelet[2697]: I0912 00:26:22.363456 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a998cb9-35da-4ff2-b590-14bd772b287b-config-volume\") pod \"coredns-668d6bf9bc-8qdzn\" (UID: \"0a998cb9-35da-4ff2-b590-14bd772b287b\") " pod="kube-system/coredns-668d6bf9bc-8qdzn" Sep 12 00:26:22.363811 kubelet[2697]: I0912 00:26:22.363491 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvt8s\" (UniqueName: 
\"kubernetes.io/projected/235fa983-248c-4ae8-8cf2-57465025f544-kube-api-access-dvt8s\") pod \"calico-apiserver-545b8468d6-b9fsq\" (UID: \"235fa983-248c-4ae8-8cf2-57465025f544\") " pod="calico-apiserver/calico-apiserver-545b8468d6-b9fsq" Sep 12 00:26:22.363811 kubelet[2697]: I0912 00:26:22.363779 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef-tigera-ca-bundle\") pod \"calico-kube-controllers-75845c7647-hjrgt\" (UID: \"f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef\") " pod="calico-system/calico-kube-controllers-75845c7647-hjrgt" Sep 12 00:26:22.363889 kubelet[2697]: I0912 00:26:22.363817 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpthr\" (UniqueName: \"kubernetes.io/projected/f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef-kube-api-access-xpthr\") pod \"calico-kube-controllers-75845c7647-hjrgt\" (UID: \"f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef\") " pod="calico-system/calico-kube-controllers-75845c7647-hjrgt" Sep 12 00:26:22.363889 kubelet[2697]: I0912 00:26:22.363849 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z6jb\" (UniqueName: \"kubernetes.io/projected/39f8040c-9434-4de5-b6ff-a6491073dd00-kube-api-access-2z6jb\") pod \"coredns-668d6bf9bc-z6x5p\" (UID: \"39f8040c-9434-4de5-b6ff-a6491073dd00\") " pod="kube-system/coredns-668d6bf9bc-z6x5p" Sep 12 00:26:22.363889 kubelet[2697]: I0912 00:26:22.363871 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdlzz\" (UniqueName: \"kubernetes.io/projected/aa486295-e36e-4803-a89c-4976a4c65fc6-kube-api-access-qdlzz\") pod \"whisker-854686554d-gbczl\" (UID: \"aa486295-e36e-4803-a89c-4976a4c65fc6\") " pod="calico-system/whisker-854686554d-gbczl" Sep 12 00:26:22.363990 kubelet[2697]: I0912 
00:26:22.363891 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/235fa983-248c-4ae8-8cf2-57465025f544-calico-apiserver-certs\") pod \"calico-apiserver-545b8468d6-b9fsq\" (UID: \"235fa983-248c-4ae8-8cf2-57465025f544\") " pod="calico-apiserver/calico-apiserver-545b8468d6-b9fsq" Sep 12 00:26:22.363990 kubelet[2697]: I0912 00:26:22.363915 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/12633e0e-0279-423e-8561-06b36e192c10-goldmane-key-pair\") pod \"goldmane-54d579b49d-kcjr9\" (UID: \"12633e0e-0279-423e-8561-06b36e192c10\") " pod="calico-system/goldmane-54d579b49d-kcjr9" Sep 12 00:26:22.363990 kubelet[2697]: I0912 00:26:22.363939 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhswf\" (UniqueName: \"kubernetes.io/projected/12633e0e-0279-423e-8561-06b36e192c10-kube-api-access-rhswf\") pod \"goldmane-54d579b49d-kcjr9\" (UID: \"12633e0e-0279-423e-8561-06b36e192c10\") " pod="calico-system/goldmane-54d579b49d-kcjr9" Sep 12 00:26:22.367170 systemd[1]: Created slice kubepods-besteffort-pod02e48ce0_416b_41b3_bb94_8abf9878b10e.slice - libcontainer container kubepods-besteffort-pod02e48ce0_416b_41b3_bb94_8abf9878b10e.slice. Sep 12 00:26:22.375317 systemd[1]: Created slice kubepods-besteffort-podaa486295_e36e_4803_a89c_4976a4c65fc6.slice - libcontainer container kubepods-besteffort-podaa486295_e36e_4803_a89c_4976a4c65fc6.slice. Sep 12 00:26:22.387441 systemd[1]: Created slice kubepods-besteffort-pod235fa983_248c_4ae8_8cf2_57465025f544.slice - libcontainer container kubepods-besteffort-pod235fa983_248c_4ae8_8cf2_57465025f544.slice. 
Sep 12 00:26:22.618285 containerd[1555]: time="2025-09-12T00:26:22.618083432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-8qdzn,Uid:0a998cb9-35da-4ff2-b590-14bd772b287b,Namespace:kube-system,Attempt:0,}" Sep 12 00:26:22.630873 containerd[1555]: time="2025-09-12T00:26:22.630796501Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75845c7647-hjrgt,Uid:f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef,Namespace:calico-system,Attempt:0,}" Sep 12 00:26:22.645538 containerd[1555]: time="2025-09-12T00:26:22.643433566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-z6x5p,Uid:39f8040c-9434-4de5-b6ff-a6491073dd00,Namespace:kube-system,Attempt:0,}" Sep 12 00:26:22.661622 containerd[1555]: time="2025-09-12T00:26:22.661547528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-kcjr9,Uid:12633e0e-0279-423e-8561-06b36e192c10,Namespace:calico-system,Attempt:0,}" Sep 12 00:26:22.676751 containerd[1555]: time="2025-09-12T00:26:22.676689973Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-545b8468d6-d6cqw,Uid:02e48ce0-416b-41b3-bb94-8abf9878b10e,Namespace:calico-apiserver,Attempt:0,}" Sep 12 00:26:22.685059 containerd[1555]: time="2025-09-12T00:26:22.684985226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-854686554d-gbczl,Uid:aa486295-e36e-4803-a89c-4976a4c65fc6,Namespace:calico-system,Attempt:0,}" Sep 12 00:26:22.692031 containerd[1555]: time="2025-09-12T00:26:22.691988251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-545b8468d6-b9fsq,Uid:235fa983-248c-4ae8-8cf2-57465025f544,Namespace:calico-apiserver,Attempt:0,}" Sep 12 00:26:22.759442 containerd[1555]: time="2025-09-12T00:26:22.759403491Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 00:26:22.852353 containerd[1555]: time="2025-09-12T00:26:22.851552206Z" level=error msg="Failed to destroy network for 
sandbox \"7d327ae1be1dce252291683dad0ba7c9285f8098693ad42f698ac69df4809ecc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.860512 containerd[1555]: time="2025-09-12T00:26:22.860440866Z" level=error msg="Failed to destroy network for sandbox \"62f80588b3115979c63d6e58630423fd2fdecacaf8f9bd4968ce42954db29a96\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.873131 containerd[1555]: time="2025-09-12T00:26:22.872975027Z" level=error msg="Failed to destroy network for sandbox \"9200717344c793f52374831d5ff28eb1555b552b6a2a2e2dc2d728784e345003\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.873913 containerd[1555]: time="2025-09-12T00:26:22.873877875Z" level=error msg="Failed to destroy network for sandbox \"911b39cbf39a6a44b9974b440c181966a25f291285bca28a34d5aeb5f3ee1246\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.900267 containerd[1555]: time="2025-09-12T00:26:22.900182704Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-8qdzn,Uid:0a998cb9-35da-4ff2-b590-14bd772b287b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d327ae1be1dce252291683dad0ba7c9285f8098693ad42f698ac69df4809ecc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 12 00:26:22.900519 containerd[1555]: time="2025-09-12T00:26:22.900324651Z" level=error msg="Failed to destroy network for sandbox \"70c41bc20ed6487c98dc5d84b70ce6ff2e0cae6a5c7fd363c7de11721c2b782b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.900519 containerd[1555]: time="2025-09-12T00:26:22.900196349Z" level=error msg="Failed to destroy network for sandbox \"f82641f714ccd561f87e05eb32bc8f2f686dfc34583ad9423d64063c4eaf6efc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.900670 containerd[1555]: time="2025-09-12T00:26:22.900240693Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-545b8468d6-b9fsq,Uid:235fa983-248c-4ae8-8cf2-57465025f544,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"62f80588b3115979c63d6e58630423fd2fdecacaf8f9bd4968ce42954db29a96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.900895 containerd[1555]: time="2025-09-12T00:26:22.900196509Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-kcjr9,Uid:12633e0e-0279-423e-8561-06b36e192c10,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"911b39cbf39a6a44b9974b440c181966a25f291285bca28a34d5aeb5f3ee1246\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.900895 
containerd[1555]: time="2025-09-12T00:26:22.900228911Z" level=error msg="Failed to destroy network for sandbox \"c117c2c7820e11f83da276e363f0a44c850f7acc54e944ed8cf78f88f08c9fc7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.900990 containerd[1555]: time="2025-09-12T00:26:22.900196169Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-854686554d-gbczl,Uid:aa486295-e36e-4803-a89c-4976a4c65fc6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9200717344c793f52374831d5ff28eb1555b552b6a2a2e2dc2d728784e345003\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.903606 containerd[1555]: time="2025-09-12T00:26:22.903519637Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75845c7647-hjrgt,Uid:f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"70c41bc20ed6487c98dc5d84b70ce6ff2e0cae6a5c7fd363c7de11721c2b782b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.905095 containerd[1555]: time="2025-09-12T00:26:22.905032351Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-545b8468d6-d6cqw,Uid:02e48ce0-416b-41b3-bb94-8abf9878b10e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f82641f714ccd561f87e05eb32bc8f2f686dfc34583ad9423d64063c4eaf6efc\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.906438 containerd[1555]: time="2025-09-12T00:26:22.906374653Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-z6x5p,Uid:39f8040c-9434-4de5-b6ff-a6491073dd00,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c117c2c7820e11f83da276e363f0a44c850f7acc54e944ed8cf78f88f08c9fc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.917396 kubelet[2697]: E0912 00:26:22.917324 2697 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9200717344c793f52374831d5ff28eb1555b552b6a2a2e2dc2d728784e345003\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.918084 kubelet[2697]: E0912 00:26:22.918011 2697 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9200717344c793f52374831d5ff28eb1555b552b6a2a2e2dc2d728784e345003\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-854686554d-gbczl" Sep 12 00:26:22.918084 kubelet[2697]: E0912 00:26:22.918055 2697 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9200717344c793f52374831d5ff28eb1555b552b6a2a2e2dc2d728784e345003\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-854686554d-gbczl" Sep 12 00:26:22.918309 kubelet[2697]: E0912 00:26:22.917324 2697 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70c41bc20ed6487c98dc5d84b70ce6ff2e0cae6a5c7fd363c7de11721c2b782b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.918309 kubelet[2697]: E0912 00:26:22.918116 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-854686554d-gbczl_calico-system(aa486295-e36e-4803-a89c-4976a4c65fc6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-854686554d-gbczl_calico-system(aa486295-e36e-4803-a89c-4976a4c65fc6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9200717344c793f52374831d5ff28eb1555b552b6a2a2e2dc2d728784e345003\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-854686554d-gbczl" podUID="aa486295-e36e-4803-a89c-4976a4c65fc6" Sep 12 00:26:22.918309 kubelet[2697]: E0912 00:26:22.917277 2697 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62f80588b3115979c63d6e58630423fd2fdecacaf8f9bd4968ce42954db29a96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.918309 kubelet[2697]: E0912 00:26:22.917369 2697 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"7d327ae1be1dce252291683dad0ba7c9285f8098693ad42f698ac69df4809ecc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.918544 kubelet[2697]: E0912 00:26:22.918182 2697 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d327ae1be1dce252291683dad0ba7c9285f8098693ad42f698ac69df4809ecc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-8qdzn" Sep 12 00:26:22.918544 kubelet[2697]: E0912 00:26:22.917374 2697 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f82641f714ccd561f87e05eb32bc8f2f686dfc34583ad9423d64063c4eaf6efc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.918544 kubelet[2697]: E0912 00:26:22.918204 2697 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d327ae1be1dce252291683dad0ba7c9285f8098693ad42f698ac69df4809ecc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-8qdzn" Sep 12 00:26:22.918544 kubelet[2697]: E0912 00:26:22.918223 2697 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f82641f714ccd561f87e05eb32bc8f2f686dfc34583ad9423d64063c4eaf6efc\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-545b8468d6-d6cqw" Sep 12 00:26:22.918716 kubelet[2697]: E0912 00:26:22.917398 2697 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"911b39cbf39a6a44b9974b440c181966a25f291285bca28a34d5aeb5f3ee1246\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.918716 kubelet[2697]: E0912 00:26:22.918258 2697 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f82641f714ccd561f87e05eb32bc8f2f686dfc34583ad9423d64063c4eaf6efc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-545b8468d6-d6cqw" Sep 12 00:26:22.918716 kubelet[2697]: E0912 00:26:22.918268 2697 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"911b39cbf39a6a44b9974b440c181966a25f291285bca28a34d5aeb5f3ee1246\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-kcjr9" Sep 12 00:26:22.918716 kubelet[2697]: E0912 00:26:22.918289 2697 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"911b39cbf39a6a44b9974b440c181966a25f291285bca28a34d5aeb5f3ee1246\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-kcjr9" Sep 12 00:26:22.918916 kubelet[2697]: E0912 00:26:22.918318 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-545b8468d6-d6cqw_calico-apiserver(02e48ce0-416b-41b3-bb94-8abf9878b10e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-545b8468d6-d6cqw_calico-apiserver(02e48ce0-416b-41b3-bb94-8abf9878b10e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f82641f714ccd561f87e05eb32bc8f2f686dfc34583ad9423d64063c4eaf6efc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-545b8468d6-d6cqw" podUID="02e48ce0-416b-41b3-bb94-8abf9878b10e" Sep 12 00:26:22.918916 kubelet[2697]: E0912 00:26:22.917281 2697 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c117c2c7820e11f83da276e363f0a44c850f7acc54e944ed8cf78f88f08c9fc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:22.918916 kubelet[2697]: E0912 00:26:22.918330 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-kcjr9_calico-system(12633e0e-0279-423e-8561-06b36e192c10)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-kcjr9_calico-system(12633e0e-0279-423e-8561-06b36e192c10)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"911b39cbf39a6a44b9974b440c181966a25f291285bca28a34d5aeb5f3ee1246\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-kcjr9" podUID="12633e0e-0279-423e-8561-06b36e192c10" Sep 12 00:26:22.919085 kubelet[2697]: E0912 00:26:22.918195 2697 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62f80588b3115979c63d6e58630423fd2fdecacaf8f9bd4968ce42954db29a96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-545b8468d6-b9fsq" Sep 12 00:26:22.919085 kubelet[2697]: E0912 00:26:22.918171 2697 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70c41bc20ed6487c98dc5d84b70ce6ff2e0cae6a5c7fd363c7de11721c2b782b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-75845c7647-hjrgt" Sep 12 00:26:22.919085 kubelet[2697]: E0912 00:26:22.918387 2697 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62f80588b3115979c63d6e58630423fd2fdecacaf8f9bd4968ce42954db29a96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-545b8468d6-b9fsq" Sep 12 00:26:22.919085 kubelet[2697]: E0912 00:26:22.918393 2697 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"70c41bc20ed6487c98dc5d84b70ce6ff2e0cae6a5c7fd363c7de11721c2b782b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-75845c7647-hjrgt" Sep 12 00:26:22.919232 kubelet[2697]: E0912 00:26:22.918420 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-545b8468d6-b9fsq_calico-apiserver(235fa983-248c-4ae8-8cf2-57465025f544)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-545b8468d6-b9fsq_calico-apiserver(235fa983-248c-4ae8-8cf2-57465025f544)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"62f80588b3115979c63d6e58630423fd2fdecacaf8f9bd4968ce42954db29a96\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-545b8468d6-b9fsq" podUID="235fa983-248c-4ae8-8cf2-57465025f544" Sep 12 00:26:22.919232 kubelet[2697]: E0912 00:26:22.918366 2697 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c117c2c7820e11f83da276e363f0a44c850f7acc54e944ed8cf78f88f08c9fc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-z6x5p" Sep 12 00:26:22.919340 kubelet[2697]: E0912 00:26:22.918432 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-75845c7647-hjrgt_calico-system(f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-75845c7647-hjrgt_calico-system(f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"70c41bc20ed6487c98dc5d84b70ce6ff2e0cae6a5c7fd363c7de11721c2b782b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-75845c7647-hjrgt" podUID="f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef" Sep 12 00:26:22.919340 kubelet[2697]: E0912 00:26:22.918237 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-8qdzn_kube-system(0a998cb9-35da-4ff2-b590-14bd772b287b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-8qdzn_kube-system(0a998cb9-35da-4ff2-b590-14bd772b287b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7d327ae1be1dce252291683dad0ba7c9285f8098693ad42f698ac69df4809ecc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-8qdzn" podUID="0a998cb9-35da-4ff2-b590-14bd772b287b" Sep 12 00:26:22.919340 kubelet[2697]: E0912 00:26:22.918467 2697 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c117c2c7820e11f83da276e363f0a44c850f7acc54e944ed8cf78f88f08c9fc7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-z6x5p" Sep 12 00:26:22.919533 kubelet[2697]: E0912 00:26:22.918505 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-668d6bf9bc-z6x5p_kube-system(39f8040c-9434-4de5-b6ff-a6491073dd00)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-z6x5p_kube-system(39f8040c-9434-4de5-b6ff-a6491073dd00)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c117c2c7820e11f83da276e363f0a44c850f7acc54e944ed8cf78f88f08c9fc7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-z6x5p" podUID="39f8040c-9434-4de5-b6ff-a6491073dd00" Sep 12 00:26:23.627253 systemd[1]: Created slice kubepods-besteffort-pod87a73b77_6f47_40e3_9295_51af59f40cde.slice - libcontainer container kubepods-besteffort-pod87a73b77_6f47_40e3_9295_51af59f40cde.slice. Sep 12 00:26:23.638246 containerd[1555]: time="2025-09-12T00:26:23.636730105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bkq9f,Uid:87a73b77-6f47-40e3-9295-51af59f40cde,Namespace:calico-system,Attempt:0,}" Sep 12 00:26:23.730686 containerd[1555]: time="2025-09-12T00:26:23.730598284Z" level=error msg="Failed to destroy network for sandbox \"69d2c03c9836be4e21aa0bb65446e12475903b5ad876da17ffb59b7b43e57669\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:23.732930 containerd[1555]: time="2025-09-12T00:26:23.732668806Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bkq9f,Uid:87a73b77-6f47-40e3-9295-51af59f40cde,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"69d2c03c9836be4e21aa0bb65446e12475903b5ad876da17ffb59b7b43e57669\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Sep 12 00:26:23.733353 kubelet[2697]: E0912 00:26:23.733297 2697 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69d2c03c9836be4e21aa0bb65446e12475903b5ad876da17ffb59b7b43e57669\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 00:26:23.733532 kubelet[2697]: E0912 00:26:23.733505 2697 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69d2c03c9836be4e21aa0bb65446e12475903b5ad876da17ffb59b7b43e57669\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bkq9f" Sep 12 00:26:23.733631 kubelet[2697]: E0912 00:26:23.733608 2697 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69d2c03c9836be4e21aa0bb65446e12475903b5ad876da17ffb59b7b43e57669\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bkq9f" Sep 12 00:26:23.733852 kubelet[2697]: E0912 00:26:23.733771 2697 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-bkq9f_calico-system(87a73b77-6f47-40e3-9295-51af59f40cde)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-bkq9f_calico-system(87a73b77-6f47-40e3-9295-51af59f40cde)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"69d2c03c9836be4e21aa0bb65446e12475903b5ad876da17ffb59b7b43e57669\\\": 
plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-bkq9f" podUID="87a73b77-6f47-40e3-9295-51af59f40cde" Sep 12 00:26:23.738215 systemd[1]: run-netns-cni\x2d5de221ba\x2db3c3\x2debfe\x2dbdf8\x2d2b86522b2f41.mount: Deactivated successfully. Sep 12 00:26:28.840789 kubelet[2697]: I0912 00:26:28.840740 2697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 00:26:30.166397 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1344315234.mount: Deactivated successfully. Sep 12 00:26:30.944552 containerd[1555]: time="2025-09-12T00:26:30.944493753Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:30.945299 containerd[1555]: time="2025-09-12T00:26:30.945261154Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 00:26:30.946454 containerd[1555]: time="2025-09-12T00:26:30.946408259Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:30.948582 containerd[1555]: time="2025-09-12T00:26:30.948554138Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:30.949123 containerd[1555]: time="2025-09-12T00:26:30.949088591Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 8.189483591s" Sep 12 00:26:30.949155 containerd[1555]: time="2025-09-12T00:26:30.949121413Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 00:26:30.959919 containerd[1555]: time="2025-09-12T00:26:30.959870509Z" level=info msg="CreateContainer within sandbox \"8b0276a3c1a67e73e8db6c93b0f542b6b4cfc3b52c0e12d43f961ed69fe35de4\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 00:26:30.989831 containerd[1555]: time="2025-09-12T00:26:30.989793381Z" level=info msg="Container c24a0c8575976a2178bd8352bf2a3f439bd49fbc9af41b7bb2585d48f1288de2: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:31.009660 containerd[1555]: time="2025-09-12T00:26:31.009618839Z" level=info msg="CreateContainer within sandbox \"8b0276a3c1a67e73e8db6c93b0f542b6b4cfc3b52c0e12d43f961ed69fe35de4\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c24a0c8575976a2178bd8352bf2a3f439bd49fbc9af41b7bb2585d48f1288de2\"" Sep 12 00:26:31.010104 containerd[1555]: time="2025-09-12T00:26:31.010081157Z" level=info msg="StartContainer for \"c24a0c8575976a2178bd8352bf2a3f439bd49fbc9af41b7bb2585d48f1288de2\"" Sep 12 00:26:31.016858 containerd[1555]: time="2025-09-12T00:26:31.016828857Z" level=info msg="connecting to shim c24a0c8575976a2178bd8352bf2a3f439bd49fbc9af41b7bb2585d48f1288de2" address="unix:///run/containerd/s/f5867db5a17b589d721f9fb79fac0659e3cca05a6549d2c6868904ac498f1b71" protocol=ttrpc version=3 Sep 12 00:26:31.074831 systemd[1]: Started cri-containerd-c24a0c8575976a2178bd8352bf2a3f439bd49fbc9af41b7bb2585d48f1288de2.scope - libcontainer container c24a0c8575976a2178bd8352bf2a3f439bd49fbc9af41b7bb2585d48f1288de2. 
Sep 12 00:26:31.352479 containerd[1555]: time="2025-09-12T00:26:31.351908249Z" level=info msg="StartContainer for \"c24a0c8575976a2178bd8352bf2a3f439bd49fbc9af41b7bb2585d48f1288de2\" returns successfully" Sep 12 00:26:31.371428 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 00:26:31.372055 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 12 00:26:31.539442 kubelet[2697]: I0912 00:26:31.538940 2697 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdlzz\" (UniqueName: \"kubernetes.io/projected/aa486295-e36e-4803-a89c-4976a4c65fc6-kube-api-access-qdlzz\") pod \"aa486295-e36e-4803-a89c-4976a4c65fc6\" (UID: \"aa486295-e36e-4803-a89c-4976a4c65fc6\") " Sep 12 00:26:31.539442 kubelet[2697]: I0912 00:26:31.538988 2697 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa486295-e36e-4803-a89c-4976a4c65fc6-whisker-ca-bundle\") pod \"aa486295-e36e-4803-a89c-4976a4c65fc6\" (UID: \"aa486295-e36e-4803-a89c-4976a4c65fc6\") " Sep 12 00:26:31.539442 kubelet[2697]: I0912 00:26:31.539020 2697 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/aa486295-e36e-4803-a89c-4976a4c65fc6-whisker-backend-key-pair\") pod \"aa486295-e36e-4803-a89c-4976a4c65fc6\" (UID: \"aa486295-e36e-4803-a89c-4976a4c65fc6\") " Sep 12 00:26:31.540334 kubelet[2697]: I0912 00:26:31.540298 2697 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa486295-e36e-4803-a89c-4976a4c65fc6-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "aa486295-e36e-4803-a89c-4976a4c65fc6" (UID: "aa486295-e36e-4803-a89c-4976a4c65fc6"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 00:26:31.542709 kubelet[2697]: I0912 00:26:31.542667 2697 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa486295-e36e-4803-a89c-4976a4c65fc6-kube-api-access-qdlzz" (OuterVolumeSpecName: "kube-api-access-qdlzz") pod "aa486295-e36e-4803-a89c-4976a4c65fc6" (UID: "aa486295-e36e-4803-a89c-4976a4c65fc6"). InnerVolumeSpecName "kube-api-access-qdlzz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 00:26:31.544981 systemd[1]: var-lib-kubelet-pods-aa486295\x2de36e\x2d4803\x2da89c\x2d4976a4c65fc6-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqdlzz.mount: Deactivated successfully. Sep 12 00:26:31.545122 systemd[1]: var-lib-kubelet-pods-aa486295\x2de36e\x2d4803\x2da89c\x2d4976a4c65fc6-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 00:26:31.545358 kubelet[2697]: I0912 00:26:31.545326 2697 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa486295-e36e-4803-a89c-4976a4c65fc6-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "aa486295-e36e-4803-a89c-4976a4c65fc6" (UID: "aa486295-e36e-4803-a89c-4976a4c65fc6"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 00:26:31.609122 systemd[1]: Removed slice kubepods-besteffort-podaa486295_e36e_4803_a89c_4976a4c65fc6.slice - libcontainer container kubepods-besteffort-podaa486295_e36e_4803_a89c_4976a4c65fc6.slice. 
Sep 12 00:26:31.639773 kubelet[2697]: I0912 00:26:31.639725 2697 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa486295-e36e-4803-a89c-4976a4c65fc6-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 12 00:26:31.639773 kubelet[2697]: I0912 00:26:31.639750 2697 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/aa486295-e36e-4803-a89c-4976a4c65fc6-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 12 00:26:31.639773 kubelet[2697]: I0912 00:26:31.639760 2697 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qdlzz\" (UniqueName: \"kubernetes.io/projected/aa486295-e36e-4803-a89c-4976a4c65fc6-kube-api-access-qdlzz\") on node \"localhost\" DevicePath \"\"" Sep 12 00:26:31.908226 kubelet[2697]: I0912 00:26:31.908090 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-lph6b" podStartSLOduration=2.230888368 podStartE2EDuration="24.908063737s" podCreationTimestamp="2025-09-12 00:26:07 +0000 UTC" firstStartedPulling="2025-09-12 00:26:08.272546782 +0000 UTC m=+20.965813131" lastFinishedPulling="2025-09-12 00:26:30.949722151 +0000 UTC m=+43.642988500" observedRunningTime="2025-09-12 00:26:31.899726342 +0000 UTC m=+44.592992691" watchObservedRunningTime="2025-09-12 00:26:31.908063737 +0000 UTC m=+44.601330086" Sep 12 00:26:31.944425 systemd[1]: Created slice kubepods-besteffort-pod5bfa0dd7_fece_47b9_b9e2_520ede53956c.slice - libcontainer container kubepods-besteffort-pod5bfa0dd7_fece_47b9_b9e2_520ede53956c.slice. 
Sep 12 00:26:32.043230 kubelet[2697]: I0912 00:26:32.043171 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bfa0dd7-fece-47b9-b9e2-520ede53956c-whisker-ca-bundle\") pod \"whisker-86b9f779f9-8w7t8\" (UID: \"5bfa0dd7-fece-47b9-b9e2-520ede53956c\") " pod="calico-system/whisker-86b9f779f9-8w7t8" Sep 12 00:26:32.043230 kubelet[2697]: I0912 00:26:32.043210 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5bfa0dd7-fece-47b9-b9e2-520ede53956c-whisker-backend-key-pair\") pod \"whisker-86b9f779f9-8w7t8\" (UID: \"5bfa0dd7-fece-47b9-b9e2-520ede53956c\") " pod="calico-system/whisker-86b9f779f9-8w7t8" Sep 12 00:26:32.043230 kubelet[2697]: I0912 00:26:32.043224 2697 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4st8h\" (UniqueName: \"kubernetes.io/projected/5bfa0dd7-fece-47b9-b9e2-520ede53956c-kube-api-access-4st8h\") pod \"whisker-86b9f779f9-8w7t8\" (UID: \"5bfa0dd7-fece-47b9-b9e2-520ede53956c\") " pod="calico-system/whisker-86b9f779f9-8w7t8" Sep 12 00:26:32.249845 containerd[1555]: time="2025-09-12T00:26:32.249741076Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-86b9f779f9-8w7t8,Uid:5bfa0dd7-fece-47b9-b9e2-520ede53956c,Namespace:calico-system,Attempt:0,}" Sep 12 00:26:32.498344 systemd-networkd[1493]: cali7d615cfcd80: Link UP Sep 12 00:26:32.498779 systemd-networkd[1493]: cali7d615cfcd80: Gained carrier Sep 12 00:26:32.515253 containerd[1555]: 2025-09-12 00:26:32.270 [INFO][3876] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 00:26:32.515253 containerd[1555]: 2025-09-12 00:26:32.286 [INFO][3876] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{localhost-k8s-whisker--86b9f779f9--8w7t8-eth0 whisker-86b9f779f9- calico-system 5bfa0dd7-fece-47b9-b9e2-520ede53956c 880 0 2025-09-12 00:26:31 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:86b9f779f9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-86b9f779f9-8w7t8 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali7d615cfcd80 [] [] }} ContainerID="b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" Namespace="calico-system" Pod="whisker-86b9f779f9-8w7t8" WorkloadEndpoint="localhost-k8s-whisker--86b9f779f9--8w7t8-" Sep 12 00:26:32.515253 containerd[1555]: 2025-09-12 00:26:32.286 [INFO][3876] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" Namespace="calico-system" Pod="whisker-86b9f779f9-8w7t8" WorkloadEndpoint="localhost-k8s-whisker--86b9f779f9--8w7t8-eth0" Sep 12 00:26:32.515253 containerd[1555]: 2025-09-12 00:26:32.340 [INFO][3889] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" HandleID="k8s-pod-network.b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" Workload="localhost-k8s-whisker--86b9f779f9--8w7t8-eth0" Sep 12 00:26:32.515543 containerd[1555]: 2025-09-12 00:26:32.341 [INFO][3889] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" HandleID="k8s-pod-network.b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" Workload="localhost-k8s-whisker--86b9f779f9--8w7t8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00019f910), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-86b9f779f9-8w7t8", "timestamp":"2025-09-12 00:26:32.340726594 +0000 
UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:26:32.515543 containerd[1555]: 2025-09-12 00:26:32.341 [INFO][3889] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:26:32.515543 containerd[1555]: 2025-09-12 00:26:32.341 [INFO][3889] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 00:26:32.515543 containerd[1555]: 2025-09-12 00:26:32.341 [INFO][3889] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:26:32.515543 containerd[1555]: 2025-09-12 00:26:32.347 [INFO][3889] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" host="localhost" Sep 12 00:26:32.515543 containerd[1555]: 2025-09-12 00:26:32.351 [INFO][3889] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:26:32.515543 containerd[1555]: 2025-09-12 00:26:32.354 [INFO][3889] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:26:32.515543 containerd[1555]: 2025-09-12 00:26:32.356 [INFO][3889] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:32.515543 containerd[1555]: 2025-09-12 00:26:32.357 [INFO][3889] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:32.515543 containerd[1555]: 2025-09-12 00:26:32.357 [INFO][3889] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" host="localhost" Sep 12 00:26:32.515856 containerd[1555]: 2025-09-12 00:26:32.358 [INFO][3889] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b Sep 12 00:26:32.515856 containerd[1555]: 2025-09-12 00:26:32.377 [INFO][3889] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" host="localhost" Sep 12 00:26:32.515856 containerd[1555]: 2025-09-12 00:26:32.488 [INFO][3889] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" host="localhost" Sep 12 00:26:32.515856 containerd[1555]: 2025-09-12 00:26:32.488 [INFO][3889] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" host="localhost" Sep 12 00:26:32.515856 containerd[1555]: 2025-09-12 00:26:32.488 [INFO][3889] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 00:26:32.515856 containerd[1555]: 2025-09-12 00:26:32.488 [INFO][3889] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" HandleID="k8s-pod-network.b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" Workload="localhost-k8s-whisker--86b9f779f9--8w7t8-eth0" Sep 12 00:26:32.516019 containerd[1555]: 2025-09-12 00:26:32.491 [INFO][3876] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" Namespace="calico-system" Pod="whisker-86b9f779f9-8w7t8" WorkloadEndpoint="localhost-k8s-whisker--86b9f779f9--8w7t8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--86b9f779f9--8w7t8-eth0", GenerateName:"whisker-86b9f779f9-", Namespace:"calico-system", SelfLink:"", UID:"5bfa0dd7-fece-47b9-b9e2-520ede53956c", ResourceVersion:"880", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 26, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"86b9f779f9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-86b9f779f9-8w7t8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali7d615cfcd80", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:32.516019 containerd[1555]: 2025-09-12 00:26:32.491 [INFO][3876] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" Namespace="calico-system" Pod="whisker-86b9f779f9-8w7t8" WorkloadEndpoint="localhost-k8s-whisker--86b9f779f9--8w7t8-eth0" Sep 12 00:26:32.516117 containerd[1555]: 2025-09-12 00:26:32.491 [INFO][3876] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7d615cfcd80 ContainerID="b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" Namespace="calico-system" Pod="whisker-86b9f779f9-8w7t8" WorkloadEndpoint="localhost-k8s-whisker--86b9f779f9--8w7t8-eth0" Sep 12 00:26:32.516117 containerd[1555]: 2025-09-12 00:26:32.500 [INFO][3876] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" Namespace="calico-system" Pod="whisker-86b9f779f9-8w7t8" WorkloadEndpoint="localhost-k8s-whisker--86b9f779f9--8w7t8-eth0" Sep 12 00:26:32.516173 containerd[1555]: 2025-09-12 00:26:32.501 [INFO][3876] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" Namespace="calico-system" Pod="whisker-86b9f779f9-8w7t8" WorkloadEndpoint="localhost-k8s-whisker--86b9f779f9--8w7t8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--86b9f779f9--8w7t8-eth0", GenerateName:"whisker-86b9f779f9-", Namespace:"calico-system", SelfLink:"", UID:"5bfa0dd7-fece-47b9-b9e2-520ede53956c", ResourceVersion:"880", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 26, 31, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"86b9f779f9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b", Pod:"whisker-86b9f779f9-8w7t8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali7d615cfcd80", MAC:"1e:14:e2:e2:c9:6c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:32.516241 containerd[1555]: 2025-09-12 00:26:32.512 [INFO][3876] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" Namespace="calico-system" Pod="whisker-86b9f779f9-8w7t8" WorkloadEndpoint="localhost-k8s-whisker--86b9f779f9--8w7t8-eth0" Sep 12 00:26:32.664221 containerd[1555]: time="2025-09-12T00:26:32.664175272Z" level=info msg="connecting to shim b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b" address="unix:///run/containerd/s/382623ab76a4419800e60a1d37ae28bf5532d063c3f8c171b29e0aa6dfbbcd7e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:26:32.716943 systemd[1]: Started cri-containerd-b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b.scope - libcontainer container b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b. 
Sep 12 00:26:32.742125 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:26:32.821880 containerd[1555]: time="2025-09-12T00:26:32.821747320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-86b9f779f9-8w7t8,Uid:5bfa0dd7-fece-47b9-b9e2-520ede53956c,Namespace:calico-system,Attempt:0,} returns sandbox id \"b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b\"" Sep 12 00:26:32.824196 containerd[1555]: time="2025-09-12T00:26:32.824168155Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 00:26:33.188302 systemd-networkd[1493]: vxlan.calico: Link UP Sep 12 00:26:33.188314 systemd-networkd[1493]: vxlan.calico: Gained carrier Sep 12 00:26:33.601806 containerd[1555]: time="2025-09-12T00:26:33.601757935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-kcjr9,Uid:12633e0e-0279-423e-8561-06b36e192c10,Namespace:calico-system,Attempt:0,}" Sep 12 00:26:33.603606 kubelet[2697]: I0912 00:26:33.603572 2697 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa486295-e36e-4803-a89c-4976a4c65fc6" path="/var/lib/kubelet/pods/aa486295-e36e-4803-a89c-4976a4c65fc6/volumes" Sep 12 00:26:33.698106 systemd-networkd[1493]: calidbfefa46a96: Link UP Sep 12 00:26:33.698606 systemd-networkd[1493]: calidbfefa46a96: Gained carrier Sep 12 00:26:33.715939 containerd[1555]: 2025-09-12 00:26:33.644 [INFO][4155] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--54d579b49d--kcjr9-eth0 goldmane-54d579b49d- calico-system 12633e0e-0279-423e-8561-06b36e192c10 805 0 2025-09-12 00:26:07 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost 
goldmane-54d579b49d-kcjr9 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calidbfefa46a96 [] [] }} ContainerID="77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" Namespace="calico-system" Pod="goldmane-54d579b49d-kcjr9" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kcjr9-" Sep 12 00:26:33.715939 containerd[1555]: 2025-09-12 00:26:33.644 [INFO][4155] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" Namespace="calico-system" Pod="goldmane-54d579b49d-kcjr9" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kcjr9-eth0" Sep 12 00:26:33.715939 containerd[1555]: 2025-09-12 00:26:33.669 [INFO][4169] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" HandleID="k8s-pod-network.77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" Workload="localhost-k8s-goldmane--54d579b49d--kcjr9-eth0" Sep 12 00:26:33.716280 containerd[1555]: 2025-09-12 00:26:33.669 [INFO][4169] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" HandleID="k8s-pod-network.77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" Workload="localhost-k8s-goldmane--54d579b49d--kcjr9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001a5640), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-54d579b49d-kcjr9", "timestamp":"2025-09-12 00:26:33.669522705 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:26:33.716280 containerd[1555]: 2025-09-12 00:26:33.669 [INFO][4169] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 00:26:33.716280 containerd[1555]: 2025-09-12 00:26:33.669 [INFO][4169] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 00:26:33.716280 containerd[1555]: 2025-09-12 00:26:33.669 [INFO][4169] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:26:33.716280 containerd[1555]: 2025-09-12 00:26:33.675 [INFO][4169] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" host="localhost" Sep 12 00:26:33.716280 containerd[1555]: 2025-09-12 00:26:33.678 [INFO][4169] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:26:33.716280 containerd[1555]: 2025-09-12 00:26:33.681 [INFO][4169] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:26:33.716280 containerd[1555]: 2025-09-12 00:26:33.682 [INFO][4169] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:33.716280 containerd[1555]: 2025-09-12 00:26:33.684 [INFO][4169] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:33.716280 containerd[1555]: 2025-09-12 00:26:33.684 [INFO][4169] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" host="localhost" Sep 12 00:26:33.716491 containerd[1555]: 2025-09-12 00:26:33.685 [INFO][4169] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1 Sep 12 00:26:33.716491 containerd[1555]: 2025-09-12 00:26:33.688 [INFO][4169] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" host="localhost" Sep 12 00:26:33.716491 containerd[1555]: 2025-09-12 00:26:33.692 [INFO][4169] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" host="localhost" Sep 12 00:26:33.716491 containerd[1555]: 2025-09-12 00:26:33.692 [INFO][4169] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" host="localhost" Sep 12 00:26:33.716491 containerd[1555]: 2025-09-12 00:26:33.692 [INFO][4169] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:26:33.716491 containerd[1555]: 2025-09-12 00:26:33.692 [INFO][4169] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" HandleID="k8s-pod-network.77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" Workload="localhost-k8s-goldmane--54d579b49d--kcjr9-eth0" Sep 12 00:26:33.716614 containerd[1555]: 2025-09-12 00:26:33.696 [INFO][4155] cni-plugin/k8s.go 418: Populated endpoint ContainerID="77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" Namespace="calico-system" Pod="goldmane-54d579b49d-kcjr9" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kcjr9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--kcjr9-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"12633e0e-0279-423e-8561-06b36e192c10", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 26, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-54d579b49d-kcjr9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calidbfefa46a96", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:33.716614 containerd[1555]: 2025-09-12 00:26:33.696 [INFO][4155] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" Namespace="calico-system" Pod="goldmane-54d579b49d-kcjr9" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kcjr9-eth0" Sep 12 00:26:33.716691 containerd[1555]: 2025-09-12 00:26:33.696 [INFO][4155] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidbfefa46a96 ContainerID="77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" Namespace="calico-system" Pod="goldmane-54d579b49d-kcjr9" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kcjr9-eth0" Sep 12 00:26:33.716691 containerd[1555]: 2025-09-12 00:26:33.698 [INFO][4155] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" Namespace="calico-system" Pod="goldmane-54d579b49d-kcjr9" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kcjr9-eth0" Sep 12 00:26:33.716775 containerd[1555]: 2025-09-12 00:26:33.699 [INFO][4155] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" Namespace="calico-system" Pod="goldmane-54d579b49d-kcjr9" WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kcjr9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--54d579b49d--kcjr9-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"12633e0e-0279-423e-8561-06b36e192c10", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 26, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1", Pod:"goldmane-54d579b49d-kcjr9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calidbfefa46a96", MAC:"92:c3:bf:20:a9:50", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:33.716829 containerd[1555]: 2025-09-12 00:26:33.711 [INFO][4155] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" Namespace="calico-system" Pod="goldmane-54d579b49d-kcjr9" 
WorkloadEndpoint="localhost-k8s-goldmane--54d579b49d--kcjr9-eth0" Sep 12 00:26:33.761937 containerd[1555]: time="2025-09-12T00:26:33.761895464Z" level=info msg="connecting to shim 77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1" address="unix:///run/containerd/s/0f0a9286a6ee184d7a2ffec89e56018797f3eda893f0fbd15690e790b46fe608" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:26:33.796834 systemd[1]: Started cri-containerd-77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1.scope - libcontainer container 77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1. Sep 12 00:26:33.810417 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:26:33.842418 containerd[1555]: time="2025-09-12T00:26:33.842368152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-kcjr9,Uid:12633e0e-0279-423e-8561-06b36e192c10,Namespace:calico-system,Attempt:0,} returns sandbox id \"77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1\"" Sep 12 00:26:34.329929 systemd-networkd[1493]: vxlan.calico: Gained IPv6LL Sep 12 00:26:34.403274 containerd[1555]: time="2025-09-12T00:26:34.403221007Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:34.403906 containerd[1555]: time="2025-09-12T00:26:34.403860117Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 00:26:34.405019 containerd[1555]: time="2025-09-12T00:26:34.404962897Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:34.406974 containerd[1555]: time="2025-09-12T00:26:34.406939237Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:34.407479 containerd[1555]: time="2025-09-12T00:26:34.407440608Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.583242006s" Sep 12 00:26:34.407479 containerd[1555]: time="2025-09-12T00:26:34.407468410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 00:26:34.408652 containerd[1555]: time="2025-09-12T00:26:34.408622417Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 00:26:34.410401 containerd[1555]: time="2025-09-12T00:26:34.409724305Z" level=info msg="CreateContainer within sandbox \"b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 00:26:34.418839 containerd[1555]: time="2025-09-12T00:26:34.418791736Z" level=info msg="Container c1d96b77e302013dd35d9caad8287d9d8cb95b06d1dace55df3ffa3e885ff08c: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:34.428102 containerd[1555]: time="2025-09-12T00:26:34.428060436Z" level=info msg="CreateContainer within sandbox \"b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"c1d96b77e302013dd35d9caad8287d9d8cb95b06d1dace55df3ffa3e885ff08c\"" Sep 12 00:26:34.428486 containerd[1555]: time="2025-09-12T00:26:34.428461409Z" level=info msg="StartContainer for 
\"c1d96b77e302013dd35d9caad8287d9d8cb95b06d1dace55df3ffa3e885ff08c\"" Sep 12 00:26:34.429469 containerd[1555]: time="2025-09-12T00:26:34.429430909Z" level=info msg="connecting to shim c1d96b77e302013dd35d9caad8287d9d8cb95b06d1dace55df3ffa3e885ff08c" address="unix:///run/containerd/s/382623ab76a4419800e60a1d37ae28bf5532d063c3f8c171b29e0aa6dfbbcd7e" protocol=ttrpc version=3 Sep 12 00:26:34.450833 systemd[1]: Started cri-containerd-c1d96b77e302013dd35d9caad8287d9d8cb95b06d1dace55df3ffa3e885ff08c.scope - libcontainer container c1d96b77e302013dd35d9caad8287d9d8cb95b06d1dace55df3ffa3e885ff08c. Sep 12 00:26:34.458817 systemd-networkd[1493]: cali7d615cfcd80: Gained IPv6LL Sep 12 00:26:34.500369 containerd[1555]: time="2025-09-12T00:26:34.500332147Z" level=info msg="StartContainer for \"c1d96b77e302013dd35d9caad8287d9d8cb95b06d1dace55df3ffa3e885ff08c\" returns successfully" Sep 12 00:26:34.605750 containerd[1555]: time="2025-09-12T00:26:34.604623416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75845c7647-hjrgt,Uid:f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef,Namespace:calico-system,Attempt:0,}" Sep 12 00:26:34.697386 systemd-networkd[1493]: cali0c61b485e16: Link UP Sep 12 00:26:34.697812 systemd-networkd[1493]: cali0c61b485e16: Gained carrier Sep 12 00:26:34.710772 containerd[1555]: 2025-09-12 00:26:34.641 [INFO][4278] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0 calico-kube-controllers-75845c7647- calico-system f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef 804 0 2025-09-12 00:26:08 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:75845c7647 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-75845c7647-hjrgt eth0 
calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali0c61b485e16 [] [] }} ContainerID="9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" Namespace="calico-system" Pod="calico-kube-controllers-75845c7647-hjrgt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-" Sep 12 00:26:34.710772 containerd[1555]: 2025-09-12 00:26:34.641 [INFO][4278] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" Namespace="calico-system" Pod="calico-kube-controllers-75845c7647-hjrgt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0" Sep 12 00:26:34.710772 containerd[1555]: 2025-09-12 00:26:34.664 [INFO][4292] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" HandleID="k8s-pod-network.9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" Workload="localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0" Sep 12 00:26:34.710976 containerd[1555]: 2025-09-12 00:26:34.664 [INFO][4292] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" HandleID="k8s-pod-network.9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" Workload="localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e6b0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-75845c7647-hjrgt", "timestamp":"2025-09-12 00:26:34.664450536 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:26:34.710976 
containerd[1555]: 2025-09-12 00:26:34.664 [INFO][4292] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:26:34.710976 containerd[1555]: 2025-09-12 00:26:34.664 [INFO][4292] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 00:26:34.710976 containerd[1555]: 2025-09-12 00:26:34.664 [INFO][4292] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:26:34.710976 containerd[1555]: 2025-09-12 00:26:34.671 [INFO][4292] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" host="localhost" Sep 12 00:26:34.710976 containerd[1555]: 2025-09-12 00:26:34.674 [INFO][4292] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:26:34.710976 containerd[1555]: 2025-09-12 00:26:34.678 [INFO][4292] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:26:34.710976 containerd[1555]: 2025-09-12 00:26:34.680 [INFO][4292] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:34.710976 containerd[1555]: 2025-09-12 00:26:34.682 [INFO][4292] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:34.710976 containerd[1555]: 2025-09-12 00:26:34.682 [INFO][4292] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" host="localhost" Sep 12 00:26:34.711223 containerd[1555]: 2025-09-12 00:26:34.683 [INFO][4292] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376 Sep 12 00:26:34.711223 containerd[1555]: 2025-09-12 00:26:34.686 [INFO][4292] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 
handle="k8s-pod-network.9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" host="localhost" Sep 12 00:26:34.711223 containerd[1555]: 2025-09-12 00:26:34.692 [INFO][4292] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" host="localhost" Sep 12 00:26:34.711223 containerd[1555]: 2025-09-12 00:26:34.692 [INFO][4292] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" host="localhost" Sep 12 00:26:34.711223 containerd[1555]: 2025-09-12 00:26:34.692 [INFO][4292] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:26:34.711223 containerd[1555]: 2025-09-12 00:26:34.692 [INFO][4292] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" HandleID="k8s-pod-network.9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" Workload="localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0" Sep 12 00:26:34.711341 containerd[1555]: 2025-09-12 00:26:34.695 [INFO][4278] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" Namespace="calico-system" Pod="calico-kube-controllers-75845c7647-hjrgt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0", GenerateName:"calico-kube-controllers-75845c7647-", Namespace:"calico-system", SelfLink:"", UID:"f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 26, 8, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"75845c7647", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-75845c7647-hjrgt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0c61b485e16", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:34.711390 containerd[1555]: 2025-09-12 00:26:34.695 [INFO][4278] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" Namespace="calico-system" Pod="calico-kube-controllers-75845c7647-hjrgt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0" Sep 12 00:26:34.711390 containerd[1555]: 2025-09-12 00:26:34.695 [INFO][4278] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0c61b485e16 ContainerID="9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" Namespace="calico-system" Pod="calico-kube-controllers-75845c7647-hjrgt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0" Sep 12 00:26:34.711390 containerd[1555]: 2025-09-12 00:26:34.698 [INFO][4278] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" Namespace="calico-system" Pod="calico-kube-controllers-75845c7647-hjrgt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0" Sep 12 00:26:34.711452 containerd[1555]: 2025-09-12 00:26:34.698 [INFO][4278] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" Namespace="calico-system" Pod="calico-kube-controllers-75845c7647-hjrgt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0", GenerateName:"calico-kube-controllers-75845c7647-", Namespace:"calico-system", SelfLink:"", UID:"f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 26, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"75845c7647", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376", Pod:"calico-kube-controllers-75845c7647-hjrgt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0c61b485e16", MAC:"c2:48:7f:52:f6:b1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:34.711499 containerd[1555]: 2025-09-12 00:26:34.705 [INFO][4278] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" Namespace="calico-system" Pod="calico-kube-controllers-75845c7647-hjrgt" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--75845c7647--hjrgt-eth0" Sep 12 00:26:34.737521 containerd[1555]: time="2025-09-12T00:26:34.737431307Z" level=info msg="connecting to shim 9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376" address="unix:///run/containerd/s/131a2ffc648a515bff4b804e636a7d9602b6015f052297aefd22fc3ebaa47c97" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:26:34.764841 systemd[1]: Started cri-containerd-9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376.scope - libcontainer container 9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376. 
Sep 12 00:26:34.778792 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:26:34.818729 containerd[1555]: time="2025-09-12T00:26:34.818656997Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75845c7647-hjrgt,Uid:f77b2b1f-f595-49ee-b3e7-e8d7bdc928ef,Namespace:calico-system,Attempt:0,} returns sandbox id \"9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376\"" Sep 12 00:26:35.353890 systemd-networkd[1493]: calidbfefa46a96: Gained IPv6LL Sep 12 00:26:35.603584 containerd[1555]: time="2025-09-12T00:26:35.603535533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-545b8468d6-b9fsq,Uid:235fa983-248c-4ae8-8cf2-57465025f544,Namespace:calico-apiserver,Attempt:0,}" Sep 12 00:26:35.691088 systemd-networkd[1493]: cali1f7fb514663: Link UP Sep 12 00:26:35.691488 systemd-networkd[1493]: cali1f7fb514663: Gained carrier Sep 12 00:26:35.709014 containerd[1555]: 2025-09-12 00:26:35.634 [INFO][4355] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0 calico-apiserver-545b8468d6- calico-apiserver 235fa983-248c-4ae8-8cf2-57465025f544 800 0 2025-09-12 00:26:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:545b8468d6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-545b8468d6-b9fsq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1f7fb514663 [] [] }} ContainerID="412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-b9fsq" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--b9fsq-" Sep 12 00:26:35.709014 
containerd[1555]: 2025-09-12 00:26:35.634 [INFO][4355] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-b9fsq" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0" Sep 12 00:26:35.709014 containerd[1555]: 2025-09-12 00:26:35.657 [INFO][4370] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" HandleID="k8s-pod-network.412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" Workload="localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0" Sep 12 00:26:35.709469 containerd[1555]: 2025-09-12 00:26:35.657 [INFO][4370] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" HandleID="k8s-pod-network.412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" Workload="localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001a5d80), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-545b8468d6-b9fsq", "timestamp":"2025-09-12 00:26:35.656993208 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:26:35.709469 containerd[1555]: 2025-09-12 00:26:35.657 [INFO][4370] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:26:35.709469 containerd[1555]: 2025-09-12 00:26:35.657 [INFO][4370] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 00:26:35.709469 containerd[1555]: 2025-09-12 00:26:35.657 [INFO][4370] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:26:35.709469 containerd[1555]: 2025-09-12 00:26:35.662 [INFO][4370] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" host="localhost" Sep 12 00:26:35.709469 containerd[1555]: 2025-09-12 00:26:35.665 [INFO][4370] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:26:35.709469 containerd[1555]: 2025-09-12 00:26:35.669 [INFO][4370] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:26:35.709469 containerd[1555]: 2025-09-12 00:26:35.672 [INFO][4370] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:35.709469 containerd[1555]: 2025-09-12 00:26:35.674 [INFO][4370] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:35.709469 containerd[1555]: 2025-09-12 00:26:35.674 [INFO][4370] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" host="localhost" Sep 12 00:26:35.709756 containerd[1555]: 2025-09-12 00:26:35.675 [INFO][4370] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126 Sep 12 00:26:35.709756 containerd[1555]: 2025-09-12 00:26:35.678 [INFO][4370] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" host="localhost" Sep 12 00:26:35.709756 containerd[1555]: 2025-09-12 00:26:35.685 [INFO][4370] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" host="localhost" Sep 12 00:26:35.709756 containerd[1555]: 2025-09-12 00:26:35.685 [INFO][4370] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" host="localhost" Sep 12 00:26:35.709756 containerd[1555]: 2025-09-12 00:26:35.685 [INFO][4370] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:26:35.709756 containerd[1555]: 2025-09-12 00:26:35.685 [INFO][4370] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" HandleID="k8s-pod-network.412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" Workload="localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0" Sep 12 00:26:35.709886 containerd[1555]: 2025-09-12 00:26:35.688 [INFO][4355] cni-plugin/k8s.go 418: Populated endpoint ContainerID="412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-b9fsq" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0", GenerateName:"calico-apiserver-545b8468d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"235fa983-248c-4ae8-8cf2-57465025f544", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 26, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"545b8468d6", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-545b8468d6-b9fsq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1f7fb514663", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:35.709940 containerd[1555]: 2025-09-12 00:26:35.688 [INFO][4355] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-b9fsq" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0" Sep 12 00:26:35.709940 containerd[1555]: 2025-09-12 00:26:35.688 [INFO][4355] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1f7fb514663 ContainerID="412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-b9fsq" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0" Sep 12 00:26:35.709940 containerd[1555]: 2025-09-12 00:26:35.691 [INFO][4355] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-b9fsq" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0" Sep 12 00:26:35.710004 containerd[1555]: 2025-09-12 00:26:35.692 [INFO][4355] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-b9fsq" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0", GenerateName:"calico-apiserver-545b8468d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"235fa983-248c-4ae8-8cf2-57465025f544", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 26, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"545b8468d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126", Pod:"calico-apiserver-545b8468d6-b9fsq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1f7fb514663", MAC:"76:96:a3:f0:68:0e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:35.710053 containerd[1555]: 2025-09-12 00:26:35.700 [INFO][4355] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-b9fsq" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--b9fsq-eth0" Sep 12 00:26:35.729289 containerd[1555]: time="2025-09-12T00:26:35.729249459Z" level=info msg="connecting to shim 412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126" address="unix:///run/containerd/s/46fbef29e8fae6c40b052ef787796a1dabfcef51691a5149d330391e80243b35" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:26:35.756820 systemd[1]: Started cri-containerd-412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126.scope - libcontainer container 412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126. Sep 12 00:26:35.770036 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:26:35.800270 containerd[1555]: time="2025-09-12T00:26:35.800226640Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-545b8468d6-b9fsq,Uid:235fa983-248c-4ae8-8cf2-57465025f544,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126\"" Sep 12 00:26:36.121950 systemd-networkd[1493]: cali0c61b485e16: Gained IPv6LL Sep 12 00:26:36.601890 containerd[1555]: time="2025-09-12T00:26:36.601844933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-8qdzn,Uid:0a998cb9-35da-4ff2-b590-14bd772b287b,Namespace:kube-system,Attempt:0,}" Sep 12 00:26:36.734616 systemd-networkd[1493]: calif57e89ba217: Link UP Sep 12 00:26:36.735252 systemd-networkd[1493]: calif57e89ba217: Gained carrier Sep 12 00:26:36.748418 containerd[1555]: 2025-09-12 00:26:36.665 [INFO][4436] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0 coredns-668d6bf9bc- kube-system 
0a998cb9-35da-4ff2-b590-14bd772b287b 796 0 2025-09-12 00:25:52 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-8qdzn eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif57e89ba217 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" Namespace="kube-system" Pod="coredns-668d6bf9bc-8qdzn" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--8qdzn-" Sep 12 00:26:36.748418 containerd[1555]: 2025-09-12 00:26:36.665 [INFO][4436] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" Namespace="kube-system" Pod="coredns-668d6bf9bc-8qdzn" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0" Sep 12 00:26:36.748418 containerd[1555]: 2025-09-12 00:26:36.696 [INFO][4451] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" HandleID="k8s-pod-network.d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" Workload="localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0" Sep 12 00:26:36.748811 containerd[1555]: 2025-09-12 00:26:36.696 [INFO][4451] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" HandleID="k8s-pod-network.d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" Workload="localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c71c0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-8qdzn", "timestamp":"2025-09-12 00:26:36.696188402 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:26:36.748811 containerd[1555]: 2025-09-12 00:26:36.696 [INFO][4451] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:26:36.748811 containerd[1555]: 2025-09-12 00:26:36.696 [INFO][4451] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 00:26:36.748811 containerd[1555]: 2025-09-12 00:26:36.696 [INFO][4451] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:26:36.748811 containerd[1555]: 2025-09-12 00:26:36.704 [INFO][4451] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" host="localhost" Sep 12 00:26:36.748811 containerd[1555]: 2025-09-12 00:26:36.707 [INFO][4451] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:26:36.748811 containerd[1555]: 2025-09-12 00:26:36.711 [INFO][4451] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:26:36.748811 containerd[1555]: 2025-09-12 00:26:36.713 [INFO][4451] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:36.748811 containerd[1555]: 2025-09-12 00:26:36.714 [INFO][4451] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:36.748811 containerd[1555]: 2025-09-12 00:26:36.715 [INFO][4451] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" host="localhost" Sep 12 00:26:36.749027 containerd[1555]: 2025-09-12 00:26:36.716 [INFO][4451] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251 Sep 12 00:26:36.749027 
containerd[1555]: 2025-09-12 00:26:36.719 [INFO][4451] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" host="localhost" Sep 12 00:26:36.749027 containerd[1555]: 2025-09-12 00:26:36.727 [INFO][4451] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" host="localhost" Sep 12 00:26:36.749027 containerd[1555]: 2025-09-12 00:26:36.727 [INFO][4451] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" host="localhost" Sep 12 00:26:36.749027 containerd[1555]: 2025-09-12 00:26:36.727 [INFO][4451] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:26:36.749027 containerd[1555]: 2025-09-12 00:26:36.727 [INFO][4451] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" HandleID="k8s-pod-network.d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" Workload="localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0" Sep 12 00:26:36.749141 containerd[1555]: 2025-09-12 00:26:36.731 [INFO][4436] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" Namespace="kube-system" Pod="coredns-668d6bf9bc-8qdzn" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"0a998cb9-35da-4ff2-b590-14bd772b287b", ResourceVersion:"796", Generation:0, 
CreationTimestamp:time.Date(2025, time.September, 12, 0, 25, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-8qdzn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif57e89ba217", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:36.749217 containerd[1555]: 2025-09-12 00:26:36.731 [INFO][4436] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" Namespace="kube-system" Pod="coredns-668d6bf9bc-8qdzn" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0" Sep 12 00:26:36.749217 containerd[1555]: 2025-09-12 00:26:36.731 [INFO][4436] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif57e89ba217 
ContainerID="d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" Namespace="kube-system" Pod="coredns-668d6bf9bc-8qdzn" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0" Sep 12 00:26:36.749217 containerd[1555]: 2025-09-12 00:26:36.735 [INFO][4436] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" Namespace="kube-system" Pod="coredns-668d6bf9bc-8qdzn" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0" Sep 12 00:26:36.749285 containerd[1555]: 2025-09-12 00:26:36.735 [INFO][4436] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" Namespace="kube-system" Pod="coredns-668d6bf9bc-8qdzn" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"0a998cb9-35da-4ff2-b590-14bd772b287b", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 25, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251", Pod:"coredns-668d6bf9bc-8qdzn", Endpoint:"eth0", ServiceAccountName:"coredns", 
IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif57e89ba217", MAC:"d6:5a:46:b1:7e:dd", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:36.749285 containerd[1555]: 2025-09-12 00:26:36.743 [INFO][4436] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" Namespace="kube-system" Pod="coredns-668d6bf9bc-8qdzn" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--8qdzn-eth0" Sep 12 00:26:36.778601 containerd[1555]: time="2025-09-12T00:26:36.778491902Z" level=info msg="connecting to shim d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251" address="unix:///run/containerd/s/e953e773d97d6b266b70962559557d453be08a5992d028e41c51b41882f82081" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:26:36.809568 systemd[1]: Started cri-containerd-d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251.scope - libcontainer container d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251. 
Sep 12 00:26:36.825348 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:26:36.864796 containerd[1555]: time="2025-09-12T00:26:36.864677740Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-8qdzn,Uid:0a998cb9-35da-4ff2-b590-14bd772b287b,Namespace:kube-system,Attempt:0,} returns sandbox id \"d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251\"" Sep 12 00:26:36.868596 containerd[1555]: time="2025-09-12T00:26:36.868558985Z" level=info msg="CreateContainer within sandbox \"d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 00:26:36.870738 kubelet[2697]: I0912 00:26:36.870687 2697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 00:26:36.887019 containerd[1555]: time="2025-09-12T00:26:36.885931572Z" level=info msg="Container 5ec5642537300b8a698ad2657704f5e15427e66451d3c8f1adf170733487ae80: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:36.887066 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2232999310.mount: Deactivated successfully. Sep 12 00:26:36.891476 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4272494480.mount: Deactivated successfully. 
Sep 12 00:26:36.906926 containerd[1555]: time="2025-09-12T00:26:36.906900842Z" level=info msg="CreateContainer within sandbox \"d538f8c6cf0174da50d2e3d061cbe123b2b82bfde022d986ed636bab32a0a251\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5ec5642537300b8a698ad2657704f5e15427e66451d3c8f1adf170733487ae80\"" Sep 12 00:26:36.907992 containerd[1555]: time="2025-09-12T00:26:36.907956333Z" level=info msg="StartContainer for \"5ec5642537300b8a698ad2657704f5e15427e66451d3c8f1adf170733487ae80\"" Sep 12 00:26:36.908810 containerd[1555]: time="2025-09-12T00:26:36.908729223Z" level=info msg="connecting to shim 5ec5642537300b8a698ad2657704f5e15427e66451d3c8f1adf170733487ae80" address="unix:///run/containerd/s/e953e773d97d6b266b70962559557d453be08a5992d028e41c51b41882f82081" protocol=ttrpc version=3 Sep 12 00:26:36.943618 systemd[1]: Started cri-containerd-5ec5642537300b8a698ad2657704f5e15427e66451d3c8f1adf170733487ae80.scope - libcontainer container 5ec5642537300b8a698ad2657704f5e15427e66451d3c8f1adf170733487ae80. 
Sep 12 00:26:37.093725 containerd[1555]: time="2025-09-12T00:26:37.092914791Z" level=info msg="StartContainer for \"5ec5642537300b8a698ad2657704f5e15427e66451d3c8f1adf170733487ae80\" returns successfully" Sep 12 00:26:37.099364 containerd[1555]: time="2025-09-12T00:26:37.099314693Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c24a0c8575976a2178bd8352bf2a3f439bd49fbc9af41b7bb2585d48f1288de2\" id:\"05c7a95b9c2d5fb4eea7e8c4ef91dbb2aa36b8b7878251cc94ee100eeea0c243\" pid:4532 exited_at:{seconds:1757636797 nanos:98943386}" Sep 12 00:26:37.265769 containerd[1555]: time="2025-09-12T00:26:37.265728033Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c24a0c8575976a2178bd8352bf2a3f439bd49fbc9af41b7bb2585d48f1288de2\" id:\"189f5be23b3227e968d371870fd33b59e9f8f29f1c7ce14c0a7a1b1a34959202\" pid:4590 exited_at:{seconds:1757636797 nanos:265345837}" Sep 12 00:26:37.499448 containerd[1555]: time="2025-09-12T00:26:37.499387966Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:37.500097 containerd[1555]: time="2025-09-12T00:26:37.500038737Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 00:26:37.501193 containerd[1555]: time="2025-09-12T00:26:37.501157396Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:37.503295 containerd[1555]: time="2025-09-12T00:26:37.503252328Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:37.510536 containerd[1555]: time="2025-09-12T00:26:37.510511072Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" 
with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.101834703s" Sep 12 00:26:37.510588 containerd[1555]: time="2025-09-12T00:26:37.510537291Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 00:26:37.511431 containerd[1555]: time="2025-09-12T00:26:37.511411532Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 00:26:37.512498 containerd[1555]: time="2025-09-12T00:26:37.512471000Z" level=info msg="CreateContainer within sandbox \"77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 00:26:37.520169 containerd[1555]: time="2025-09-12T00:26:37.520078318Z" level=info msg="Container 7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:37.528112 containerd[1555]: time="2025-09-12T00:26:37.528055691Z" level=info msg="CreateContainer within sandbox \"77af74f06249d9f671ef43367973289120283be14d22d4d154facc438e3cb1f1\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91\"" Sep 12 00:26:37.528464 containerd[1555]: time="2025-09-12T00:26:37.528435793Z" level=info msg="StartContainer for \"7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91\"" Sep 12 00:26:37.529631 containerd[1555]: time="2025-09-12T00:26:37.529606772Z" level=info msg="connecting to shim 7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91" address="unix:///run/containerd/s/0f0a9286a6ee184d7a2ffec89e56018797f3eda893f0fbd15690e790b46fe608" protocol=ttrpc 
version=3 Sep 12 00:26:37.529892 systemd-networkd[1493]: cali1f7fb514663: Gained IPv6LL Sep 12 00:26:37.564912 systemd[1]: Started cri-containerd-7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91.scope - libcontainer container 7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91. Sep 12 00:26:37.604890 containerd[1555]: time="2025-09-12T00:26:37.604842286Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-545b8468d6-d6cqw,Uid:02e48ce0-416b-41b3-bb94-8abf9878b10e,Namespace:calico-apiserver,Attempt:0,}" Sep 12 00:26:37.694493 containerd[1555]: time="2025-09-12T00:26:37.694458281Z" level=info msg="StartContainer for \"7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91\" returns successfully" Sep 12 00:26:37.775136 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount285673204.mount: Deactivated successfully. Sep 12 00:26:37.794419 systemd-networkd[1493]: cali91b6fdf46ac: Link UP Sep 12 00:26:37.795114 systemd-networkd[1493]: cali91b6fdf46ac: Gained carrier Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.725 [INFO][4651] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0 calico-apiserver-545b8468d6- calico-apiserver 02e48ce0-416b-41b3-bb94-8abf9878b10e 807 0 2025-09-12 00:26:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:545b8468d6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-545b8468d6-d6cqw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali91b6fdf46ac [] [] }} ContainerID="470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-d6cqw" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--d6cqw-" Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.725 [INFO][4651] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-d6cqw" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0" Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.751 [INFO][4665] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" HandleID="k8s-pod-network.470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" Workload="localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0" Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.751 [INFO][4665] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" HandleID="k8s-pod-network.470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" Workload="localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f640), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-545b8468d6-d6cqw", "timestamp":"2025-09-12 00:26:37.751298501 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.751 [INFO][4665] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.751 [INFO][4665] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.751 [INFO][4665] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.759 [INFO][4665] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" host="localhost" Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.764 [INFO][4665] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.771 [INFO][4665] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.773 [INFO][4665] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.776 [INFO][4665] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.776 [INFO][4665] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" host="localhost" Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.778 [INFO][4665] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.782 [INFO][4665] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" host="localhost" Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.789 [INFO][4665] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" host="localhost" Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.789 [INFO][4665] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" host="localhost" Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.789 [INFO][4665] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:26:37.814736 containerd[1555]: 2025-09-12 00:26:37.789 [INFO][4665] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" HandleID="k8s-pod-network.470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" Workload="localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0" Sep 12 00:26:37.815510 containerd[1555]: 2025-09-12 00:26:37.792 [INFO][4651] cni-plugin/k8s.go 418: Populated endpoint ContainerID="470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-d6cqw" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0", GenerateName:"calico-apiserver-545b8468d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"02e48ce0-416b-41b3-bb94-8abf9878b10e", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 26, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"545b8468d6", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-545b8468d6-d6cqw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali91b6fdf46ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:37.815510 containerd[1555]: 2025-09-12 00:26:37.792 [INFO][4651] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-d6cqw" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0" Sep 12 00:26:37.815510 containerd[1555]: 2025-09-12 00:26:37.792 [INFO][4651] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali91b6fdf46ac ContainerID="470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-d6cqw" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0" Sep 12 00:26:37.815510 containerd[1555]: 2025-09-12 00:26:37.794 [INFO][4651] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-d6cqw" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0" Sep 12 00:26:37.815510 containerd[1555]: 2025-09-12 00:26:37.796 [INFO][4651] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-d6cqw" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0", GenerateName:"calico-apiserver-545b8468d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"02e48ce0-416b-41b3-bb94-8abf9878b10e", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 26, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"545b8468d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d", Pod:"calico-apiserver-545b8468d6-d6cqw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali91b6fdf46ac", MAC:"b2:e8:79:b9:97:10", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:37.815510 containerd[1555]: 2025-09-12 00:26:37.809 [INFO][4651] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" Namespace="calico-apiserver" Pod="calico-apiserver-545b8468d6-d6cqw" WorkloadEndpoint="localhost-k8s-calico--apiserver--545b8468d6--d6cqw-eth0" Sep 12 00:26:37.842482 containerd[1555]: time="2025-09-12T00:26:37.842436753Z" level=info msg="connecting to shim 470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d" address="unix:///run/containerd/s/127e0b2f011839b36e7088ec890efc4de637a0a755b2573227819d5d3586985c" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:26:37.884957 systemd[1]: Started cri-containerd-470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d.scope - libcontainer container 470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d. Sep 12 00:26:37.897292 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:26:37.942487 containerd[1555]: time="2025-09-12T00:26:37.942263151Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-545b8468d6-d6cqw,Uid:02e48ce0-416b-41b3-bb94-8abf9878b10e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d\"" Sep 12 00:26:37.944311 kubelet[2697]: I0912 00:26:37.944253 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-kcjr9" podStartSLOduration=27.27646953 podStartE2EDuration="30.944233438s" podCreationTimestamp="2025-09-12 00:26:07 +0000 UTC" firstStartedPulling="2025-09-12 00:26:33.843454652 +0000 UTC m=+46.536721001" lastFinishedPulling="2025-09-12 00:26:37.51121856 +0000 UTC m=+50.204484909" observedRunningTime="2025-09-12 00:26:37.926502431 +0000 UTC m=+50.619768780" watchObservedRunningTime="2025-09-12 00:26:37.944233438 +0000 UTC m=+50.637499787" Sep 12 00:26:38.014184 containerd[1555]: time="2025-09-12T00:26:38.014138819Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91\" id:\"921750b8a9afc44f798c2b81738e6242ca27aae006d9570c73db0333bac19914\" pid:4745 exit_status:1 exited_at:{seconds:1757636798 nanos:13733939}" Sep 12 00:26:38.170912 systemd-networkd[1493]: calif57e89ba217: Gained IPv6LL Sep 12 00:26:38.601745 containerd[1555]: time="2025-09-12T00:26:38.601686283Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-z6x5p,Uid:39f8040c-9434-4de5-b6ff-a6491073dd00,Namespace:kube-system,Attempt:0,}" Sep 12 00:26:38.715778 systemd-networkd[1493]: cali77e4ae54f8f: Link UP Sep 12 00:26:38.717810 systemd-networkd[1493]: cali77e4ae54f8f: Gained carrier Sep 12 00:26:38.729662 kubelet[2697]: I0912 00:26:38.729607 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-8qdzn" podStartSLOduration=46.729590294 podStartE2EDuration="46.729590294s" podCreationTimestamp="2025-09-12 00:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:26:37.946300538 +0000 UTC m=+50.639566877" watchObservedRunningTime="2025-09-12 00:26:38.729590294 +0000 UTC m=+51.422856643" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.642 [INFO][4766] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0 coredns-668d6bf9bc- kube-system 39f8040c-9434-4de5-b6ff-a6491073dd00 806 0 2025-09-12 00:25:52 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-z6x5p eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali77e4ae54f8f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" Namespace="kube-system" Pod="coredns-668d6bf9bc-z6x5p" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--z6x5p-" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.642 [INFO][4766] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" Namespace="kube-system" Pod="coredns-668d6bf9bc-z6x5p" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.676 [INFO][4780] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" HandleID="k8s-pod-network.138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" Workload="localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.676 [INFO][4780] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" HandleID="k8s-pod-network.138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" Workload="localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7270), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-z6x5p", "timestamp":"2025-09-12 00:26:38.676680356 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.677 [INFO][4780] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.677 [INFO][4780] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.677 [INFO][4780] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.685 [INFO][4780] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" host="localhost" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.689 [INFO][4780] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.693 [INFO][4780] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.695 [INFO][4780] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.697 [INFO][4780] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.697 [INFO][4780] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" host="localhost" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.698 [INFO][4780] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10 Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.702 [INFO][4780] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" host="localhost" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.707 [INFO][4780] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" host="localhost" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.707 [INFO][4780] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" host="localhost" Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.707 [INFO][4780] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:26:38.733041 containerd[1555]: 2025-09-12 00:26:38.707 [INFO][4780] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" HandleID="k8s-pod-network.138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" Workload="localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0" Sep 12 00:26:38.733530 containerd[1555]: 2025-09-12 00:26:38.711 [INFO][4766] cni-plugin/k8s.go 418: Populated endpoint ContainerID="138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" Namespace="kube-system" Pod="coredns-668d6bf9bc-z6x5p" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"39f8040c-9434-4de5-b6ff-a6491073dd00", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 25, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-z6x5p", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali77e4ae54f8f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:38.733530 containerd[1555]: 2025-09-12 00:26:38.712 [INFO][4766] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" Namespace="kube-system" Pod="coredns-668d6bf9bc-z6x5p" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0" Sep 12 00:26:38.733530 containerd[1555]: 2025-09-12 00:26:38.712 [INFO][4766] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali77e4ae54f8f ContainerID="138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" Namespace="kube-system" Pod="coredns-668d6bf9bc-z6x5p" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0" Sep 12 00:26:38.733530 containerd[1555]: 2025-09-12 00:26:38.717 [INFO][4766] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" Namespace="kube-system" Pod="coredns-668d6bf9bc-z6x5p" 
WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0" Sep 12 00:26:38.733530 containerd[1555]: 2025-09-12 00:26:38.718 [INFO][4766] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" Namespace="kube-system" Pod="coredns-668d6bf9bc-z6x5p" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"39f8040c-9434-4de5-b6ff-a6491073dd00", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 25, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10", Pod:"coredns-668d6bf9bc-z6x5p", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali77e4ae54f8f", MAC:"9e:f7:5d:d8:4b:11", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:38.733530 containerd[1555]: 2025-09-12 00:26:38.727 [INFO][4766] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" Namespace="kube-system" Pod="coredns-668d6bf9bc-z6x5p" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--z6x5p-eth0" Sep 12 00:26:38.835532 containerd[1555]: time="2025-09-12T00:26:38.835480930Z" level=info msg="connecting to shim 138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10" address="unix:///run/containerd/s/667b7ca12f91b287140fa7bcf307e8302e57cfff7c4c36c70318e90d9e40c879" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:26:38.876812 systemd[1]: Started cri-containerd-138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10.scope - libcontainer container 138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10. 
Sep 12 00:26:38.931007 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:26:39.001901 systemd-networkd[1493]: cali91b6fdf46ac: Gained IPv6LL Sep 12 00:26:39.035501 containerd[1555]: time="2025-09-12T00:26:39.035465723Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91\" id:\"8c5d21e676523f8eca604d396bcf34bd854fcfd8d8f6bd1c271b9e647248d87f\" pid:4846 exit_status:1 exited_at:{seconds:1757636799 nanos:35151995}" Sep 12 00:26:39.105599 containerd[1555]: time="2025-09-12T00:26:39.105565895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-z6x5p,Uid:39f8040c-9434-4de5-b6ff-a6491073dd00,Namespace:kube-system,Attempt:0,} returns sandbox id \"138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10\"" Sep 12 00:26:39.108196 containerd[1555]: time="2025-09-12T00:26:39.108148000Z" level=info msg="CreateContainer within sandbox \"138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 00:26:39.118907 containerd[1555]: time="2025-09-12T00:26:39.118878647Z" level=info msg="Container 3c57de139f3c600eadfc3ed790ecc3e5e9795680d2ab3861a0e0a5ee1c2d066b: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:39.127645 containerd[1555]: time="2025-09-12T00:26:39.127612066Z" level=info msg="CreateContainer within sandbox \"138dc1a6302648079689d9f18b8dc91fe6f24fb868db66bb81e50ee51065da10\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3c57de139f3c600eadfc3ed790ecc3e5e9795680d2ab3861a0e0a5ee1c2d066b\"" Sep 12 00:26:39.128675 containerd[1555]: time="2025-09-12T00:26:39.128300137Z" level=info msg="StartContainer for \"3c57de139f3c600eadfc3ed790ecc3e5e9795680d2ab3861a0e0a5ee1c2d066b\"" Sep 12 00:26:39.130998 containerd[1555]: time="2025-09-12T00:26:39.130976130Z" level=info msg="connecting 
to shim 3c57de139f3c600eadfc3ed790ecc3e5e9795680d2ab3861a0e0a5ee1c2d066b" address="unix:///run/containerd/s/667b7ca12f91b287140fa7bcf307e8302e57cfff7c4c36c70318e90d9e40c879" protocol=ttrpc version=3 Sep 12 00:26:39.158822 systemd[1]: Started cri-containerd-3c57de139f3c600eadfc3ed790ecc3e5e9795680d2ab3861a0e0a5ee1c2d066b.scope - libcontainer container 3c57de139f3c600eadfc3ed790ecc3e5e9795680d2ab3861a0e0a5ee1c2d066b. Sep 12 00:26:39.201265 containerd[1555]: time="2025-09-12T00:26:39.201110204Z" level=info msg="StartContainer for \"3c57de139f3c600eadfc3ed790ecc3e5e9795680d2ab3861a0e0a5ee1c2d066b\" returns successfully" Sep 12 00:26:39.607638 containerd[1555]: time="2025-09-12T00:26:39.607521924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bkq9f,Uid:87a73b77-6f47-40e3-9295-51af59f40cde,Namespace:calico-system,Attempt:0,}" Sep 12 00:26:39.769382 systemd-networkd[1493]: cali1e1e671b8fc: Link UP Sep 12 00:26:39.769897 systemd-networkd[1493]: cali1e1e671b8fc: Gained carrier Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.681 [INFO][4904] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--bkq9f-eth0 csi-node-driver- calico-system 87a73b77-6f47-40e3-9295-51af59f40cde 686 0 2025-09-12 00:26:08 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-bkq9f eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali1e1e671b8fc [] [] }} ContainerID="08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" Namespace="calico-system" Pod="csi-node-driver-bkq9f" WorkloadEndpoint="localhost-k8s-csi--node--driver--bkq9f-" Sep 12 00:26:39.787397 
containerd[1555]: 2025-09-12 00:26:39.682 [INFO][4904] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" Namespace="calico-system" Pod="csi-node-driver-bkq9f" WorkloadEndpoint="localhost-k8s-csi--node--driver--bkq9f-eth0" Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.720 [INFO][4918] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" HandleID="k8s-pod-network.08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" Workload="localhost-k8s-csi--node--driver--bkq9f-eth0" Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.720 [INFO][4918] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" HandleID="k8s-pod-network.08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" Workload="localhost-k8s-csi--node--driver--bkq9f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000138e30), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-bkq9f", "timestamp":"2025-09-12 00:26:39.720717037 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.721 [INFO][4918] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.723 [INFO][4918] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.723 [INFO][4918] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.731 [INFO][4918] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" host="localhost" Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.736 [INFO][4918] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.741 [INFO][4918] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.743 [INFO][4918] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.745 [INFO][4918] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.745 [INFO][4918] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" host="localhost" Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.746 [INFO][4918] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.750 [INFO][4918] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" host="localhost" Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.756 [INFO][4918] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" host="localhost" Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.756 [INFO][4918] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" host="localhost" Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.756 [INFO][4918] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 00:26:39.787397 containerd[1555]: 2025-09-12 00:26:39.756 [INFO][4918] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" HandleID="k8s-pod-network.08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" Workload="localhost-k8s-csi--node--driver--bkq9f-eth0" Sep 12 00:26:39.788155 containerd[1555]: 2025-09-12 00:26:39.763 [INFO][4904] cni-plugin/k8s.go 418: Populated endpoint ContainerID="08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" Namespace="calico-system" Pod="csi-node-driver-bkq9f" WorkloadEndpoint="localhost-k8s-csi--node--driver--bkq9f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--bkq9f-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"87a73b77-6f47-40e3-9295-51af59f40cde", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 26, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-bkq9f", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1e1e671b8fc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:39.788155 containerd[1555]: 2025-09-12 00:26:39.764 [INFO][4904] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" Namespace="calico-system" Pod="csi-node-driver-bkq9f" WorkloadEndpoint="localhost-k8s-csi--node--driver--bkq9f-eth0" Sep 12 00:26:39.788155 containerd[1555]: 2025-09-12 00:26:39.764 [INFO][4904] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1e1e671b8fc ContainerID="08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" Namespace="calico-system" Pod="csi-node-driver-bkq9f" WorkloadEndpoint="localhost-k8s-csi--node--driver--bkq9f-eth0" Sep 12 00:26:39.788155 containerd[1555]: 2025-09-12 00:26:39.770 [INFO][4904] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" Namespace="calico-system" Pod="csi-node-driver-bkq9f" WorkloadEndpoint="localhost-k8s-csi--node--driver--bkq9f-eth0" Sep 12 00:26:39.788155 containerd[1555]: 2025-09-12 00:26:39.772 [INFO][4904] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" 
Namespace="calico-system" Pod="csi-node-driver-bkq9f" WorkloadEndpoint="localhost-k8s-csi--node--driver--bkq9f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--bkq9f-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"87a73b77-6f47-40e3-9295-51af59f40cde", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 0, 26, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a", Pod:"csi-node-driver-bkq9f", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1e1e671b8fc", MAC:"02:06:d1:fd:ce:ac", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 00:26:39.788155 containerd[1555]: 2025-09-12 00:26:39.783 [INFO][4904] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" Namespace="calico-system" Pod="csi-node-driver-bkq9f" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--bkq9f-eth0" Sep 12 00:26:39.815742 containerd[1555]: time="2025-09-12T00:26:39.815687059Z" level=info msg="connecting to shim 08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a" address="unix:///run/containerd/s/62e6d24b6f34a51c95ac6ce5cd8aa8e959f266869f453afdd0d7dc50e809a69e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 00:26:39.881956 systemd[1]: Started cri-containerd-08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a.scope - libcontainer container 08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a. Sep 12 00:26:39.897870 systemd-networkd[1493]: cali77e4ae54f8f: Gained IPv6LL Sep 12 00:26:39.951563 kubelet[2697]: I0912 00:26:39.951356 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-z6x5p" podStartSLOduration=47.951340799 podStartE2EDuration="47.951340799s" podCreationTimestamp="2025-09-12 00:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 00:26:39.949867665 +0000 UTC m=+52.643134014" watchObservedRunningTime="2025-09-12 00:26:39.951340799 +0000 UTC m=+52.644607148" Sep 12 00:26:40.012781 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 12 00:26:40.033587 containerd[1555]: time="2025-09-12T00:26:40.033396603Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bkq9f,Uid:87a73b77-6f47-40e3-9295-51af59f40cde,Namespace:calico-system,Attempt:0,} returns sandbox id \"08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a\"" Sep 12 00:26:40.082677 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1752027319.mount: Deactivated successfully. 
Sep 12 00:26:40.104793 containerd[1555]: time="2025-09-12T00:26:40.104749548Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:40.106329 containerd[1555]: time="2025-09-12T00:26:40.106157129Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 00:26:40.107456 containerd[1555]: time="2025-09-12T00:26:40.107416232Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:40.109906 containerd[1555]: time="2025-09-12T00:26:40.109879144Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:40.111591 containerd[1555]: time="2025-09-12T00:26:40.111408503Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.599972094s" Sep 12 00:26:40.111669 containerd[1555]: time="2025-09-12T00:26:40.111654946Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 00:26:40.113374 containerd[1555]: time="2025-09-12T00:26:40.113328516Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 00:26:40.114110 containerd[1555]: time="2025-09-12T00:26:40.113858030Z" level=info msg="CreateContainer within sandbox 
\"b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 00:26:40.123132 containerd[1555]: time="2025-09-12T00:26:40.123012909Z" level=info msg="Container 0684629396d0cfd75907ca0e67d3feeb1203215ce493ba6298aa996218323add: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:40.124031 containerd[1555]: time="2025-09-12T00:26:40.124003529Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91\" id:\"5c322020bfc3d470e4cb2bdea06e7fc60d9cf615483e78cc844b8f2207402cb0\" pid:4992 exit_status:1 exited_at:{seconds:1757636800 nanos:123618686}" Sep 12 00:26:40.128905 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3099565940.mount: Deactivated successfully. Sep 12 00:26:40.134731 containerd[1555]: time="2025-09-12T00:26:40.134649145Z" level=info msg="CreateContainer within sandbox \"b4c40520792f19aa5958ecaded0c5d2428f71644b0018298d18ec01dcdf9697b\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"0684629396d0cfd75907ca0e67d3feeb1203215ce493ba6298aa996218323add\"" Sep 12 00:26:40.135723 containerd[1555]: time="2025-09-12T00:26:40.135292673Z" level=info msg="StartContainer for \"0684629396d0cfd75907ca0e67d3feeb1203215ce493ba6298aa996218323add\"" Sep 12 00:26:40.136306 containerd[1555]: time="2025-09-12T00:26:40.136275377Z" level=info msg="connecting to shim 0684629396d0cfd75907ca0e67d3feeb1203215ce493ba6298aa996218323add" address="unix:///run/containerd/s/382623ab76a4419800e60a1d37ae28bf5532d063c3f8c171b29e0aa6dfbbcd7e" protocol=ttrpc version=3 Sep 12 00:26:40.160849 systemd[1]: Started cri-containerd-0684629396d0cfd75907ca0e67d3feeb1203215ce493ba6298aa996218323add.scope - libcontainer container 0684629396d0cfd75907ca0e67d3feeb1203215ce493ba6298aa996218323add. 
Sep 12 00:26:40.225363 containerd[1555]: time="2025-09-12T00:26:40.225246342Z" level=info msg="StartContainer for \"0684629396d0cfd75907ca0e67d3feeb1203215ce493ba6298aa996218323add\" returns successfully" Sep 12 00:26:40.941405 kubelet[2697]: I0912 00:26:40.941341 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-86b9f779f9-8w7t8" podStartSLOduration=2.651955701 podStartE2EDuration="9.941326838s" podCreationTimestamp="2025-09-12 00:26:31 +0000 UTC" firstStartedPulling="2025-09-12 00:26:32.823263506 +0000 UTC m=+45.516529856" lastFinishedPulling="2025-09-12 00:26:40.112634644 +0000 UTC m=+52.805900993" observedRunningTime="2025-09-12 00:26:40.940845625 +0000 UTC m=+53.634111974" watchObservedRunningTime="2025-09-12 00:26:40.941326838 +0000 UTC m=+53.634593187" Sep 12 00:26:41.433861 systemd-networkd[1493]: cali1e1e671b8fc: Gained IPv6LL Sep 12 00:26:43.908711 containerd[1555]: time="2025-09-12T00:26:43.908641909Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:43.909580 containerd[1555]: time="2025-09-12T00:26:43.909548089Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 12 00:26:43.910843 containerd[1555]: time="2025-09-12T00:26:43.910811569Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:43.912779 containerd[1555]: time="2025-09-12T00:26:43.912739637Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:43.913286 containerd[1555]: time="2025-09-12T00:26:43.913241760Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.79927231s" Sep 12 00:26:43.913286 containerd[1555]: time="2025-09-12T00:26:43.913281624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 12 00:26:43.917730 containerd[1555]: time="2025-09-12T00:26:43.917691588Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 00:26:43.927903 containerd[1555]: time="2025-09-12T00:26:43.927865848Z" level=info msg="CreateContainer within sandbox \"9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 00:26:43.935872 containerd[1555]: time="2025-09-12T00:26:43.935826724Z" level=info msg="Container 0fb5c41e4cd11b25b88fca68a0675a827b5e23cf4ccdfb0cee092abe118b7ac7: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:43.949424 containerd[1555]: time="2025-09-12T00:26:43.949375746Z" level=info msg="CreateContainer within sandbox \"9a11e257540700309e56d73fb286cdf63da5415ea5234dfb717a8dc1491f4376\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"0fb5c41e4cd11b25b88fca68a0675a827b5e23cf4ccdfb0cee092abe118b7ac7\"" Sep 12 00:26:43.950720 containerd[1555]: time="2025-09-12T00:26:43.949817865Z" level=info msg="StartContainer for \"0fb5c41e4cd11b25b88fca68a0675a827b5e23cf4ccdfb0cee092abe118b7ac7\"" Sep 12 00:26:43.950881 containerd[1555]: time="2025-09-12T00:26:43.950840504Z" level=info msg="connecting to shim 0fb5c41e4cd11b25b88fca68a0675a827b5e23cf4ccdfb0cee092abe118b7ac7" 
address="unix:///run/containerd/s/131a2ffc648a515bff4b804e636a7d9602b6015f052297aefd22fc3ebaa47c97" protocol=ttrpc version=3 Sep 12 00:26:43.972838 systemd[1]: Started cri-containerd-0fb5c41e4cd11b25b88fca68a0675a827b5e23cf4ccdfb0cee092abe118b7ac7.scope - libcontainer container 0fb5c41e4cd11b25b88fca68a0675a827b5e23cf4ccdfb0cee092abe118b7ac7. Sep 12 00:26:44.025391 containerd[1555]: time="2025-09-12T00:26:44.025352796Z" level=info msg="StartContainer for \"0fb5c41e4cd11b25b88fca68a0675a827b5e23cf4ccdfb0cee092abe118b7ac7\" returns successfully" Sep 12 00:26:44.959746 kubelet[2697]: I0912 00:26:44.959471 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-75845c7647-hjrgt" podStartSLOduration=27.861921095 podStartE2EDuration="36.959457835s" podCreationTimestamp="2025-09-12 00:26:08 +0000 UTC" firstStartedPulling="2025-09-12 00:26:34.820003575 +0000 UTC m=+47.513269924" lastFinishedPulling="2025-09-12 00:26:43.917540325 +0000 UTC m=+56.610806664" observedRunningTime="2025-09-12 00:26:44.955435409 +0000 UTC m=+57.648701768" watchObservedRunningTime="2025-09-12 00:26:44.959457835 +0000 UTC m=+57.652724184" Sep 12 00:26:44.996786 containerd[1555]: time="2025-09-12T00:26:44.995632666Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0fb5c41e4cd11b25b88fca68a0675a827b5e23cf4ccdfb0cee092abe118b7ac7\" id:\"f19848e3c8a18d32d38b7692a34337c0c3f9dd96450e54daa8620c6c08191266\" pid:5121 exited_at:{seconds:1757636804 nanos:994748828}" Sep 12 00:26:47.532136 containerd[1555]: time="2025-09-12T00:26:47.532065953Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:47.533467 containerd[1555]: time="2025-09-12T00:26:47.533403302Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 00:26:47.534629 containerd[1555]: 
time="2025-09-12T00:26:47.534600688Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:47.537484 containerd[1555]: time="2025-09-12T00:26:47.537042118Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:47.537866 containerd[1555]: time="2025-09-12T00:26:47.537835857Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.620099144s" Sep 12 00:26:47.537933 containerd[1555]: time="2025-09-12T00:26:47.537868278Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 00:26:47.551424 containerd[1555]: time="2025-09-12T00:26:47.551383492Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 00:26:47.561667 containerd[1555]: time="2025-09-12T00:26:47.561621689Z" level=info msg="CreateContainer within sandbox \"412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 00:26:47.572719 containerd[1555]: time="2025-09-12T00:26:47.572649316Z" level=info msg="Container 2acb5788d1340a039c5bcba7e28d906bd8c3a7690603fabfdd43fda2c11ff0af: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:47.581357 containerd[1555]: time="2025-09-12T00:26:47.581304132Z" level=info msg="CreateContainer within sandbox 
\"412d802098a8ad01e67be3fd876e4268fac9e27496f04ea6c2291eabc4c00126\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2acb5788d1340a039c5bcba7e28d906bd8c3a7690603fabfdd43fda2c11ff0af\"" Sep 12 00:26:47.582752 containerd[1555]: time="2025-09-12T00:26:47.582473376Z" level=info msg="StartContainer for \"2acb5788d1340a039c5bcba7e28d906bd8c3a7690603fabfdd43fda2c11ff0af\"" Sep 12 00:26:47.584136 containerd[1555]: time="2025-09-12T00:26:47.583645204Z" level=info msg="connecting to shim 2acb5788d1340a039c5bcba7e28d906bd8c3a7690603fabfdd43fda2c11ff0af" address="unix:///run/containerd/s/46fbef29e8fae6c40b052ef787796a1dabfcef51691a5149d330391e80243b35" protocol=ttrpc version=3 Sep 12 00:26:47.634125 systemd[1]: Started cri-containerd-2acb5788d1340a039c5bcba7e28d906bd8c3a7690603fabfdd43fda2c11ff0af.scope - libcontainer container 2acb5788d1340a039c5bcba7e28d906bd8c3a7690603fabfdd43fda2c11ff0af. Sep 12 00:26:47.685687 containerd[1555]: time="2025-09-12T00:26:47.685645824Z" level=info msg="StartContainer for \"2acb5788d1340a039c5bcba7e28d906bd8c3a7690603fabfdd43fda2c11ff0af\" returns successfully" Sep 12 00:26:48.113844 kubelet[2697]: I0912 00:26:48.113665 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-545b8468d6-b9fsq" podStartSLOduration=32.364188355 podStartE2EDuration="44.113643464s" podCreationTimestamp="2025-09-12 00:26:04 +0000 UTC" firstStartedPulling="2025-09-12 00:26:35.801583547 +0000 UTC m=+48.494849896" lastFinishedPulling="2025-09-12 00:26:47.551038656 +0000 UTC m=+60.244305005" observedRunningTime="2025-09-12 00:26:48.113330207 +0000 UTC m=+60.806596576" watchObservedRunningTime="2025-09-12 00:26:48.113643464 +0000 UTC m=+60.806909813" Sep 12 00:26:48.186839 containerd[1555]: time="2025-09-12T00:26:48.186795792Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 
00:26:48.188583 containerd[1555]: time="2025-09-12T00:26:48.188554370Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 00:26:48.190059 containerd[1555]: time="2025-09-12T00:26:48.190036702Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 638.615669ms" Sep 12 00:26:48.190499 containerd[1555]: time="2025-09-12T00:26:48.190061548Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 00:26:48.192063 containerd[1555]: time="2025-09-12T00:26:48.192042193Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 00:26:48.195248 containerd[1555]: time="2025-09-12T00:26:48.195164300Z" level=info msg="CreateContainer within sandbox \"470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 00:26:48.206721 containerd[1555]: time="2025-09-12T00:26:48.206071240Z" level=info msg="Container a505676622c32c7390d3adcadf7cb48aae4e65c227f58b5742d41462c5ad65de: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:48.214551 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3416693261.mount: Deactivated successfully. 
Sep 12 00:26:48.217123 containerd[1555]: time="2025-09-12T00:26:48.217073640Z" level=info msg="CreateContainer within sandbox \"470721ee2a95b41e2ea9034db0ea32555aa6b409cec42e89596f76c6e188883d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a505676622c32c7390d3adcadf7cb48aae4e65c227f58b5742d41462c5ad65de\"" Sep 12 00:26:48.218062 containerd[1555]: time="2025-09-12T00:26:48.218026277Z" level=info msg="StartContainer for \"a505676622c32c7390d3adcadf7cb48aae4e65c227f58b5742d41462c5ad65de\"" Sep 12 00:26:48.219240 containerd[1555]: time="2025-09-12T00:26:48.219214576Z" level=info msg="connecting to shim a505676622c32c7390d3adcadf7cb48aae4e65c227f58b5742d41462c5ad65de" address="unix:///run/containerd/s/127e0b2f011839b36e7088ec890efc4de637a0a755b2573227819d5d3586985c" protocol=ttrpc version=3 Sep 12 00:26:48.245943 systemd[1]: Started cri-containerd-a505676622c32c7390d3adcadf7cb48aae4e65c227f58b5742d41462c5ad65de.scope - libcontainer container a505676622c32c7390d3adcadf7cb48aae4e65c227f58b5742d41462c5ad65de. 
Sep 12 00:26:48.334183 containerd[1555]: time="2025-09-12T00:26:48.334110850Z" level=info msg="StartContainer for \"a505676622c32c7390d3adcadf7cb48aae4e65c227f58b5742d41462c5ad65de\" returns successfully" Sep 12 00:26:48.969751 kubelet[2697]: I0912 00:26:48.969709 2697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 00:26:49.003970 kubelet[2697]: I0912 00:26:49.003904 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-545b8468d6-d6cqw" podStartSLOduration=34.762092989 podStartE2EDuration="45.00388918s" podCreationTimestamp="2025-09-12 00:26:04 +0000 UTC" firstStartedPulling="2025-09-12 00:26:37.949344211 +0000 UTC m=+50.642610560" lastFinishedPulling="2025-09-12 00:26:48.191140402 +0000 UTC m=+60.884406751" observedRunningTime="2025-09-12 00:26:49.00290278 +0000 UTC m=+61.696169119" watchObservedRunningTime="2025-09-12 00:26:49.00388918 +0000 UTC m=+61.697155529" Sep 12 00:26:50.202094 systemd[1]: Started sshd@7-10.0.0.151:22-10.0.0.1:37586.service - OpenSSH per-connection server daemon (10.0.0.1:37586). Sep 12 00:26:50.278975 sshd[5227]: Accepted publickey for core from 10.0.0.1 port 37586 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU Sep 12 00:26:50.281097 sshd-session[5227]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 00:26:50.287997 systemd-logind[1546]: New session 8 of user core. Sep 12 00:26:50.294863 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 12 00:26:50.490817 sshd[5230]: Connection closed by 10.0.0.1 port 37586 Sep 12 00:26:50.490636 sshd-session[5227]: pam_unix(sshd:session): session closed for user core Sep 12 00:26:50.495604 systemd[1]: sshd@7-10.0.0.151:22-10.0.0.1:37586.service: Deactivated successfully. Sep 12 00:26:50.498120 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 00:26:50.501303 systemd-logind[1546]: Session 8 logged out. Waiting for processes to exit. 
Sep 12 00:26:50.503080 systemd-logind[1546]: Removed session 8. Sep 12 00:26:50.992464 containerd[1555]: time="2025-09-12T00:26:50.992419237Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:50.993487 containerd[1555]: time="2025-09-12T00:26:50.993312132Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 12 00:26:50.994616 containerd[1555]: time="2025-09-12T00:26:50.994582525Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:50.996744 containerd[1555]: time="2025-09-12T00:26:50.996676883Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:50.997069 containerd[1555]: time="2025-09-12T00:26:50.997038572Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.804875993s" Sep 12 00:26:50.997069 containerd[1555]: time="2025-09-12T00:26:50.997065873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 12 00:26:50.999777 containerd[1555]: time="2025-09-12T00:26:50.999734960Z" level=info msg="CreateContainer within sandbox \"08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 00:26:51.017300 containerd[1555]: 
time="2025-09-12T00:26:51.016442578Z" level=info msg="Container 2f0627973d74eaf16c4cff2b67fcc37327d04a325a0c998f9c7783f369019c77: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:51.035927 containerd[1555]: time="2025-09-12T00:26:51.035860823Z" level=info msg="CreateContainer within sandbox \"08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"2f0627973d74eaf16c4cff2b67fcc37327d04a325a0c998f9c7783f369019c77\"" Sep 12 00:26:51.036642 containerd[1555]: time="2025-09-12T00:26:51.036616359Z" level=info msg="StartContainer for \"2f0627973d74eaf16c4cff2b67fcc37327d04a325a0c998f9c7783f369019c77\"" Sep 12 00:26:51.038370 containerd[1555]: time="2025-09-12T00:26:51.038257919Z" level=info msg="connecting to shim 2f0627973d74eaf16c4cff2b67fcc37327d04a325a0c998f9c7783f369019c77" address="unix:///run/containerd/s/62e6d24b6f34a51c95ac6ce5cd8aa8e959f266869f453afdd0d7dc50e809a69e" protocol=ttrpc version=3 Sep 12 00:26:51.067968 systemd[1]: Started cri-containerd-2f0627973d74eaf16c4cff2b67fcc37327d04a325a0c998f9c7783f369019c77.scope - libcontainer container 2f0627973d74eaf16c4cff2b67fcc37327d04a325a0c998f9c7783f369019c77. 
Sep 12 00:26:51.131321 containerd[1555]: time="2025-09-12T00:26:51.131277650Z" level=info msg="StartContainer for \"2f0627973d74eaf16c4cff2b67fcc37327d04a325a0c998f9c7783f369019c77\" returns successfully" Sep 12 00:26:51.133242 containerd[1555]: time="2025-09-12T00:26:51.133219042Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 00:26:53.042438 containerd[1555]: time="2025-09-12T00:26:53.042375447Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:53.043761 containerd[1555]: time="2025-09-12T00:26:53.043729072Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 00:26:53.045036 containerd[1555]: time="2025-09-12T00:26:53.044920213Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:53.047580 containerd[1555]: time="2025-09-12T00:26:53.046969993Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 00:26:53.047755 containerd[1555]: time="2025-09-12T00:26:53.047683332Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.914435736s" Sep 12 00:26:53.047858 containerd[1555]: time="2025-09-12T00:26:53.047843691Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 00:26:53.051483 containerd[1555]: time="2025-09-12T00:26:53.051431924Z" level=info msg="CreateContainer within sandbox \"08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 00:26:53.060269 containerd[1555]: time="2025-09-12T00:26:53.060249640Z" level=info msg="Container d5299653d685165ee72e0458dafc75ee85c69db7a7e9c38fa2f67e362fa09e45: CDI devices from CRI Config.CDIDevices: []" Sep 12 00:26:53.070850 containerd[1555]: time="2025-09-12T00:26:53.070818129Z" level=info msg="CreateContainer within sandbox \"08052b5025da1c7eeb626440eaf38067297948ff657db47d8d0e6b9adb106e2a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"d5299653d685165ee72e0458dafc75ee85c69db7a7e9c38fa2f67e362fa09e45\"" Sep 12 00:26:53.071649 containerd[1555]: time="2025-09-12T00:26:53.071605912Z" level=info msg="StartContainer for \"d5299653d685165ee72e0458dafc75ee85c69db7a7e9c38fa2f67e362fa09e45\"" Sep 12 00:26:53.073220 containerd[1555]: time="2025-09-12T00:26:53.073180775Z" level=info msg="connecting to shim d5299653d685165ee72e0458dafc75ee85c69db7a7e9c38fa2f67e362fa09e45" address="unix:///run/containerd/s/62e6d24b6f34a51c95ac6ce5cd8aa8e959f266869f453afdd0d7dc50e809a69e" protocol=ttrpc version=3 Sep 12 00:26:53.099019 systemd[1]: Started cri-containerd-d5299653d685165ee72e0458dafc75ee85c69db7a7e9c38fa2f67e362fa09e45.scope - libcontainer container d5299653d685165ee72e0458dafc75ee85c69db7a7e9c38fa2f67e362fa09e45. 
Sep 12 00:26:53.150833 containerd[1555]: time="2025-09-12T00:26:53.150784438Z" level=info msg="StartContainer for \"d5299653d685165ee72e0458dafc75ee85c69db7a7e9c38fa2f67e362fa09e45\" returns successfully" Sep 12 00:26:53.738693 kubelet[2697]: I0912 00:26:53.738650 2697 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 00:26:53.738693 kubelet[2697]: I0912 00:26:53.738679 2697 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 00:26:54.005895 kubelet[2697]: I0912 00:26:54.005752 2697 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-bkq9f" podStartSLOduration=32.99247217 podStartE2EDuration="46.005736354s" podCreationTimestamp="2025-09-12 00:26:08 +0000 UTC" firstStartedPulling="2025-09-12 00:26:40.035570903 +0000 UTC m=+52.728837252" lastFinishedPulling="2025-09-12 00:26:53.048835097 +0000 UTC m=+65.742101436" observedRunningTime="2025-09-12 00:26:54.004832008 +0000 UTC m=+66.698098377" watchObservedRunningTime="2025-09-12 00:26:54.005736354 +0000 UTC m=+66.699002713" Sep 12 00:26:55.507122 systemd[1]: Started sshd@8-10.0.0.151:22-10.0.0.1:37596.service - OpenSSH per-connection server daemon (10.0.0.1:37596). Sep 12 00:26:55.572566 sshd[5331]: Accepted publickey for core from 10.0.0.1 port 37596 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU Sep 12 00:26:55.574285 sshd-session[5331]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 00:26:55.580748 systemd-logind[1546]: New session 9 of user core. Sep 12 00:26:55.586879 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 12 00:26:55.780728 sshd[5334]: Connection closed by 10.0.0.1 port 37596 Sep 12 00:26:55.782922 sshd-session[5331]: pam_unix(sshd:session): session closed for user core Sep 12 00:26:55.797518 systemd-logind[1546]: Session 9 logged out. Waiting for processes to exit. Sep 12 00:26:55.800485 systemd[1]: sshd@8-10.0.0.151:22-10.0.0.1:37596.service: Deactivated successfully. Sep 12 00:26:55.803660 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 00:26:55.811439 systemd-logind[1546]: Removed session 9. Sep 12 00:27:00.797570 systemd[1]: Started sshd@9-10.0.0.151:22-10.0.0.1:53738.service - OpenSSH per-connection server daemon (10.0.0.1:53738). Sep 12 00:27:00.842376 sshd[5353]: Accepted publickey for core from 10.0.0.1 port 53738 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU Sep 12 00:27:00.843783 sshd-session[5353]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 00:27:00.848089 systemd-logind[1546]: New session 10 of user core. Sep 12 00:27:00.857839 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 12 00:27:00.982346 sshd[5355]: Connection closed by 10.0.0.1 port 53738 Sep 12 00:27:00.982658 sshd-session[5353]: pam_unix(sshd:session): session closed for user core Sep 12 00:27:00.986592 systemd[1]: sshd@9-10.0.0.151:22-10.0.0.1:53738.service: Deactivated successfully. Sep 12 00:27:00.988826 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 00:27:00.990469 systemd-logind[1546]: Session 10 logged out. Waiting for processes to exit. Sep 12 00:27:00.993146 systemd-logind[1546]: Removed session 10. Sep 12 00:27:03.448371 kubelet[2697]: I0912 00:27:03.448147 2697 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 00:27:05.996586 systemd[1]: Started sshd@10-10.0.0.151:22-10.0.0.1:53754.service - OpenSSH per-connection server daemon (10.0.0.1:53754). 
Sep 12 00:27:06.045875 sshd[5372]: Accepted publickey for core from 10.0.0.1 port 53754 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU Sep 12 00:27:06.047914 sshd-session[5372]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 00:27:06.052789 systemd-logind[1546]: New session 11 of user core. Sep 12 00:27:06.061901 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 00:27:06.180839 sshd[5374]: Connection closed by 10.0.0.1 port 53754 Sep 12 00:27:06.181237 sshd-session[5372]: pam_unix(sshd:session): session closed for user core Sep 12 00:27:06.192143 systemd[1]: sshd@10-10.0.0.151:22-10.0.0.1:53754.service: Deactivated successfully. Sep 12 00:27:06.194316 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 00:27:06.198901 systemd-logind[1546]: Session 11 logged out. Waiting for processes to exit. Sep 12 00:27:06.203319 systemd[1]: Started sshd@11-10.0.0.151:22-10.0.0.1:53770.service - OpenSSH per-connection server daemon (10.0.0.1:53770). Sep 12 00:27:06.205286 systemd-logind[1546]: Removed session 11. Sep 12 00:27:06.255384 sshd[5388]: Accepted publickey for core from 10.0.0.1 port 53770 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU Sep 12 00:27:06.257065 sshd-session[5388]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 00:27:06.262201 systemd-logind[1546]: New session 12 of user core. Sep 12 00:27:06.281904 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 00:27:06.455941 sshd[5390]: Connection closed by 10.0.0.1 port 53770 Sep 12 00:27:06.456318 sshd-session[5388]: pam_unix(sshd:session): session closed for user core Sep 12 00:27:06.473573 systemd[1]: sshd@11-10.0.0.151:22-10.0.0.1:53770.service: Deactivated successfully. Sep 12 00:27:06.475738 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 00:27:06.476477 systemd-logind[1546]: Session 12 logged out. Waiting for processes to exit. 
Sep 12 00:27:06.479638 systemd[1]: Started sshd@12-10.0.0.151:22-10.0.0.1:53774.service - OpenSSH per-connection server daemon (10.0.0.1:53774).
Sep 12 00:27:06.481030 systemd-logind[1546]: Removed session 12.
Sep 12 00:27:06.530878 sshd[5402]: Accepted publickey for core from 10.0.0.1 port 53774 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:06.532591 sshd-session[5402]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:06.537677 systemd-logind[1546]: New session 13 of user core.
Sep 12 00:27:06.545939 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 12 00:27:06.663434 sshd[5404]: Connection closed by 10.0.0.1 port 53774
Sep 12 00:27:06.663737 sshd-session[5402]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:06.667759 systemd[1]: sshd@12-10.0.0.151:22-10.0.0.1:53774.service: Deactivated successfully.
Sep 12 00:27:06.670015 systemd[1]: session-13.scope: Deactivated successfully.
Sep 12 00:27:06.671052 systemd-logind[1546]: Session 13 logged out. Waiting for processes to exit.
Sep 12 00:27:06.672334 systemd-logind[1546]: Removed session 13.
Sep 12 00:27:07.214660 containerd[1555]: time="2025-09-12T00:27:07.214539775Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c24a0c8575976a2178bd8352bf2a3f439bd49fbc9af41b7bb2585d48f1288de2\" id:\"bd458ad43eadba3a5e636374514cdcc56afd4b42e1b70b8e463a7e303bdfcc87\" pid:5428 exited_at:{seconds:1757636827 nanos:214178343}"
Sep 12 00:27:10.066637 containerd[1555]: time="2025-09-12T00:27:10.066603571Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91\" id:\"649cba8d4ca16174187f5b2d302418fa8bd14d196d47b23320da485b2560cf3a\" pid:5453 exited_at:{seconds:1757636830 nanos:66105029}"
Sep 12 00:27:11.683969 systemd[1]: Started sshd@13-10.0.0.151:22-10.0.0.1:55658.service - OpenSSH per-connection server daemon (10.0.0.1:55658).
Sep 12 00:27:11.740666 sshd[5471]: Accepted publickey for core from 10.0.0.1 port 55658 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:11.742448 sshd-session[5471]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:11.748784 systemd-logind[1546]: New session 14 of user core.
Sep 12 00:27:11.752852 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 12 00:27:11.919809 sshd[5473]: Connection closed by 10.0.0.1 port 55658
Sep 12 00:27:11.920104 sshd-session[5471]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:11.926092 systemd[1]: sshd@13-10.0.0.151:22-10.0.0.1:55658.service: Deactivated successfully.
Sep 12 00:27:11.928427 systemd[1]: session-14.scope: Deactivated successfully.
Sep 12 00:27:11.929453 systemd-logind[1546]: Session 14 logged out. Waiting for processes to exit.
Sep 12 00:27:11.932801 systemd-logind[1546]: Removed session 14.
Sep 12 00:27:14.797725 containerd[1555]: time="2025-09-12T00:27:14.797293299Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91\" id:\"876dea69d2ead9d57d690e31df3d9cdf405cf8adcce5e04dfa33c72a2f4058b0\" pid:5503 exited_at:{seconds:1757636834 nanos:797019076}"
Sep 12 00:27:14.992190 containerd[1555]: time="2025-09-12T00:27:14.992138311Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0fb5c41e4cd11b25b88fca68a0675a827b5e23cf4ccdfb0cee092abe118b7ac7\" id:\"4b7e2778f11e43c8bc9708f62dc77f3df268f9f59d1fb9fee10133d123eedd76\" pid:5528 exited_at:{seconds:1757636834 nanos:991829011}"
Sep 12 00:27:16.936212 systemd[1]: Started sshd@14-10.0.0.151:22-10.0.0.1:55672.service - OpenSSH per-connection server daemon (10.0.0.1:55672).
Sep 12 00:27:16.996815 sshd[5539]: Accepted publickey for core from 10.0.0.1 port 55672 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:16.998333 sshd-session[5539]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:17.002620 systemd-logind[1546]: New session 15 of user core.
Sep 12 00:27:17.010835 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 12 00:27:17.177062 sshd[5541]: Connection closed by 10.0.0.1 port 55672
Sep 12 00:27:17.177429 sshd-session[5539]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:17.182417 systemd[1]: sshd@14-10.0.0.151:22-10.0.0.1:55672.service: Deactivated successfully.
Sep 12 00:27:17.184616 systemd[1]: session-15.scope: Deactivated successfully.
Sep 12 00:27:17.185483 systemd-logind[1546]: Session 15 logged out. Waiting for processes to exit.
Sep 12 00:27:17.187054 systemd-logind[1546]: Removed session 15.
Sep 12 00:27:22.189234 systemd[1]: Started sshd@15-10.0.0.151:22-10.0.0.1:50364.service - OpenSSH per-connection server daemon (10.0.0.1:50364).
Sep 12 00:27:22.242676 sshd[5556]: Accepted publickey for core from 10.0.0.1 port 50364 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:22.244419 sshd-session[5556]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:22.249368 systemd-logind[1546]: New session 16 of user core.
Sep 12 00:27:22.255813 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 12 00:27:22.369965 sshd[5558]: Connection closed by 10.0.0.1 port 50364
Sep 12 00:27:22.370297 sshd-session[5556]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:22.374849 systemd[1]: sshd@15-10.0.0.151:22-10.0.0.1:50364.service: Deactivated successfully.
Sep 12 00:27:22.376943 systemd[1]: session-16.scope: Deactivated successfully.
Sep 12 00:27:22.377894 systemd-logind[1546]: Session 16 logged out. Waiting for processes to exit.
Sep 12 00:27:22.379125 systemd-logind[1546]: Removed session 16.
Sep 12 00:27:27.386855 systemd[1]: Started sshd@16-10.0.0.151:22-10.0.0.1:50378.service - OpenSSH per-connection server daemon (10.0.0.1:50378).
Sep 12 00:27:27.451846 sshd[5574]: Accepted publickey for core from 10.0.0.1 port 50378 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:27.453321 sshd-session[5574]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:27.457447 systemd-logind[1546]: New session 17 of user core.
Sep 12 00:27:27.465928 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 12 00:27:27.669773 sshd[5576]: Connection closed by 10.0.0.1 port 50378
Sep 12 00:27:27.670570 sshd-session[5574]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:27.676735 systemd[1]: sshd@16-10.0.0.151:22-10.0.0.1:50378.service: Deactivated successfully.
Sep 12 00:27:27.678837 systemd[1]: session-17.scope: Deactivated successfully.
Sep 12 00:27:27.679903 systemd-logind[1546]: Session 17 logged out. Waiting for processes to exit.
Sep 12 00:27:27.681801 systemd-logind[1546]: Removed session 17.
Sep 12 00:27:32.683743 systemd[1]: Started sshd@17-10.0.0.151:22-10.0.0.1:41988.service - OpenSSH per-connection server daemon (10.0.0.1:41988).
Sep 12 00:27:32.742063 sshd[5590]: Accepted publickey for core from 10.0.0.1 port 41988 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:32.743793 sshd-session[5590]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:32.748435 systemd-logind[1546]: New session 18 of user core.
Sep 12 00:27:32.752822 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 12 00:27:32.879954 sshd[5592]: Connection closed by 10.0.0.1 port 41988
Sep 12 00:27:32.880636 sshd-session[5590]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:32.889863 systemd[1]: sshd@17-10.0.0.151:22-10.0.0.1:41988.service: Deactivated successfully.
Sep 12 00:27:32.891962 systemd[1]: session-18.scope: Deactivated successfully.
Sep 12 00:27:32.894194 systemd-logind[1546]: Session 18 logged out. Waiting for processes to exit.
Sep 12 00:27:32.896596 systemd[1]: Started sshd@18-10.0.0.151:22-10.0.0.1:42000.service - OpenSSH per-connection server daemon (10.0.0.1:42000).
Sep 12 00:27:32.898448 systemd-logind[1546]: Removed session 18.
Sep 12 00:27:32.942439 sshd[5605]: Accepted publickey for core from 10.0.0.1 port 42000 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:32.943979 sshd-session[5605]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:32.948765 systemd-logind[1546]: New session 19 of user core.
Sep 12 00:27:32.956834 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 12 00:27:33.262397 sshd[5607]: Connection closed by 10.0.0.1 port 42000
Sep 12 00:27:33.262794 sshd-session[5605]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:33.273551 systemd[1]: sshd@18-10.0.0.151:22-10.0.0.1:42000.service: Deactivated successfully.
Sep 12 00:27:33.275336 systemd[1]: session-19.scope: Deactivated successfully.
Sep 12 00:27:33.276456 systemd-logind[1546]: Session 19 logged out. Waiting for processes to exit.
Sep 12 00:27:33.280040 systemd[1]: Started sshd@19-10.0.0.151:22-10.0.0.1:42016.service - OpenSSH per-connection server daemon (10.0.0.1:42016).
Sep 12 00:27:33.280964 systemd-logind[1546]: Removed session 19.
Sep 12 00:27:33.342257 sshd[5619]: Accepted publickey for core from 10.0.0.1 port 42016 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:33.343689 sshd-session[5619]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:33.348147 systemd-logind[1546]: New session 20 of user core.
Sep 12 00:27:33.358833 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 12 00:27:34.007955 sshd[5621]: Connection closed by 10.0.0.1 port 42016
Sep 12 00:27:34.009739 sshd-session[5619]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:34.020496 systemd[1]: sshd@19-10.0.0.151:22-10.0.0.1:42016.service: Deactivated successfully.
Sep 12 00:27:34.022744 systemd[1]: session-20.scope: Deactivated successfully.
Sep 12 00:27:34.024638 systemd-logind[1546]: Session 20 logged out. Waiting for processes to exit.
Sep 12 00:27:34.030951 systemd[1]: Started sshd@20-10.0.0.151:22-10.0.0.1:42028.service - OpenSSH per-connection server daemon (10.0.0.1:42028).
Sep 12 00:27:34.033230 systemd-logind[1546]: Removed session 20.
Sep 12 00:27:34.075722 sshd[5638]: Accepted publickey for core from 10.0.0.1 port 42028 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:34.076166 sshd-session[5638]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:34.085663 systemd-logind[1546]: New session 21 of user core.
Sep 12 00:27:34.088149 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 12 00:27:34.432306 sshd[5641]: Connection closed by 10.0.0.1 port 42028
Sep 12 00:27:34.432924 sshd-session[5638]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:34.446063 systemd[1]: sshd@20-10.0.0.151:22-10.0.0.1:42028.service: Deactivated successfully.
Sep 12 00:27:34.450466 systemd[1]: session-21.scope: Deactivated successfully.
Sep 12 00:27:34.452960 systemd-logind[1546]: Session 21 logged out. Waiting for processes to exit.
Sep 12 00:27:34.459939 systemd[1]: Started sshd@21-10.0.0.151:22-10.0.0.1:42042.service - OpenSSH per-connection server daemon (10.0.0.1:42042).
Sep 12 00:27:34.463755 systemd-logind[1546]: Removed session 21.
Sep 12 00:27:34.513123 sshd[5653]: Accepted publickey for core from 10.0.0.1 port 42042 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:34.515075 sshd-session[5653]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:34.521638 systemd-logind[1546]: New session 22 of user core.
Sep 12 00:27:34.525948 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 12 00:27:34.646725 sshd[5655]: Connection closed by 10.0.0.1 port 42042
Sep 12 00:27:34.646664 sshd-session[5653]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:34.650507 systemd[1]: sshd@21-10.0.0.151:22-10.0.0.1:42042.service: Deactivated successfully.
Sep 12 00:27:34.652996 systemd[1]: session-22.scope: Deactivated successfully.
Sep 12 00:27:34.655350 systemd-logind[1546]: Session 22 logged out. Waiting for processes to exit.
Sep 12 00:27:34.656843 systemd-logind[1546]: Removed session 22.
Sep 12 00:27:37.239334 containerd[1555]: time="2025-09-12T00:27:37.239292697Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c24a0c8575976a2178bd8352bf2a3f439bd49fbc9af41b7bb2585d48f1288de2\" id:\"4fd15e05e49530522cf83db257493ecddbc3d8feeae66133e371c936b24c81a4\" pid:5681 exited_at:{seconds:1757636857 nanos:230846942}"
Sep 12 00:27:39.663601 systemd[1]: Started sshd@22-10.0.0.151:22-10.0.0.1:42052.service - OpenSSH per-connection server daemon (10.0.0.1:42052).
Sep 12 00:27:39.733943 sshd[5697]: Accepted publickey for core from 10.0.0.1 port 42052 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:39.735636 sshd-session[5697]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:39.739995 systemd-logind[1546]: New session 23 of user core.
Sep 12 00:27:39.750824 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 12 00:27:39.888119 sshd[5699]: Connection closed by 10.0.0.1 port 42052
Sep 12 00:27:39.888451 sshd-session[5697]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:39.893126 systemd[1]: sshd@22-10.0.0.151:22-10.0.0.1:42052.service: Deactivated successfully.
Sep 12 00:27:39.895473 systemd[1]: session-23.scope: Deactivated successfully.
Sep 12 00:27:39.896866 systemd-logind[1546]: Session 23 logged out. Waiting for processes to exit.
Sep 12 00:27:39.898937 systemd-logind[1546]: Removed session 23.
Sep 12 00:27:40.017836 containerd[1555]: time="2025-09-12T00:27:40.017791200Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7e385e3fef18a9a99595160f7d1be1427e7f73bd327ff1b14d0be47ed9f44e91\" id:\"a47f15ea3b7fe3838613f941742668b90121848e490e94e5fa87aa3cc30b5829\" pid:5724 exited_at:{seconds:1757636860 nanos:17392446}"
Sep 12 00:27:44.906766 systemd[1]: Started sshd@23-10.0.0.151:22-10.0.0.1:54046.service - OpenSSH per-connection server daemon (10.0.0.1:54046).
Sep 12 00:27:44.961742 sshd[5737]: Accepted publickey for core from 10.0.0.1 port 54046 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:44.962929 sshd-session[5737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:44.968928 systemd-logind[1546]: New session 24 of user core.
Sep 12 00:27:44.978834 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 12 00:27:44.985469 containerd[1555]: time="2025-09-12T00:27:44.985424658Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0fb5c41e4cd11b25b88fca68a0675a827b5e23cf4ccdfb0cee092abe118b7ac7\" id:\"e1d4741818cecca4853bbd76f6da621f72b39264c2801e847a24411d656be771\" pid:5751 exited_at:{seconds:1757636864 nanos:984916267}"
Sep 12 00:27:45.141785 sshd[5761]: Connection closed by 10.0.0.1 port 54046
Sep 12 00:27:45.142251 sshd-session[5737]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:45.148352 systemd-logind[1546]: Session 24 logged out. Waiting for processes to exit.
Sep 12 00:27:45.149224 systemd[1]: sshd@23-10.0.0.151:22-10.0.0.1:54046.service: Deactivated successfully.
Sep 12 00:27:45.153126 systemd[1]: session-24.scope: Deactivated successfully.
Sep 12 00:27:45.160136 systemd-logind[1546]: Removed session 24.
Sep 12 00:27:50.153674 systemd[1]: Started sshd@24-10.0.0.151:22-10.0.0.1:43384.service - OpenSSH per-connection server daemon (10.0.0.1:43384).
Sep 12 00:27:50.219554 sshd[5776]: Accepted publickey for core from 10.0.0.1 port 43384 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:50.221067 sshd-session[5776]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:50.225527 systemd-logind[1546]: New session 25 of user core.
Sep 12 00:27:50.233895 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 12 00:27:50.406175 sshd[5778]: Connection closed by 10.0.0.1 port 43384
Sep 12 00:27:50.406854 sshd-session[5776]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:50.411308 systemd[1]: sshd@24-10.0.0.151:22-10.0.0.1:43384.service: Deactivated successfully.
Sep 12 00:27:50.413512 systemd[1]: session-25.scope: Deactivated successfully.
Sep 12 00:27:50.414378 systemd-logind[1546]: Session 25 logged out. Waiting for processes to exit.
Sep 12 00:27:50.415674 systemd-logind[1546]: Removed session 25.
Sep 12 00:27:51.001284 containerd[1555]: time="2025-09-12T00:27:51.001240310Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0fb5c41e4cd11b25b88fca68a0675a827b5e23cf4ccdfb0cee092abe118b7ac7\" id:\"b5868f6934c3a32b9d37d3fe1cd3551c74bb7fa8a64113007f00b68aa6fe2fee\" pid:5803 exited_at:{seconds:1757636871 nanos:878366}"
Sep 12 00:27:55.422384 systemd[1]: Started sshd@25-10.0.0.151:22-10.0.0.1:43392.service - OpenSSH per-connection server daemon (10.0.0.1:43392).
Sep 12 00:27:55.469881 sshd[5823]: Accepted publickey for core from 10.0.0.1 port 43392 ssh2: RSA SHA256:WXW9gUxhZ4uIheI1A/0E9DU0u6cagImKTAH2szZTKFU
Sep 12 00:27:55.471735 sshd-session[5823]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 00:27:55.476828 systemd-logind[1546]: New session 26 of user core.
Sep 12 00:27:55.485883 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 12 00:27:55.601775 sshd[5825]: Connection closed by 10.0.0.1 port 43392
Sep 12 00:27:55.602167 sshd-session[5823]: pam_unix(sshd:session): session closed for user core
Sep 12 00:27:55.606209 systemd[1]: sshd@25-10.0.0.151:22-10.0.0.1:43392.service: Deactivated successfully.
Sep 12 00:27:55.608352 systemd[1]: session-26.scope: Deactivated successfully.
Sep 12 00:27:55.610074 systemd-logind[1546]: Session 26 logged out. Waiting for processes to exit.
Sep 12 00:27:55.611811 systemd-logind[1546]: Removed session 26.