Sep 13 00:27:33.829212 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 22:15:39 -00 2025
Sep 13 00:27:33.829239 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=21b29c6e420cf06e0546ff797fc1285d986af130e4ba1abb9f27cb6343b53294
Sep 13 00:27:33.829253 kernel: BIOS-provided physical RAM map:
Sep 13 00:27:33.829262 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Sep 13 00:27:33.829271 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
Sep 13 00:27:33.829279 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Sep 13 00:27:33.829289 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
Sep 13 00:27:33.829298 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Sep 13 00:27:33.829313 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
Sep 13 00:27:33.829322 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Sep 13 00:27:33.829331 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009bd3efff] usable
Sep 13 00:27:33.829339 kernel: BIOS-e820: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
Sep 13 00:27:33.829348 kernel: BIOS-e820: [mem 0x000000009be00000-0x000000009c8ecfff] usable
Sep 13 00:27:33.829357 kernel: BIOS-e820: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
Sep 13 00:27:33.829371 kernel: BIOS-e820: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
Sep 13 00:27:33.829380 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
Sep 13 00:27:33.829390 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009ce90fff] usable
Sep 13 00:27:33.829399 kernel: BIOS-e820: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
Sep 13 00:27:33.829408 kernel: BIOS-e820: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
Sep 13 00:27:33.829418 kernel: BIOS-e820: [mem 0x000000009ce97000-0x000000009cedbfff] usable
Sep 13 00:27:33.829427 kernel: BIOS-e820: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
Sep 13 00:27:33.829436 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
Sep 13 00:27:33.829445 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Sep 13 00:27:33.829454 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 13 00:27:33.829463 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Sep 13 00:27:33.829475 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Sep 13 00:27:33.829485 kernel: NX (Execute Disable) protection: active
Sep 13 00:27:33.829494 kernel: APIC: Static calls initialized
Sep 13 00:27:33.829503 kernel: e820: update [mem 0x9b320018-0x9b329c57] usable ==> usable
Sep 13 00:27:33.829512 kernel: e820: update [mem 0x9b2e3018-0x9b31fe57] usable ==> usable
Sep 13 00:27:33.829532 kernel: extended physical RAM map:
Sep 13 00:27:33.829542 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Sep 13 00:27:33.829551 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
Sep 13 00:27:33.829560 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Sep 13 00:27:33.829570 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
Sep 13 00:27:33.829579 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Sep 13 00:27:33.829592 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
Sep 13 00:27:33.829601 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Sep 13 00:27:33.829610 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b2e3017] usable
Sep 13 00:27:33.829620 kernel: reserve setup_data: [mem 0x000000009b2e3018-0x000000009b31fe57] usable
Sep 13 00:27:33.829634 kernel: reserve setup_data: [mem 0x000000009b31fe58-0x000000009b320017] usable
Sep 13 00:27:33.829643 kernel: reserve setup_data: [mem 0x000000009b320018-0x000000009b329c57] usable
Sep 13 00:27:33.829655 kernel: reserve setup_data: [mem 0x000000009b329c58-0x000000009bd3efff] usable
Sep 13 00:27:33.829665 kernel: reserve setup_data: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
Sep 13 00:27:33.829675 kernel: reserve setup_data: [mem 0x000000009be00000-0x000000009c8ecfff] usable
Sep 13 00:27:33.829685 kernel: reserve setup_data: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
Sep 13 00:27:33.829694 kernel: reserve setup_data: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
Sep 13 00:27:33.829704 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
Sep 13 00:27:33.829714 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009ce90fff] usable
Sep 13 00:27:33.829724 kernel: reserve setup_data: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
Sep 13 00:27:33.829733 kernel: reserve setup_data: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
Sep 13 00:27:33.829745 kernel: reserve setup_data: [mem 0x000000009ce97000-0x000000009cedbfff] usable
Sep 13 00:27:33.829754 kernel: reserve setup_data: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
Sep 13 00:27:33.829764 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
Sep 13 00:27:33.829786 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Sep 13 00:27:33.829797 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 13 00:27:33.829807 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Sep 13 00:27:33.829816 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Sep 13 00:27:33.829830 kernel: efi: EFI v2.7 by EDK II
Sep 13 00:27:33.829839 kernel: efi: SMBIOS=0x9c988000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9b9e4198 RNG=0x9cb73018
Sep 13 00:27:33.829848 kernel: random: crng init done
Sep 13 00:27:33.829858 kernel: efi: Remove mem151: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Sep 13 00:27:33.829868 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Sep 13 00:27:33.829881 kernel: secureboot: Secure boot disabled
Sep 13 00:27:33.829891 kernel: SMBIOS 2.8 present.
Sep 13 00:27:33.829901 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Sep 13 00:27:33.829911 kernel: DMI: Memory slots populated: 1/1
Sep 13 00:27:33.829921 kernel: Hypervisor detected: KVM
Sep 13 00:27:33.829930 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 13 00:27:33.829940 kernel: kvm-clock: using sched offset of 5637366388 cycles
Sep 13 00:27:33.829950 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 13 00:27:33.829960 kernel: tsc: Detected 2794.748 MHz processor
Sep 13 00:27:33.829971 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 13 00:27:33.829980 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 13 00:27:33.829992 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x400000000
Sep 13 00:27:33.830002 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Sep 13 00:27:33.830011 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 13 00:27:33.830020 kernel: Using GB pages for direct mapping
Sep 13 00:27:33.830030 kernel: ACPI: Early table checksum verification disabled
Sep 13 00:27:33.830039 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS )
Sep 13 00:27:33.830049 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Sep 13 00:27:33.830058 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:27:33.830068 kernel: ACPI: DSDT 0x000000009CB7A000 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:27:33.830080 kernel: ACPI: FACS 0x000000009CBDD000 000040
Sep 13 00:27:33.830089 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:27:33.830099 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:27:33.830109 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:27:33.830118 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:27:33.830127 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Sep 13 00:27:33.830137 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3]
Sep 13 00:27:33.830156 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cb7a000-0x9cb7c1b9]
Sep 13 00:27:33.830170 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f]
Sep 13 00:27:33.830180 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f]
Sep 13 00:27:33.830191 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037]
Sep 13 00:27:33.830201 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b]
Sep 13 00:27:33.830210 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027]
Sep 13 00:27:33.830220 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037]
Sep 13 00:27:33.830229 kernel: No NUMA configuration found
Sep 13 00:27:33.830239 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cedbfff]
Sep 13 00:27:33.830248 kernel: NODE_DATA(0) allocated [mem 0x9ce36dc0-0x9ce3dfff]
Sep 13 00:27:33.830260 kernel: Zone ranges:
Sep 13 00:27:33.830270 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 13 00:27:33.830279 kernel: DMA32 [mem 0x0000000001000000-0x000000009cedbfff]
Sep 13 00:27:33.830289 kernel: Normal empty
Sep 13 00:27:33.830298 kernel: Device empty
Sep 13 00:27:33.830308 kernel: Movable zone start for each node
Sep 13 00:27:33.830317 kernel: Early memory node ranges
Sep 13 00:27:33.830326 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Sep 13 00:27:33.830335 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
Sep 13 00:27:33.830345 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
Sep 13 00:27:33.830357 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
Sep 13 00:27:33.830366 kernel: node 0: [mem 0x0000000000900000-0x000000009bd3efff]
Sep 13 00:27:33.830376 kernel: node 0: [mem 0x000000009be00000-0x000000009c8ecfff]
Sep 13 00:27:33.830385 kernel: node 0: [mem 0x000000009cbff000-0x000000009ce90fff]
Sep 13 00:27:33.830394 kernel: node 0: [mem 0x000000009ce97000-0x000000009cedbfff]
Sep 13 00:27:33.830403 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cedbfff]
Sep 13 00:27:33.830416 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 13 00:27:33.830426 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Sep 13 00:27:33.830446 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
Sep 13 00:27:33.830456 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 13 00:27:33.830474 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
Sep 13 00:27:33.830490 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Sep 13 00:27:33.830522 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Sep 13 00:27:33.830533 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Sep 13 00:27:33.830543 kernel: On node 0, zone DMA32: 12580 pages in unavailable ranges
Sep 13 00:27:33.830553 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 13 00:27:33.830563 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 13 00:27:33.830576 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 13 00:27:33.830587 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 13 00:27:33.830597 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 13 00:27:33.830607 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 13 00:27:33.830617 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 13 00:27:33.830627 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 13 00:27:33.830636 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 13 00:27:33.830646 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 13 00:27:33.830656 kernel: TSC deadline timer available
Sep 13 00:27:33.830669 kernel: CPU topo: Max. logical packages: 1
Sep 13 00:27:33.830679 kernel: CPU topo: Max. logical dies: 1
Sep 13 00:27:33.830689 kernel: CPU topo: Max. dies per package: 1
Sep 13 00:27:33.830698 kernel: CPU topo: Max. threads per core: 1
Sep 13 00:27:33.830708 kernel: CPU topo: Num. cores per package: 4
Sep 13 00:27:33.830718 kernel: CPU topo: Num. threads per package: 4
Sep 13 00:27:33.830728 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Sep 13 00:27:33.830738 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 13 00:27:33.830748 kernel: kvm-guest: KVM setup pv remote TLB flush
Sep 13 00:27:33.830759 kernel: kvm-guest: setup PV sched yield
Sep 13 00:27:33.830873 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices
Sep 13 00:27:33.830886 kernel: Booting paravirtualized kernel on KVM
Sep 13 00:27:33.830897 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 13 00:27:33.830908 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Sep 13 00:27:33.830920 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288
Sep 13 00:27:33.830931 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152
Sep 13 00:27:33.830941 kernel: pcpu-alloc: [0] 0 1 2 3
Sep 13 00:27:33.830952 kernel: kvm-guest: PV spinlocks enabled
Sep 13 00:27:33.830963 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 13 00:27:33.830979 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=21b29c6e420cf06e0546ff797fc1285d986af130e4ba1abb9f27cb6343b53294
Sep 13 00:27:33.830991 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 13 00:27:33.831001 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 13 00:27:33.831012 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 13 00:27:33.831023 kernel: Fallback order for Node 0: 0
Sep 13 00:27:33.831034 kernel: Built 1 zonelists, mobility grouping on. Total pages: 641450
Sep 13 00:27:33.831045 kernel: Policy zone: DMA32
Sep 13 00:27:33.831056 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 13 00:27:33.831070 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Sep 13 00:27:33.831081 kernel: ftrace: allocating 40122 entries in 157 pages
Sep 13 00:27:33.831092 kernel: ftrace: allocated 157 pages with 5 groups
Sep 13 00:27:33.831103 kernel: Dynamic Preempt: voluntary
Sep 13 00:27:33.831114 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 13 00:27:33.831129 kernel: rcu: RCU event tracing is enabled.
Sep 13 00:27:33.831140 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Sep 13 00:27:33.831162 kernel: Trampoline variant of Tasks RCU enabled.
Sep 13 00:27:33.831173 kernel: Rude variant of Tasks RCU enabled.
Sep 13 00:27:33.831187 kernel: Tracing variant of Tasks RCU enabled.
Sep 13 00:27:33.831198 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 13 00:27:33.831213 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Sep 13 00:27:33.831224 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 13 00:27:33.831235 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 13 00:27:33.831246 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Sep 13 00:27:33.831257 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Sep 13 00:27:33.831268 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 13 00:27:33.831279 kernel: Console: colour dummy device 80x25
Sep 13 00:27:33.831293 kernel: printk: legacy console [ttyS0] enabled
Sep 13 00:27:33.831304 kernel: ACPI: Core revision 20240827
Sep 13 00:27:33.831316 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 13 00:27:33.831327 kernel: APIC: Switch to symmetric I/O mode setup
Sep 13 00:27:33.831337 kernel: x2apic enabled
Sep 13 00:27:33.831348 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 13 00:27:33.831358 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Sep 13 00:27:33.831368 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Sep 13 00:27:33.831379 kernel: kvm-guest: setup PV IPIs
Sep 13 00:27:33.831391 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 13 00:27:33.831402 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns
Sep 13 00:27:33.831412 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794748)
Sep 13 00:27:33.831423 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 13 00:27:33.831433 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 13 00:27:33.831443 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 13 00:27:33.831453 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 13 00:27:33.831463 kernel: Spectre V2 : Mitigation: Retpolines
Sep 13 00:27:33.831474 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 13 00:27:33.831487 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 13 00:27:33.831497 kernel: active return thunk: retbleed_return_thunk
Sep 13 00:27:33.831508 kernel: RETBleed: Mitigation: untrained return thunk
Sep 13 00:27:33.831518 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 13 00:27:33.831528 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 13 00:27:33.831539 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Sep 13 00:27:33.831550 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Sep 13 00:27:33.831560 kernel: active return thunk: srso_return_thunk
Sep 13 00:27:33.831574 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Sep 13 00:27:33.831584 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 13 00:27:33.831594 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 13 00:27:33.831604 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 13 00:27:33.831614 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 13 00:27:33.831625 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 13 00:27:33.831635 kernel: Freeing SMP alternatives memory: 32K
Sep 13 00:27:33.831645 kernel: pid_max: default: 32768 minimum: 301
Sep 13 00:27:33.831655 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 13 00:27:33.831668 kernel: landlock: Up and running.
Sep 13 00:27:33.831678 kernel: SELinux: Initializing.
Sep 13 00:27:33.831688 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 13 00:27:33.831699 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 13 00:27:33.831709 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 13 00:27:33.831720 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 13 00:27:33.831730 kernel: ... version: 0
Sep 13 00:27:33.831740 kernel: ... bit width: 48
Sep 13 00:27:33.831750 kernel: ... generic registers: 6
Sep 13 00:27:33.831764 kernel: ... value mask: 0000ffffffffffff
Sep 13 00:27:33.831788 kernel: ... max period: 00007fffffffffff
Sep 13 00:27:33.831799 kernel: ... fixed-purpose events: 0
Sep 13 00:27:33.831809 kernel: ... event mask: 000000000000003f
Sep 13 00:27:33.831819 kernel: signal: max sigframe size: 1776
Sep 13 00:27:33.831829 kernel: rcu: Hierarchical SRCU implementation.
Sep 13 00:27:33.831844 kernel: rcu: Max phase no-delay instances is 400.
Sep 13 00:27:33.831854 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 13 00:27:33.831865 kernel: smp: Bringing up secondary CPUs ...
Sep 13 00:27:33.831875 kernel: smpboot: x86: Booting SMP configuration:
Sep 13 00:27:33.831889 kernel: .... node #0, CPUs: #1 #2 #3
Sep 13 00:27:33.831899 kernel: smp: Brought up 1 node, 4 CPUs
Sep 13 00:27:33.831909 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS)
Sep 13 00:27:33.831919 kernel: Memory: 2424724K/2565800K available (14336K kernel code, 2432K rwdata, 9960K rodata, 53828K init, 1088K bss, 135148K reserved, 0K cma-reserved)
Sep 13 00:27:33.831929 kernel: devtmpfs: initialized
Sep 13 00:27:33.831939 kernel: x86/mm: Memory block size: 128MB
Sep 13 00:27:33.831965 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
Sep 13 00:27:33.831991 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
Sep 13 00:27:33.832006 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
Sep 13 00:27:33.832017 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes)
Sep 13 00:27:33.832028 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9ce95000-0x9ce96fff] (8192 bytes)
Sep 13 00:27:33.832038 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes)
Sep 13 00:27:33.832048 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 13 00:27:33.832065 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Sep 13 00:27:33.832075 kernel: pinctrl core: initialized pinctrl subsystem
Sep 13 00:27:33.832084 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 13 00:27:33.832094 kernel: audit: initializing netlink subsys (disabled)
Sep 13 00:27:33.832107 kernel: audit: type=2000 audit(1757723251.093:1): state=initialized audit_enabled=0 res=1
Sep 13 00:27:33.832117 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 13 00:27:33.832127 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 13 00:27:33.832137 kernel: cpuidle: using governor menu
Sep 13 00:27:33.832160 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 13 00:27:33.832170 kernel: dca service started, version 1.12.1
Sep 13 00:27:33.832180 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Sep 13 00:27:33.832191 kernel: PCI: Using configuration type 1 for base access
Sep 13 00:27:33.832201 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 13 00:27:33.832214 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 13 00:27:33.832224 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 13 00:27:33.832234 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 13 00:27:33.832243 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 13 00:27:33.832253 kernel: ACPI: Added _OSI(Module Device)
Sep 13 00:27:33.832263 kernel: ACPI: Added _OSI(Processor Device)
Sep 13 00:27:33.832273 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 13 00:27:33.832283 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 13 00:27:33.832293 kernel: ACPI: Interpreter enabled
Sep 13 00:27:33.832305 kernel: ACPI: PM: (supports S0 S3 S5)
Sep 13 00:27:33.832315 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 13 00:27:33.832325 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 13 00:27:33.832335 kernel: PCI: Using E820 reservations for host bridge windows
Sep 13 00:27:33.832345 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 13 00:27:33.832355 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 13 00:27:33.832571 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 13 00:27:33.832720 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 13 00:27:33.832953 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 13 00:27:33.832971 kernel: PCI host bridge to bus 0000:00
Sep 13 00:27:33.833156 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 13 00:27:33.833309 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 13 00:27:33.833455 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 13 00:27:33.833589 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window]
Sep 13 00:27:33.833720 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Sep 13 00:27:33.833883 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window]
Sep 13 00:27:33.834017 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 13 00:27:33.834194 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Sep 13 00:27:33.834358 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
Sep 13 00:27:33.834515 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref]
Sep 13 00:27:33.834667 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff]
Sep 13 00:27:33.834837 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Sep 13 00:27:33.834963 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 13 00:27:33.835090 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 13 00:27:33.835217 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f]
Sep 13 00:27:33.835333 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff]
Sep 13 00:27:33.835447 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref]
Sep 13 00:27:33.835575 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Sep 13 00:27:33.835698 kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f]
Sep 13 00:27:33.835851 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff]
Sep 13 00:27:33.836006 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref]
Sep 13 00:27:33.836177 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 13 00:27:33.836326 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff]
Sep 13 00:27:33.836475 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff]
Sep 13 00:27:33.836623 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref]
Sep 13 00:27:33.836796 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]
Sep 13 00:27:33.836963 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Sep 13 00:27:33.837108 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 13 00:27:33.837281 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Sep 13 00:27:33.837436 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df]
Sep 13 00:27:33.837586 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff]
Sep 13 00:27:33.837754 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Sep 13 00:27:33.837932 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf]
Sep 13 00:27:33.837948 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 13 00:27:33.837959 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 13 00:27:33.837969 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 13 00:27:33.837979 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 13 00:27:33.837990 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 13 00:27:33.838000 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 13 00:27:33.838010 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 13 00:27:33.838024 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 13 00:27:33.838034 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 13 00:27:33.838044 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 13 00:27:33.838055 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 13 00:27:33.838065 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 13 00:27:33.838075 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 13 00:27:33.838086 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 13 00:27:33.838096 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 13 00:27:33.838106 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 13 00:27:33.838119 kernel: iommu: Default domain type: Translated
Sep 13 00:27:33.838129 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 13 00:27:33.838139 kernel: efivars: Registered efivars operations
Sep 13 00:27:33.838161 kernel: PCI: Using ACPI for IRQ routing
Sep 13 00:27:33.838172 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 13 00:27:33.838182 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff]
Sep 13 00:27:33.838192 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff]
Sep 13 00:27:33.838202 kernel: e820: reserve RAM buffer [mem 0x9b2e3018-0x9bffffff]
Sep 13 00:27:33.838213 kernel: e820: reserve RAM buffer [mem 0x9b320018-0x9bffffff]
Sep 13 00:27:33.838226 kernel: e820: reserve RAM buffer [mem 0x9bd3f000-0x9bffffff]
Sep 13 00:27:33.838236 kernel: e820: reserve RAM buffer [mem 0x9c8ed000-0x9fffffff]
Sep 13 00:27:33.838246 kernel: e820: reserve RAM buffer [mem 0x9ce91000-0x9fffffff]
Sep 13 00:27:33.838257 kernel: e820: reserve RAM buffer [mem 0x9cedc000-0x9fffffff]
Sep 13 00:27:33.838407 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 13 00:27:33.838555 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 13 00:27:33.838702 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 13 00:27:33.838719 kernel: vgaarb: loaded
Sep 13 00:27:33.838730 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 13 00:27:33.838740 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 13 00:27:33.838751 kernel: clocksource: Switched to clocksource kvm-clock
Sep 13 00:27:33.838761 kernel: VFS: Disk quotas dquot_6.6.0
Sep 13 00:27:33.838791 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 13 00:27:33.838803 kernel: pnp: PnP ACPI init
Sep 13 00:27:33.838978 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved
Sep 13 00:27:33.838998 kernel: pnp: PnP ACPI: found 6 devices
Sep 13 00:27:33.839012 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 13 00:27:33.839023 kernel: NET: Registered PF_INET protocol family
Sep 13 00:27:33.839033 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 13 00:27:33.839044 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 13 00:27:33.839055 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 13 00:27:33.839066 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 13 00:27:33.839077 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 13 00:27:33.839087 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 13 00:27:33.839101 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 13 00:27:33.839111 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 13 00:27:33.839122 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 13 00:27:33.839133 kernel: NET: Registered PF_XDP protocol family
Sep 13 00:27:33.839298 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window
Sep 13 00:27:33.839442 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned
Sep 13 00:27:33.839573 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 13 00:27:33.839703 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 13 00:27:33.839891 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 13 00:27:33.840026 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window]
Sep 13 00:27:33.840167 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window]
Sep 13 00:27:33.840300 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window]
Sep 13 00:27:33.840315 kernel: PCI: CLS 0 bytes, default 64
Sep 13 00:27:33.840326 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns
Sep 13 00:27:33.840337 kernel: Initialise system trusted keyrings
Sep 13 00:27:33.840352 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 13 00:27:33.840362 kernel: Key type asymmetric registered
Sep 13 00:27:33.840373 kernel: Asymmetric key parser 'x509' registered
Sep 13 00:27:33.840384 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 13 00:27:33.840395 kernel: io scheduler mq-deadline registered
Sep 13 00:27:33.840406 kernel: io scheduler kyber registered
Sep 13 00:27:33.840417 kernel: io scheduler bfq registered
Sep 13 00:27:33.840428 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 13 00:27:33.840443 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 13 00:27:33.840454 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Sep 13 00:27:33.840466 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Sep 13 00:27:33.840477 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 13 00:27:33.840488 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 13 00:27:33.840499 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 13 00:27:33.840510 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 13 00:27:33.840524 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 13 00:27:33.840673 kernel: rtc_cmos 00:04: RTC can wake from S4
Sep 13 00:27:33.840841 kernel: rtc_cmos 00:04: registered as rtc0
Sep 13 00:27:33.840975 kernel: rtc_cmos 00:04: setting system clock to 2025-09-13T00:27:33 UTC (1757723253)
Sep 13 00:27:33.840990 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Sep 13 00:27:33.841118 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Sep 13 00:27:33.841132 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 13 00:27:33.841151 kernel: efifb: probing for efifb
Sep 13 00:27:33.841163 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k
Sep 13 00:27:33.841180 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1
Sep 13 00:27:33.841191 kernel: efifb: scrolling: redraw
Sep 13 00:27:33.841202 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Sep 13 00:27:33.841213 kernel: Console: switching to colour frame buffer device 160x50
Sep 13 00:27:33.841224 kernel: fb0: EFI VGA frame buffer device
Sep 13 00:27:33.841235 kernel: pstore: Using crash dump compression: deflate
Sep 13 00:27:33.841246 kernel: pstore: Registered efi_pstore as persistent store backend
Sep 13 00:27:33.841258 kernel: NET: Registered PF_INET6 protocol family
Sep 13 00:27:33.841269 kernel: Segment Routing with IPv6
Sep 13 00:27:33.841280 kernel: In-situ OAM (IOAM) with IPv6
Sep 13 00:27:33.841293 kernel: NET: Registered PF_PACKET protocol family
Sep 13 00:27:33.841305 kernel: Key type dns_resolver registered
Sep 13 00:27:33.841315 kernel: IPI shorthand broadcast: enabled
Sep 13 00:27:33.841327 kernel: sched_clock: Marking stable (3434004032, 175501992)->(3631912550, -22406526)
Sep 13 00:27:33.841338 kernel: registered taskstats version 1
Sep 13 00:27:33.841349 kernel: Loading compiled-in X.509 certificates
Sep 13 00:27:33.841360 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: dd6b45f5ed9ac8d42d60bdb17f83ef06c8bcd8f6'
Sep 13 00:27:33.841371 kernel: Demotion targets for Node 0: null
Sep 13 00:27:33.841382 kernel: Key type .fscrypt registered
Sep 13 00:27:33.841395 kernel: Key
type fscrypt-provisioning registered Sep 13 00:27:33.841406 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 13 00:27:33.841417 kernel: ima: Allocated hash algorithm: sha1 Sep 13 00:27:33.841428 kernel: ima: No architecture policies found Sep 13 00:27:33.841439 kernel: clk: Disabling unused clocks Sep 13 00:27:33.841450 kernel: Warning: unable to open an initial console. Sep 13 00:27:33.841461 kernel: Freeing unused kernel image (initmem) memory: 53828K Sep 13 00:27:33.841472 kernel: Write protecting the kernel read-only data: 24576k Sep 13 00:27:33.841485 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Sep 13 00:27:33.841496 kernel: Run /init as init process Sep 13 00:27:33.841507 kernel: with arguments: Sep 13 00:27:33.841518 kernel: /init Sep 13 00:27:33.841529 kernel: with environment: Sep 13 00:27:33.841539 kernel: HOME=/ Sep 13 00:27:33.841550 kernel: TERM=linux Sep 13 00:27:33.841561 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 13 00:27:33.841572 systemd[1]: Successfully made /usr/ read-only. Sep 13 00:27:33.841590 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 13 00:27:33.841602 systemd[1]: Detected virtualization kvm. Sep 13 00:27:33.841613 systemd[1]: Detected architecture x86-64. Sep 13 00:27:33.841624 systemd[1]: Running in initrd. Sep 13 00:27:33.841635 systemd[1]: No hostname configured, using default hostname. Sep 13 00:27:33.841646 systemd[1]: Hostname set to . Sep 13 00:27:33.841657 systemd[1]: Initializing machine ID from VM UUID. Sep 13 00:27:33.841671 systemd[1]: Queued start job for default target initrd.target. 
Sep 13 00:27:33.841683 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 13 00:27:33.841694 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 13 00:27:33.841707 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 13 00:27:33.841718 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 13 00:27:33.841730 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 13 00:27:33.841742 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 13 00:27:33.841758 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 13 00:27:33.841784 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 13 00:27:33.841796 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 13 00:27:33.841807 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 13 00:27:33.841818 systemd[1]: Reached target paths.target - Path Units. Sep 13 00:27:33.841851 systemd[1]: Reached target slices.target - Slice Units. Sep 13 00:27:33.841863 systemd[1]: Reached target swap.target - Swaps. Sep 13 00:27:33.841874 systemd[1]: Reached target timers.target - Timer Units. Sep 13 00:27:33.841886 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 13 00:27:33.841901 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 13 00:27:33.841913 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 13 00:27:33.841924 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. 
Sep 13 00:27:33.841935 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 13 00:27:33.841946 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 13 00:27:33.841957 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 13 00:27:33.841968 systemd[1]: Reached target sockets.target - Socket Units. Sep 13 00:27:33.841979 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 13 00:27:33.841993 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 13 00:27:33.842004 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 13 00:27:33.842016 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 13 00:27:33.842027 systemd[1]: Starting systemd-fsck-usr.service... Sep 13 00:27:33.842038 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 13 00:27:33.842049 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 13 00:27:33.842061 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 00:27:33.842072 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 13 00:27:33.842086 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 13 00:27:33.842097 systemd[1]: Finished systemd-fsck-usr.service. Sep 13 00:27:33.842150 systemd-journald[221]: Collecting audit messages is disabled. Sep 13 00:27:33.842182 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 13 00:27:33.842194 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
Sep 13 00:27:33.842206 systemd-journald[221]: Journal started Sep 13 00:27:33.842231 systemd-journald[221]: Runtime Journal (/run/log/journal/b4594c1a2f53460ab78036b11cbde580) is 6M, max 48.5M, 42.4M free. Sep 13 00:27:33.830831 systemd-modules-load[222]: Inserted module 'overlay' Sep 13 00:27:33.846176 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 13 00:27:33.846206 systemd[1]: Started systemd-journald.service - Journal Service. Sep 13 00:27:33.861790 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 13 00:27:33.861839 kernel: Bridge firewalling registered Sep 13 00:27:33.859927 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 13 00:27:33.861864 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 00:27:33.864343 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 13 00:27:33.864545 systemd-modules-load[222]: Inserted module 'br_netfilter' Sep 13 00:27:33.866281 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 13 00:27:33.867650 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 13 00:27:33.876506 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 13 00:27:33.884050 systemd-tmpfiles[242]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 13 00:27:33.889468 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 13 00:27:33.890348 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 13 00:27:33.892915 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 13 00:27:33.893382 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Sep 13 00:27:33.896659 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 13 00:27:33.921428 dracut-cmdline[264]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=21b29c6e420cf06e0546ff797fc1285d986af130e4ba1abb9f27cb6343b53294 Sep 13 00:27:33.951028 systemd-resolved[263]: Positive Trust Anchors: Sep 13 00:27:33.951042 systemd-resolved[263]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 13 00:27:33.951074 systemd-resolved[263]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 13 00:27:33.955170 systemd-resolved[263]: Defaulting to hostname 'linux'. Sep 13 00:27:33.956450 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 13 00:27:33.961096 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 13 00:27:34.066839 kernel: SCSI subsystem initialized Sep 13 00:27:34.081809 kernel: Loading iSCSI transport class v2.0-870. 
Sep 13 00:27:34.095801 kernel: iscsi: registered transport (tcp) Sep 13 00:27:34.119818 kernel: iscsi: registered transport (qla4xxx) Sep 13 00:27:34.119916 kernel: QLogic iSCSI HBA Driver Sep 13 00:27:34.146794 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 13 00:27:34.172038 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 13 00:27:34.173734 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 13 00:27:34.301877 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 13 00:27:34.304281 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 13 00:27:34.370814 kernel: raid6: avx2x4 gen() 24758 MB/s Sep 13 00:27:34.387822 kernel: raid6: avx2x2 gen() 30603 MB/s Sep 13 00:27:34.404868 kernel: raid6: avx2x1 gen() 25542 MB/s Sep 13 00:27:34.404961 kernel: raid6: using algorithm avx2x2 gen() 30603 MB/s Sep 13 00:27:34.422859 kernel: raid6: .... xor() 19891 MB/s, rmw enabled Sep 13 00:27:34.422899 kernel: raid6: using avx2x2 recovery algorithm Sep 13 00:27:34.445813 kernel: xor: automatically using best checksumming function avx Sep 13 00:27:34.653811 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 13 00:27:34.663010 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 13 00:27:34.665973 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 13 00:27:34.705376 systemd-udevd[475]: Using default interface naming scheme 'v255'. Sep 13 00:27:34.711704 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 13 00:27:34.715517 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 13 00:27:34.753102 dracut-pre-trigger[482]: rd.md=0: removing MD RAID activation Sep 13 00:27:34.787163 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Sep 13 00:27:34.790035 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 13 00:27:34.870898 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 13 00:27:34.875329 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 13 00:27:34.918807 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Sep 13 00:27:34.932082 kernel: cryptd: max_cpu_qlen set to 1000 Sep 13 00:27:34.940867 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 13 00:27:34.949722 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 13 00:27:34.949745 kernel: GPT:9289727 != 19775487 Sep 13 00:27:34.949755 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 13 00:27:34.949765 kernel: GPT:9289727 != 19775487 Sep 13 00:27:34.949787 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 13 00:27:34.949797 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 00:27:34.954998 kernel: libata version 3.00 loaded. Sep 13 00:27:34.960794 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 13 00:27:34.962987 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 13 00:27:34.973053 kernel: ahci 0000:00:1f.2: version 3.0 Sep 13 00:27:34.973934 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 13 00:27:34.973951 kernel: AES CTR mode by8 optimization enabled Sep 13 00:27:34.970343 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 00:27:34.973462 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 13 00:27:34.980238 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 13 00:27:34.980459 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 13 00:27:34.980637 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 13 00:27:34.981429 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 00:27:34.986719 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 13 00:27:34.990147 kernel: scsi host0: ahci Sep 13 00:27:34.996397 kernel: scsi host1: ahci Sep 13 00:27:35.019851 kernel: scsi host2: ahci Sep 13 00:27:35.022338 kernel: scsi host3: ahci Sep 13 00:27:35.024618 kernel: scsi host4: ahci Sep 13 00:27:35.024835 kernel: scsi host5: ahci Sep 13 00:27:35.025010 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 1 Sep 13 00:27:35.025026 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 1 Sep 13 00:27:35.026452 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 1 Sep 13 00:27:35.026474 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 1 Sep 13 00:27:35.028229 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 1 Sep 13 00:27:35.028252 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 1 Sep 13 00:27:35.038402 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 13 00:27:35.041184 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 00:27:35.058138 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Sep 13 00:27:35.072989 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. 
Sep 13 00:27:35.076352 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 13 00:27:35.091155 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 13 00:27:35.095417 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 13 00:27:35.132624 disk-uuid[634]: Primary Header is updated. Sep 13 00:27:35.132624 disk-uuid[634]: Secondary Entries is updated. Sep 13 00:27:35.132624 disk-uuid[634]: Secondary Header is updated. Sep 13 00:27:35.141801 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 00:27:35.148809 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 00:27:35.334825 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 13 00:27:35.334931 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 13 00:27:35.335943 kernel: ata3.00: LPM support broken, forcing max_power Sep 13 00:27:35.335971 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 13 00:27:35.337371 kernel: ata3.00: applying bridge limits Sep 13 00:27:35.337794 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 13 00:27:35.338807 kernel: ata3.00: LPM support broken, forcing max_power Sep 13 00:27:35.339804 kernel: ata3.00: configured for UDMA/100 Sep 13 00:27:35.341813 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 13 00:27:35.344811 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 13 00:27:35.344877 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 13 00:27:35.345805 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 13 00:27:35.414823 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 13 00:27:35.415243 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 13 00:27:35.441919 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 13 00:27:35.892504 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. 
Sep 13 00:27:35.894601 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 13 00:27:35.896389 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 13 00:27:35.897794 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 13 00:27:35.900945 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 13 00:27:35.933868 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 13 00:27:36.149983 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 13 00:27:36.150042 disk-uuid[635]: The operation has completed successfully. Sep 13 00:27:36.188715 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 13 00:27:36.188871 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 13 00:27:36.233585 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 13 00:27:36.262868 sh[663]: Success Sep 13 00:27:36.283289 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 13 00:27:36.283350 kernel: device-mapper: uevent: version 1.0.3 Sep 13 00:27:36.284519 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 13 00:27:36.294800 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 13 00:27:36.329434 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 13 00:27:36.333200 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 13 00:27:36.356661 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Sep 13 00:27:36.363634 kernel: BTRFS: device fsid ca815b72-c68a-4b5e-8622-cfb6842bab47 devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (675) Sep 13 00:27:36.363662 kernel: BTRFS info (device dm-0): first mount of filesystem ca815b72-c68a-4b5e-8622-cfb6842bab47 Sep 13 00:27:36.363686 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 13 00:27:36.369469 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 13 00:27:36.369495 kernel: BTRFS info (device dm-0): enabling free space tree Sep 13 00:27:36.370716 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 13 00:27:36.372200 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 13 00:27:36.373848 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 13 00:27:36.374942 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 13 00:27:36.376756 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 13 00:27:36.409849 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (708) Sep 13 00:27:36.409933 kernel: BTRFS info (device vda6): first mount of filesystem 9cd66393-e258-466a-9c7b-a40c48e4924e Sep 13 00:27:36.409947 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 13 00:27:36.413930 kernel: BTRFS info (device vda6): turning on async discard Sep 13 00:27:36.413987 kernel: BTRFS info (device vda6): enabling free space tree Sep 13 00:27:36.420916 kernel: BTRFS info (device vda6): last unmount of filesystem 9cd66393-e258-466a-9c7b-a40c48e4924e Sep 13 00:27:36.421318 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 13 00:27:36.425289 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Sep 13 00:27:36.535377 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 13 00:27:36.537154 ignition[755]: Ignition 2.21.0 Sep 13 00:27:36.537162 ignition[755]: Stage: fetch-offline Sep 13 00:27:36.537197 ignition[755]: no configs at "/usr/lib/ignition/base.d" Sep 13 00:27:36.537207 ignition[755]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 13 00:27:36.537304 ignition[755]: parsed url from cmdline: "" Sep 13 00:27:36.537308 ignition[755]: no config URL provided Sep 13 00:27:36.537314 ignition[755]: reading system config file "/usr/lib/ignition/user.ign" Sep 13 00:27:36.537322 ignition[755]: no config at "/usr/lib/ignition/user.ign" Sep 13 00:27:36.537346 ignition[755]: op(1): [started] loading QEMU firmware config module Sep 13 00:27:36.537354 ignition[755]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 13 00:27:36.548797 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 13 00:27:36.555908 ignition[755]: op(1): [finished] loading QEMU firmware config module Sep 13 00:27:36.594468 systemd-networkd[853]: lo: Link UP Sep 13 00:27:36.594481 systemd-networkd[853]: lo: Gained carrier Sep 13 00:27:36.596198 systemd-networkd[853]: Enumeration completed Sep 13 00:27:36.596611 systemd-networkd[853]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 00:27:36.596616 systemd-networkd[853]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 13 00:27:36.596929 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 13 00:27:36.599084 systemd-networkd[853]: eth0: Link UP Sep 13 00:27:36.599336 systemd-networkd[853]: eth0: Gained carrier Sep 13 00:27:36.599346 systemd-networkd[853]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Sep 13 00:27:36.607311 ignition[755]: parsing config with SHA512: ebd19ad42c216b62e63ada4991a0b20f085f80ccdef5ffb9f38eb752570df33fe2e73ac3aa33f753c5f6adf1c6a9bc71fb7440ff389a7d004161ef3f978878d4 Sep 13 00:27:36.599925 systemd[1]: Reached target network.target - Network. Sep 13 00:27:36.614302 unknown[755]: fetched base config from "system" Sep 13 00:27:36.614318 unknown[755]: fetched user config from "qemu" Sep 13 00:27:36.614791 ignition[755]: fetch-offline: fetch-offline passed Sep 13 00:27:36.614870 ignition[755]: Ignition finished successfully Sep 13 00:27:36.616826 systemd-networkd[853]: eth0: DHCPv4 address 10.0.0.95/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 13 00:27:36.617692 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 13 00:27:36.619077 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 13 00:27:36.620167 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 13 00:27:36.672032 ignition[858]: Ignition 2.21.0 Sep 13 00:27:36.672048 ignition[858]: Stage: kargs Sep 13 00:27:36.672219 ignition[858]: no configs at "/usr/lib/ignition/base.d" Sep 13 00:27:36.672230 ignition[858]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 13 00:27:36.675894 ignition[858]: kargs: kargs passed Sep 13 00:27:36.675990 ignition[858]: Ignition finished successfully Sep 13 00:27:36.681367 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 13 00:27:36.684894 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Sep 13 00:27:36.722443 ignition[866]: Ignition 2.21.0 Sep 13 00:27:36.722457 ignition[866]: Stage: disks Sep 13 00:27:36.722626 ignition[866]: no configs at "/usr/lib/ignition/base.d" Sep 13 00:27:36.722638 ignition[866]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 13 00:27:36.727557 ignition[866]: disks: disks passed Sep 13 00:27:36.728282 ignition[866]: Ignition finished successfully Sep 13 00:27:36.731738 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 13 00:27:36.732315 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 13 00:27:36.734184 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 13 00:27:36.734536 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 13 00:27:36.735208 systemd[1]: Reached target sysinit.target - System Initialization. Sep 13 00:27:36.735554 systemd[1]: Reached target basic.target - Basic System. Sep 13 00:27:36.737291 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 13 00:27:36.778823 systemd-fsck[876]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 13 00:27:36.787311 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 13 00:27:36.788722 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 13 00:27:36.935815 kernel: EXT4-fs (vda9): mounted filesystem 7f859ed0-e8c8-40c1-91d3-e1e964d8c4e8 r/w with ordered data mode. Quota mode: none. Sep 13 00:27:36.936402 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 13 00:27:36.937726 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 13 00:27:36.939692 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 13 00:27:36.942314 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 13 00:27:36.943697 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. 
Sep 13 00:27:36.943741 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 13 00:27:36.943767 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 13 00:27:36.962495 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 13 00:27:36.965358 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 13 00:27:36.971827 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (884) Sep 13 00:27:36.971864 kernel: BTRFS info (device vda6): first mount of filesystem 9cd66393-e258-466a-9c7b-a40c48e4924e Sep 13 00:27:36.973805 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 13 00:27:36.978498 kernel: BTRFS info (device vda6): turning on async discard Sep 13 00:27:36.978564 kernel: BTRFS info (device vda6): enabling free space tree Sep 13 00:27:36.981253 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 13 00:27:37.010109 initrd-setup-root[908]: cut: /sysroot/etc/passwd: No such file or directory Sep 13 00:27:37.015268 initrd-setup-root[915]: cut: /sysroot/etc/group: No such file or directory Sep 13 00:27:37.019753 initrd-setup-root[922]: cut: /sysroot/etc/shadow: No such file or directory Sep 13 00:27:37.024560 initrd-setup-root[929]: cut: /sysroot/etc/gshadow: No such file or directory Sep 13 00:27:37.128681 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 13 00:27:37.131096 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 13 00:27:37.133018 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 13 00:27:37.153796 kernel: BTRFS info (device vda6): last unmount of filesystem 9cd66393-e258-466a-9c7b-a40c48e4924e Sep 13 00:27:37.167462 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Sep 13 00:27:37.185808 ignition[998]: INFO : Ignition 2.21.0
Sep 13 00:27:37.185808 ignition[998]: INFO : Stage: mount
Sep 13 00:27:37.188967 ignition[998]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 13 00:27:37.188967 ignition[998]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 13 00:27:37.192803 ignition[998]: INFO : mount: mount passed
Sep 13 00:27:37.193678 ignition[998]: INFO : Ignition finished successfully
Sep 13 00:27:37.197538 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 13 00:27:37.199046 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 13 00:27:37.362436 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 13 00:27:37.364231 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 13 00:27:37.398802 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1010)
Sep 13 00:27:37.400975 kernel: BTRFS info (device vda6): first mount of filesystem 9cd66393-e258-466a-9c7b-a40c48e4924e
Sep 13 00:27:37.400993 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Sep 13 00:27:37.404305 kernel: BTRFS info (device vda6): turning on async discard
Sep 13 00:27:37.404326 kernel: BTRFS info (device vda6): enabling free space tree
Sep 13 00:27:37.406477 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 13 00:27:37.438363 ignition[1027]: INFO : Ignition 2.21.0
Sep 13 00:27:37.438363 ignition[1027]: INFO : Stage: files
Sep 13 00:27:37.440481 ignition[1027]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 13 00:27:37.440481 ignition[1027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 13 00:27:37.440481 ignition[1027]: DEBUG : files: compiled without relabeling support, skipping
Sep 13 00:27:37.444136 ignition[1027]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 13 00:27:37.444136 ignition[1027]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 13 00:27:37.449149 ignition[1027]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 13 00:27:37.450835 ignition[1027]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 13 00:27:37.452395 ignition[1027]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 13 00:27:37.451217 unknown[1027]: wrote ssh authorized keys file for user: core
Sep 13 00:27:37.455320 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 13 00:27:37.455320 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Sep 13 00:27:37.513376 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 13 00:27:37.693556 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 13 00:27:37.693556 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 13 00:27:37.698816 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 13 00:27:37.698816 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 13 00:27:37.698816 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 13 00:27:37.698816 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 13 00:27:37.698816 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 13 00:27:37.698816 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 13 00:27:37.698816 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 13 00:27:37.763198 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 13 00:27:37.763198 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 13 00:27:37.763198 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 13 00:27:37.771182 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 13 00:27:37.774389 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 13 00:27:37.776958 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
Sep 13 00:27:38.250109 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 13 00:27:38.670967 systemd-networkd[853]: eth0: Gained IPv6LL
Sep 13 00:27:38.916116 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 13 00:27:38.916116 ignition[1027]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 13 00:27:38.921025 ignition[1027]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 13 00:27:38.923292 ignition[1027]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 13 00:27:38.923292 ignition[1027]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 13 00:27:38.923292 ignition[1027]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 13 00:27:38.923292 ignition[1027]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 13 00:27:38.923292 ignition[1027]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Sep 13 00:27:38.923292 ignition[1027]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 13 00:27:38.923292 ignition[1027]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Sep 13 00:27:38.952422 ignition[1027]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Sep 13 00:27:38.956742 ignition[1027]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Sep 13 00:27:38.958682 ignition[1027]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Sep 13 00:27:38.960345 ignition[1027]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Sep 13 00:27:38.960345 ignition[1027]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Sep 13 00:27:38.963451 ignition[1027]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 13 00:27:38.963451 ignition[1027]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 13 00:27:38.963451 ignition[1027]: INFO : files: files passed
Sep 13 00:27:38.963451 ignition[1027]: INFO : Ignition finished successfully
Sep 13 00:27:38.972123 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 13 00:27:38.974862 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 13 00:27:38.976891 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 13 00:27:38.999523 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 13 00:27:38.999659 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 13 00:27:39.005029 initrd-setup-root-after-ignition[1056]: grep: /sysroot/oem/oem-release: No such file or directory
Sep 13 00:27:39.009114 initrd-setup-root-after-ignition[1058]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 13 00:27:39.009114 initrd-setup-root-after-ignition[1058]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 13 00:27:39.012638 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 13 00:27:39.011939 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 13 00:27:39.013246 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 13 00:27:39.018531 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 13 00:27:39.089560 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 13 00:27:39.089718 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 13 00:27:39.090505 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 13 00:27:39.095259 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 13 00:27:39.095597 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 13 00:27:39.096504 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 13 00:27:39.133488 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 13 00:27:39.135682 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 13 00:27:39.163540 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 13 00:27:39.166054 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 13 00:27:39.166619 systemd[1]: Stopped target timers.target - Timer Units.
Sep 13 00:27:39.167159 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 13 00:27:39.167293 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 13 00:27:39.172556 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 13 00:27:39.173453 systemd[1]: Stopped target basic.target - Basic System.
Sep 13 00:27:39.174133 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 13 00:27:39.174569 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 13 00:27:39.175637 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 13 00:27:39.182642 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 13 00:27:39.183178 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 13 00:27:39.183541 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 13 00:27:39.184108 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 13 00:27:39.184458 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 13 00:27:39.184828 systemd[1]: Stopped target swap.target - Swaps.
Sep 13 00:27:39.185339 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 13 00:27:39.185465 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 13 00:27:39.198404 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 13 00:27:39.199102 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 13 00:27:39.199420 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 13 00:27:39.203908 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 13 00:27:39.204597 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 13 00:27:39.204718 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 13 00:27:39.208422 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 13 00:27:39.208544 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 13 00:27:39.211189 systemd[1]: Stopped target paths.target - Path Units.
Sep 13 00:27:39.211444 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 13 00:27:39.218866 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 13 00:27:39.221767 systemd[1]: Stopped target slices.target - Slice Units.
Sep 13 00:27:39.222322 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 13 00:27:39.222680 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 13 00:27:39.222798 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 13 00:27:39.226101 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 13 00:27:39.226202 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 13 00:27:39.227791 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 13 00:27:39.227929 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 13 00:27:39.228304 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 13 00:27:39.228425 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 13 00:27:39.234532 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 13 00:27:39.236120 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 13 00:27:39.238040 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 13 00:27:39.238171 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 13 00:27:39.244751 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 13 00:27:39.245914 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 13 00:27:39.252887 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 13 00:27:39.253039 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 13 00:27:39.271206 ignition[1082]: INFO : Ignition 2.21.0
Sep 13 00:27:39.271206 ignition[1082]: INFO : Stage: umount
Sep 13 00:27:39.274063 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 13 00:27:39.276543 ignition[1082]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 13 00:27:39.276543 ignition[1082]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Sep 13 00:27:39.278912 ignition[1082]: INFO : umount: umount passed
Sep 13 00:27:39.278912 ignition[1082]: INFO : Ignition finished successfully
Sep 13 00:27:39.280935 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 13 00:27:39.281101 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 13 00:27:39.281902 systemd[1]: Stopped target network.target - Network.
Sep 13 00:27:39.283436 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 13 00:27:39.283506 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 13 00:27:39.285645 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 13 00:27:39.285700 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 13 00:27:39.286185 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 13 00:27:39.286244 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 13 00:27:39.286542 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 13 00:27:39.286608 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 13 00:27:39.287233 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 13 00:27:39.294535 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 13 00:27:39.303695 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 13 00:27:39.303855 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 13 00:27:39.309342 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 13 00:27:39.310105 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 13 00:27:39.310201 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 13 00:27:39.315697 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 13 00:27:39.321434 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 13 00:27:39.321592 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 13 00:27:39.326592 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 13 00:27:39.326796 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 13 00:27:39.327437 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 13 00:27:39.327479 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 13 00:27:39.328630 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 13 00:27:39.332514 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 13 00:27:39.332571 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 13 00:27:39.334631 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 13 00:27:39.334678 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 13 00:27:39.339987 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 13 00:27:39.340040 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 13 00:27:39.342068 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 13 00:27:39.343462 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 13 00:27:39.363264 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 13 00:27:39.363404 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 13 00:27:39.371754 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 13 00:27:39.371989 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 13 00:27:39.372636 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 13 00:27:39.372686 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 13 00:27:39.375679 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 13 00:27:39.375723 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 13 00:27:39.377837 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 13 00:27:39.377893 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 13 00:27:39.378755 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 13 00:27:39.378820 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 13 00:27:39.385441 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 13 00:27:39.385497 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 13 00:27:39.390405 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 13 00:27:39.391058 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 13 00:27:39.391112 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 13 00:27:39.394981 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 13 00:27:39.395048 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 13 00:27:39.398469 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 13 00:27:39.398520 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 13 00:27:39.403030 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 13 00:27:39.403083 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 13 00:27:39.403760 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 13 00:27:39.403838 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 13 00:27:39.417705 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 13 00:27:39.417887 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 13 00:27:39.459883 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 13 00:27:39.460041 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 13 00:27:39.460920 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 13 00:27:39.461364 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 13 00:27:39.461422 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 13 00:27:39.462702 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 13 00:27:39.492111 systemd[1]: Switching root.
Sep 13 00:27:39.531974 systemd-journald[221]: Journal stopped
Sep 13 00:27:40.962741 systemd-journald[221]: Received SIGTERM from PID 1 (systemd).
Sep 13 00:27:40.962835 kernel: SELinux: policy capability network_peer_controls=1
Sep 13 00:27:40.962859 kernel: SELinux: policy capability open_perms=1
Sep 13 00:27:40.962876 kernel: SELinux: policy capability extended_socket_class=1
Sep 13 00:27:40.962889 kernel: SELinux: policy capability always_check_network=0
Sep 13 00:27:40.962903 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 13 00:27:40.962917 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 13 00:27:40.962930 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 13 00:27:40.962944 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 13 00:27:40.962966 kernel: SELinux: policy capability userspace_initial_context=0
Sep 13 00:27:40.962980 kernel: audit: type=1403 audit(1757723260.070:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 13 00:27:40.962996 systemd[1]: Successfully loaded SELinux policy in 52.455ms.
Sep 13 00:27:40.963020 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 14.314ms.
Sep 13 00:27:40.963042 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 13 00:27:40.963057 systemd[1]: Detected virtualization kvm.
Sep 13 00:27:40.963072 systemd[1]: Detected architecture x86-64.
Sep 13 00:27:40.963086 systemd[1]: Detected first boot.
Sep 13 00:27:40.963101 systemd[1]: Initializing machine ID from VM UUID.
Sep 13 00:27:40.963115 zram_generator::config[1129]: No configuration found.
Sep 13 00:27:40.963130 kernel: Guest personality initialized and is inactive
Sep 13 00:27:40.963152 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Sep 13 00:27:40.963166 kernel: Initialized host personality
Sep 13 00:27:40.963179 kernel: NET: Registered PF_VSOCK protocol family
Sep 13 00:27:40.963193 systemd[1]: Populated /etc with preset unit settings.
Sep 13 00:27:40.963208 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 13 00:27:40.963229 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 13 00:27:40.963243 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 13 00:27:40.963258 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 13 00:27:40.963279 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 13 00:27:40.963296 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 13 00:27:40.963311 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 13 00:27:40.963325 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 13 00:27:40.963340 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 13 00:27:40.963355 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 13 00:27:40.963369 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 13 00:27:40.963384 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 13 00:27:40.963399 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 13 00:27:40.963413 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 13 00:27:40.963430 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 13 00:27:40.963446 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 13 00:27:40.963466 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 13 00:27:40.963481 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 13 00:27:40.963496 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 13 00:27:40.963511 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 13 00:27:40.963526 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 13 00:27:40.963542 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 13 00:27:40.963557 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 13 00:27:40.963574 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 13 00:27:40.963588 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 13 00:27:40.963602 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 13 00:27:40.963617 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 13 00:27:40.963632 systemd[1]: Reached target slices.target - Slice Units.
Sep 13 00:27:40.963646 systemd[1]: Reached target swap.target - Swaps.
Sep 13 00:27:40.963661 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 13 00:27:40.963684 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 13 00:27:40.963699 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 13 00:27:40.963715 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 13 00:27:40.963733 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 13 00:27:40.963747 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 13 00:27:40.963761 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 13 00:27:40.963792 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 13 00:27:40.963806 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 13 00:27:40.963821 systemd[1]: Mounting media.mount - External Media Directory...
Sep 13 00:27:40.963838 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 00:27:40.963852 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 13 00:27:40.963866 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 13 00:27:40.963880 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 13 00:27:40.963895 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 13 00:27:40.963910 systemd[1]: Reached target machines.target - Containers.
Sep 13 00:27:40.963924 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 13 00:27:40.963940 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 13 00:27:40.963964 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 13 00:27:40.963981 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 13 00:27:40.963996 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 13 00:27:40.964010 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 13 00:27:40.964028 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 13 00:27:40.964042 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 13 00:27:40.964058 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 13 00:27:40.964073 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 13 00:27:40.964088 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 13 00:27:40.964108 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 13 00:27:40.964122 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 13 00:27:40.964136 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 13 00:27:40.964151 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 13 00:27:40.964166 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 13 00:27:40.964180 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 13 00:27:40.964194 kernel: loop: module loaded
Sep 13 00:27:40.964208 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 13 00:27:40.964223 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 13 00:27:40.964243 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 13 00:27:40.964257 kernel: fuse: init (API version 7.41)
Sep 13 00:27:40.964271 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 13 00:27:40.964285 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 13 00:27:40.964300 systemd[1]: Stopped verity-setup.service.
Sep 13 00:27:40.964320 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 00:27:40.964334 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 13 00:27:40.964349 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 13 00:27:40.964363 systemd[1]: Mounted media.mount - External Media Directory.
Sep 13 00:27:40.964378 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 13 00:27:40.964402 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 13 00:27:40.964417 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 13 00:27:40.964431 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 13 00:27:40.964446 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 13 00:27:40.964461 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 13 00:27:40.964475 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 13 00:27:40.964490 kernel: ACPI: bus type drm_connector registered
Sep 13 00:27:40.964504 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 13 00:27:40.964541 systemd-journald[1197]: Collecting audit messages is disabled.
Sep 13 00:27:40.964577 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 13 00:27:40.964592 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 13 00:27:40.964606 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 13 00:27:40.964620 systemd-journald[1197]: Journal started
Sep 13 00:27:40.964646 systemd-journald[1197]: Runtime Journal (/run/log/journal/b4594c1a2f53460ab78036b11cbde580) is 6M, max 48.5M, 42.4M free.
Sep 13 00:27:40.682179 systemd[1]: Queued start job for default target multi-user.target.
Sep 13 00:27:40.704131 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Sep 13 00:27:40.704655 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 13 00:27:40.966914 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 13 00:27:40.968944 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 13 00:27:40.970783 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 13 00:27:40.971035 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 13 00:27:40.972637 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 13 00:27:40.974259 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 13 00:27:40.974501 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 13 00:27:40.976082 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 13 00:27:40.977667 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 13 00:27:40.979500 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 13 00:27:40.981233 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 13 00:27:40.994743 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 13 00:27:40.997491 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 13 00:27:40.999713 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 13 00:27:41.000913 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 13 00:27:41.000945 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 13 00:27:41.003074 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 13 00:27:41.008045 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 13 00:27:41.012169 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 13 00:27:41.013909 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 13 00:27:41.016963 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 13 00:27:41.018625 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 13 00:27:41.021812 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 13 00:27:41.023195 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 13 00:27:41.024541 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 13 00:27:41.028353 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 13 00:27:41.032536 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 13 00:27:41.036339 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 13 00:27:41.037719 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 13 00:27:41.039299 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 13 00:27:41.045266 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 13 00:27:41.047863 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 13 00:27:41.050362 systemd-journald[1197]: Time spent on flushing to /var/log/journal/b4594c1a2f53460ab78036b11cbde580 is 41.233ms for 1072 entries.
Sep 13 00:27:41.050362 systemd-journald[1197]: System Journal (/var/log/journal/b4594c1a2f53460ab78036b11cbde580) is 8M, max 195.6M, 187.6M free.
Sep 13 00:27:41.121292 systemd-journald[1197]: Received client request to flush runtime journal.
Sep 13 00:27:41.121362 kernel: loop0: detected capacity change from 0 to 113872
Sep 13 00:27:41.121390 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 13 00:27:41.064234 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 13 00:27:41.068731 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 13 00:27:41.100392 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 13 00:27:41.105896 systemd-tmpfiles[1249]: ACLs are not supported, ignoring.
Sep 13 00:27:41.105909 systemd-tmpfiles[1249]: ACLs are not supported, ignoring.
Sep 13 00:27:41.111589 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 13 00:27:41.114641 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 13 00:27:41.128829 kernel: loop1: detected capacity change from 0 to 146240
Sep 13 00:27:41.129392 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 13 00:27:41.159141 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 13 00:27:41.161864 kernel: loop2: detected capacity change from 0 to 221472
Sep 13 00:27:41.164188 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 13 00:27:41.195804 kernel: loop3: detected capacity change from 0 to 113872
Sep 13 00:27:41.199330 systemd-tmpfiles[1269]: ACLs are not supported, ignoring.
Sep 13 00:27:41.199352 systemd-tmpfiles[1269]: ACLs are not supported, ignoring.
Sep 13 00:27:41.206177 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 13 00:27:41.221809 kernel: loop4: detected capacity change from 0 to 146240
Sep 13 00:27:41.237812 kernel: loop5: detected capacity change from 0 to 221472
Sep 13 00:27:41.248477 (sd-merge)[1272]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Sep 13 00:27:41.249194 (sd-merge)[1272]: Merged extensions into '/usr'.
Sep 13 00:27:41.257112 systemd[1]: Reload requested from client PID 1248 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 13 00:27:41.257132 systemd[1]: Reloading...
Sep 13 00:27:41.354821 zram_generator::config[1305]: No configuration found.
Sep 13 00:27:41.473343 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 13 00:27:41.477874 ldconfig[1243]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 13 00:27:41.575344 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 13 00:27:41.576003 systemd[1]: Reloading finished in 318 ms.
Sep 13 00:27:41.607855 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 13 00:27:41.609597 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 13 00:27:41.634453 systemd[1]: Starting ensure-sysext.service...
Sep 13 00:27:41.636562 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 13 00:27:41.652547 systemd[1]: Reload requested from client PID 1336 ('systemctl') (unit ensure-sysext.service)...
Sep 13 00:27:41.652566 systemd[1]: Reloading...
Sep 13 00:27:41.676261 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 13 00:27:41.676548 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 13 00:27:41.676967 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 13 00:27:41.677277 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 13 00:27:41.678373 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 13 00:27:41.678704 systemd-tmpfiles[1337]: ACLs are not supported, ignoring.
Sep 13 00:27:41.678803 systemd-tmpfiles[1337]: ACLs are not supported, ignoring.
Sep 13 00:27:41.683976 systemd-tmpfiles[1337]: Detected autofs mount point /boot during canonicalization of boot.
Sep 13 00:27:41.683988 systemd-tmpfiles[1337]: Skipping /boot
Sep 13 00:27:41.708450 systemd-tmpfiles[1337]: Detected autofs mount point /boot during canonicalization of boot.
Sep 13 00:27:41.708730 systemd-tmpfiles[1337]: Skipping /boot
Sep 13 00:27:41.727904 zram_generator::config[1369]: No configuration found.
Sep 13 00:27:41.835057 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 13 00:27:41.936563 systemd[1]: Reloading finished in 283 ms.
Sep 13 00:27:41.963407 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 13 00:27:41.993528 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 13 00:27:42.004001 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 13 00:27:42.007062 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 13 00:27:42.009648 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 13 00:27:42.021197 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 13 00:27:42.024298 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 13 00:27:42.027346 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 13 00:27:42.031595 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 00:27:42.031811 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 13 00:27:42.042116 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 13 00:27:42.045381 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 13 00:27:42.049222 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 13 00:27:42.050633 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 13 00:27:42.050760 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 13 00:27:42.050890 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 00:27:42.056895 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 00:27:42.057156 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 13 00:27:42.057394 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 13 00:27:42.057555 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 13 00:27:42.062032 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 13 00:27:42.063253 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 00:27:42.064531 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 13 00:27:42.066705 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 13 00:27:42.068734 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 13 00:27:42.073954 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 13 00:27:42.075973 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 13 00:27:42.076205 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 13 00:27:42.078078 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 13 00:27:42.078309 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 13 00:27:42.082327 systemd-udevd[1407]: Using default interface naming scheme 'v255'.
Sep 13 00:27:42.096437 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 00:27:42.096712 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 13 00:27:42.099898 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 13 00:27:42.103062 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 13 00:27:42.105260 augenrules[1439]: No rules
Sep 13 00:27:42.110532 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 13 00:27:42.115044 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 13 00:27:42.117592 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 13 00:27:42.117731 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 13 00:27:42.123130 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 13 00:27:42.124338 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 13 00:27:42.125841 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 13 00:27:42.128508 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 13 00:27:42.128814 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 13 00:27:42.135964 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 13 00:27:42.138904 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 13 00:27:42.139151 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 13 00:27:42.140983 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 13 00:27:42.141371 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 13 00:27:42.143674 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 13 00:27:42.143935 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 13 00:27:42.152505 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 13 00:27:42.154343 systemd[1]: Finished ensure-sysext.service.
Sep 13 00:27:42.157500 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 13 00:27:42.160130 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 13 00:27:42.173461 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 13 00:27:42.174912 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 13 00:27:42.175010 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 13 00:27:42.178958 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 13 00:27:42.180845 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 13 00:27:42.187174 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 13 00:27:42.231885 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 13 00:27:42.294036 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Sep 13 00:27:42.299041 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 13 00:27:42.321374 kernel: mousedev: PS/2 mouse device common for all mice
Sep 13 00:27:42.328226 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 13 00:27:42.359706 systemd-networkd[1487]: lo: Link UP
Sep 13 00:27:42.359717 systemd-networkd[1487]: lo: Gained carrier
Sep 13 00:27:42.361491 systemd-networkd[1487]: Enumeration completed
Sep 13 00:27:42.361604 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 13 00:27:42.362083 systemd-networkd[1487]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 13 00:27:42.362089 systemd-networkd[1487]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 13 00:27:42.363578 systemd-networkd[1487]: eth0: Link UP
Sep 13 00:27:42.363730 systemd-networkd[1487]: eth0: Gained carrier
Sep 13 00:27:42.363746 systemd-networkd[1487]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 13 00:27:42.364737 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 13 00:27:42.369033 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 13 00:27:42.373826 systemd-resolved[1406]: Positive Trust Anchors:
Sep 13 00:27:42.373843 systemd-resolved[1406]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 13 00:27:42.373877 systemd-resolved[1406]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 13 00:27:42.377798 systemd-resolved[1406]: Defaulting to hostname 'linux'.
Sep 13 00:27:42.379525 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 13 00:27:42.380877 systemd[1]: Reached target network.target - Network.
Sep 13 00:27:42.381902 systemd-networkd[1487]: eth0: DHCPv4 address 10.0.0.95/16, gateway 10.0.0.1 acquired from 10.0.0.1
Sep 13 00:27:42.381977 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 13 00:27:42.384809 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Sep 13 00:27:42.393791 kernel: ACPI: button: Power Button [PWRF]
Sep 13 00:27:42.404218 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 13 00:27:43.471201 systemd-resolved[1406]: Clock change detected. Flushing caches.
Sep 13 00:27:43.471232 systemd-timesyncd[1489]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Sep 13 00:27:43.471286 systemd-timesyncd[1489]: Initial clock synchronization to Sat 2025-09-13 00:27:43.471145 UTC.
Sep 13 00:27:43.473049 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 13 00:27:43.474892 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 13 00:27:43.476284 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 13 00:27:43.478710 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 13 00:27:43.479357 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device
Sep 13 00:27:43.481135 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Sep 13 00:27:43.481315 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 13 00:27:43.482218 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 13 00:27:43.483517 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 13 00:27:43.484965 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 13 00:27:43.485025 systemd[1]: Reached target paths.target - Path Units.
Sep 13 00:27:43.486101 systemd[1]: Reached target time-set.target - System Time Set.
Sep 13 00:27:43.487459 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 13 00:27:43.488754 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 13 00:27:43.490218 systemd[1]: Reached target timers.target - Timer Units.
Sep 13 00:27:43.493194 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 13 00:27:43.497167 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 13 00:27:43.502719 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 13 00:27:43.504834 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 13 00:27:43.506393 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 13 00:27:43.516355 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 13 00:27:43.517946 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 13 00:27:43.523262 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 13 00:27:43.525324 systemd[1]: Reached target sockets.target - Socket Units.
Sep 13 00:27:43.526760 systemd[1]: Reached target basic.target - Basic System.
Sep 13 00:27:43.527896 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 13 00:27:43.527932 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 13 00:27:43.529597 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 13 00:27:43.536111 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 13 00:27:43.544153 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 13 00:27:43.547270 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 13 00:27:43.551620 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 13 00:27:43.552814 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 13 00:27:43.556571 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 13 00:27:43.561731 jq[1528]: false
Sep 13 00:27:43.564175 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 13 00:27:43.567656 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 13 00:27:43.573466 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 13 00:27:43.575703 extend-filesystems[1529]: Found /dev/vda6
Sep 13 00:27:43.577005 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 13 00:27:43.579878 extend-filesystems[1529]: Found /dev/vda9
Sep 13 00:27:43.581563 extend-filesystems[1529]: Checking size of /dev/vda9
Sep 13 00:27:43.587575 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 13 00:27:43.589726 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 13 00:27:43.591442 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 13 00:27:43.595214 extend-filesystems[1529]: Resized partition /dev/vda9
Sep 13 00:27:43.599462 systemd[1]: Starting update-engine.service - Update Engine...
Sep 13 00:27:43.601436 extend-filesystems[1551]: resize2fs 1.47.2 (1-Jan-2025)
Sep 13 00:27:43.603382 oslogin_cache_refresh[1530]: Refreshing passwd entry cache
Sep 13 00:27:43.604748 google_oslogin_nss_cache[1530]: oslogin_cache_refresh[1530]: Refreshing passwd entry cache
Sep 13 00:27:43.605904 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 13 00:27:43.607369 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Sep 13 00:27:43.615034 jq[1552]: true
Sep 13 00:27:43.615391 google_oslogin_nss_cache[1530]: oslogin_cache_refresh[1530]: Failure getting users, quitting
Sep 13 00:27:43.615391 google_oslogin_nss_cache[1530]: oslogin_cache_refresh[1530]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 13 00:27:43.615391 google_oslogin_nss_cache[1530]: oslogin_cache_refresh[1530]: Refreshing group entry cache
Sep 13 00:27:43.615124 oslogin_cache_refresh[1530]: Failure getting users, quitting
Sep 13 00:27:43.615147 oslogin_cache_refresh[1530]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 13 00:27:43.615219 oslogin_cache_refresh[1530]: Refreshing group entry cache
Sep 13 00:27:43.616271 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 13 00:27:43.618690 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 13 00:27:43.618998 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 13 00:27:43.619396 systemd[1]: motdgen.service: Deactivated successfully.
Sep 13 00:27:43.619687 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 13 00:27:43.622258 google_oslogin_nss_cache[1530]: oslogin_cache_refresh[1530]: Failure getting groups, quitting
Sep 13 00:27:43.622258 google_oslogin_nss_cache[1530]: oslogin_cache_refresh[1530]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 13 00:27:43.622247 oslogin_cache_refresh[1530]: Failure getting groups, quitting
Sep 13 00:27:43.622261 oslogin_cache_refresh[1530]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 13 00:27:43.623965 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Sep 13 00:27:43.624255 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Sep 13 00:27:43.629938 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 13 00:27:43.630311 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 13 00:27:43.643364 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Sep 13 00:27:43.648997 jq[1557]: true
Sep 13 00:27:43.686545 extend-filesystems[1551]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Sep 13 00:27:43.686545 extend-filesystems[1551]: old_desc_blocks = 1, new_desc_blocks = 1
Sep 13 00:27:43.686545 extend-filesystems[1551]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Sep 13 00:27:43.696368 extend-filesystems[1529]: Resized filesystem in /dev/vda9
Sep 13 00:27:43.712360 update_engine[1548]: I20250913 00:27:43.691367 1548 main.cc:92] Flatcar Update Engine starting
Sep 13 00:27:43.688399 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 13 00:27:43.689412 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 13 00:27:43.713525 (ntainerd)[1571]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 13 00:27:43.746116 dbus-daemon[1526]: [system] SELinux support is enabled
Sep 13 00:27:43.746359 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 13 00:27:43.759140 update_engine[1548]: I20250913 00:27:43.758976 1548 update_check_scheduler.cc:74] Next update check in 11m15s
Sep 13 00:27:43.764227 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 13 00:27:43.765401 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 13 00:27:43.798660 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 13 00:27:43.798845 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 13 00:27:43.814415 tar[1555]: linux-amd64/helm
Sep 13 00:27:43.818258 systemd[1]: Started update-engine.service - Update Engine.
Sep 13 00:27:43.850767 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 13 00:27:43.989650 systemd-logind[1546]: Watching system buttons on /dev/input/event2 (Power Button)
Sep 13 00:27:43.989681 systemd-logind[1546]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 13 00:27:43.996655 systemd-logind[1546]: New seat seat0.
Sep 13 00:27:44.004388 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 13 00:27:44.024385 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 13 00:27:44.033317 bash[1590]: Updated "/home/core/.ssh/authorized_keys"
Sep 13 00:27:44.034997 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 13 00:27:44.045793 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 13 00:27:44.046544 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 13 00:27:44.057720 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 13 00:27:44.063902 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 13 00:27:44.811411 systemd-networkd[1487]: eth0: Gained IPv6LL
Sep 13 00:27:44.812185 locksmithd[1586]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 13 00:27:44.824994 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 13 00:27:44.831552 systemd[1]: Reached target network-online.target - Network is Online.
Sep 13 00:27:44.839942 kernel: kvm_amd: TSC scaling supported
Sep 13 00:27:44.840023 kernel: kvm_amd: Nested Virtualization enabled
Sep 13 00:27:44.840039 kernel: kvm_amd: Nested Paging enabled
Sep 13 00:27:44.840053 kernel: kvm_amd: LBR virtualization supported
Sep 13 00:27:44.841470 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Sep 13 00:27:44.841511 kernel: kvm_amd: Virtual GIF supported
Sep 13 00:27:44.841912 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
Sep 13 00:27:44.850535 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 00:27:44.855262 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 13 00:27:44.959785 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 00:27:44.984735 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 13 00:27:44.985158 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 13 00:27:44.986995 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 13 00:27:44.998995 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 13 00:27:45.300379 kernel: EDAC MC: Ver: 3.0.0 Sep 13 00:27:45.368995 sshd_keygen[1569]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 13 00:27:45.531533 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 13 00:27:45.539461 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 13 00:27:45.700682 containerd[1571]: time="2025-09-13T00:27:45Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 13 00:27:45.701299 containerd[1571]: time="2025-09-13T00:27:45.701083506Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Sep 13 00:27:45.705981 systemd[1]: issuegen.service: Deactivated successfully. Sep 13 00:27:45.706309 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 13 00:27:45.716161 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Sep 13 00:27:45.735243 containerd[1571]: time="2025-09-13T00:27:45.735165044Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="16.431µs" Sep 13 00:27:45.735316 containerd[1571]: time="2025-09-13T00:27:45.735239904Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 13 00:27:45.735316 containerd[1571]: time="2025-09-13T00:27:45.735273828Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 13 00:27:45.737573 containerd[1571]: time="2025-09-13T00:27:45.735712941Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 13 00:27:45.737573 containerd[1571]: time="2025-09-13T00:27:45.735743829Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 13 00:27:45.737573 containerd[1571]: time="2025-09-13T00:27:45.735794284Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 13 00:27:45.737573 containerd[1571]: time="2025-09-13T00:27:45.735898469Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 13 00:27:45.737573 containerd[1571]: time="2025-09-13T00:27:45.735917445Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 13 00:27:45.737573 containerd[1571]: time="2025-09-13T00:27:45.736384862Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 13 00:27:45.737573 containerd[1571]: time="2025-09-13T00:27:45.736408576Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 13 00:27:45.737573 containerd[1571]: time="2025-09-13T00:27:45.736563907Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 13 00:27:45.737573 containerd[1571]: time="2025-09-13T00:27:45.736578655Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 13 00:27:45.737573 containerd[1571]: time="2025-09-13T00:27:45.736707046Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 13 00:27:45.737573 containerd[1571]: time="2025-09-13T00:27:45.737230858Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 13 00:27:45.737878 containerd[1571]: time="2025-09-13T00:27:45.737287284Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 13 00:27:45.737878 containerd[1571]: time="2025-09-13T00:27:45.737302302Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 13 00:27:45.737878 containerd[1571]: time="2025-09-13T00:27:45.737358498Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 13 00:27:45.739398 containerd[1571]: time="2025-09-13T00:27:45.739321279Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 13 00:27:45.740610 containerd[1571]: time="2025-09-13T00:27:45.739457695Z" level=info msg="metadata content store policy set" policy=shared Sep 13 00:27:45.781528 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. 
Sep 13 00:27:45.793066 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 13 00:27:45.805989 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 13 00:27:45.815458 systemd[1]: Reached target getty.target - Login Prompts. Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880106833Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880225375Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880246274Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880261363Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880284085Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880297120Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880311827Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880328439Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880392609Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880432353Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880448273Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880467399Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880691580Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 13 00:27:45.881318 containerd[1571]: time="2025-09-13T00:27:45.880725974Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 13 00:27:45.881866 containerd[1571]: time="2025-09-13T00:27:45.880746673Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 13 00:27:45.881866 containerd[1571]: time="2025-09-13T00:27:45.880763685Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 13 00:27:45.881866 containerd[1571]: time="2025-09-13T00:27:45.880778312Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 13 00:27:45.881866 containerd[1571]: time="2025-09-13T00:27:45.880792479Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 13 00:27:45.881866 containerd[1571]: time="2025-09-13T00:27:45.880808669Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 13 00:27:45.881866 containerd[1571]: time="2025-09-13T00:27:45.880822445Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 13 00:27:45.881866 containerd[1571]: time="2025-09-13T00:27:45.880836512Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 13 
00:27:45.881866 containerd[1571]: time="2025-09-13T00:27:45.880865626Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 13 00:27:45.881866 containerd[1571]: time="2025-09-13T00:27:45.880882478Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 13 00:27:45.881866 containerd[1571]: time="2025-09-13T00:27:45.880984539Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 13 00:27:45.881866 containerd[1571]: time="2025-09-13T00:27:45.881012552Z" level=info msg="Start snapshots syncer" Sep 13 00:27:45.881866 containerd[1571]: time="2025-09-13T00:27:45.881060722Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 13 00:27:45.882171 containerd[1571]: time="2025-09-13T00:27:45.881441597Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\"
:false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 13 00:27:45.882171 containerd[1571]: time="2025-09-13T00:27:45.881582290Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.881727673Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.881869720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.881898453Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.881914654Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.881929422Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.881958516Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 13 00:27:45.882456 containerd[1571]: 
time="2025-09-13T00:27:45.881973945Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.881989534Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.882026464Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.882041372Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.882054767Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.882093840Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.882114198Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 13 00:27:45.882456 containerd[1571]: time="2025-09-13T00:27:45.882126962Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 13 00:27:45.882814 containerd[1571]: time="2025-09-13T00:27:45.882140938Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 13 00:27:45.882814 containerd[1571]: time="2025-09-13T00:27:45.882151718Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 13 00:27:45.882814 containerd[1571]: time="2025-09-13T00:27:45.882164472Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 13 00:27:45.882814 containerd[1571]: time="2025-09-13T00:27:45.882177507Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 13 00:27:45.882814 containerd[1571]: time="2025-09-13T00:27:45.882200290Z" level=info msg="runtime interface created" Sep 13 00:27:45.882814 containerd[1571]: time="2025-09-13T00:27:45.882219395Z" level=info msg="created NRI interface" Sep 13 00:27:45.882814 containerd[1571]: time="2025-09-13T00:27:45.882229895Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 13 00:27:45.882814 containerd[1571]: time="2025-09-13T00:27:45.882242439Z" level=info msg="Connect containerd service" Sep 13 00:27:45.882814 containerd[1571]: time="2025-09-13T00:27:45.882270411Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 13 00:27:45.884455 containerd[1571]: time="2025-09-13T00:27:45.883677821Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 13 00:27:46.028771 tar[1555]: linux-amd64/LICENSE Sep 13 00:27:46.028771 tar[1555]: linux-amd64/README.md Sep 13 00:27:46.330052 kernel: hrtimer: interrupt took 3955429 ns Sep 13 00:27:46.345737 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Sep 13 00:27:46.693252 containerd[1571]: time="2025-09-13T00:27:46.692510991Z" level=info msg="Start subscribing containerd event" Sep 13 00:27:46.693252 containerd[1571]: time="2025-09-13T00:27:46.692622320Z" level=info msg="Start recovering state" Sep 13 00:27:46.693252 containerd[1571]: time="2025-09-13T00:27:46.692895622Z" level=info msg="Start event monitor" Sep 13 00:27:46.693252 containerd[1571]: time="2025-09-13T00:27:46.692920810Z" level=info msg="Start cni network conf syncer for default" Sep 13 00:27:46.693252 containerd[1571]: time="2025-09-13T00:27:46.692932171Z" level=info msg="Start streaming server" Sep 13 00:27:46.693252 containerd[1571]: time="2025-09-13T00:27:46.692963019Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 13 00:27:46.693252 containerd[1571]: time="2025-09-13T00:27:46.692975212Z" level=info msg="runtime interface starting up..." Sep 13 00:27:46.693252 containerd[1571]: time="2025-09-13T00:27:46.692984128Z" level=info msg="starting plugins..." Sep 13 00:27:46.693252 containerd[1571]: time="2025-09-13T00:27:46.693006801Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 13 00:27:46.694108 containerd[1571]: time="2025-09-13T00:27:46.694084182Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 13 00:27:46.694243 containerd[1571]: time="2025-09-13T00:27:46.694222972Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 13 00:27:46.694582 systemd[1]: Started containerd.service - containerd container runtime. Sep 13 00:27:46.697528 containerd[1571]: time="2025-09-13T00:27:46.697486223Z" level=info msg="containerd successfully booted in 0.997921s" Sep 13 00:27:48.670121 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 13 00:27:48.673202 systemd[1]: Started sshd@0-10.0.0.95:22-10.0.0.1:44118.service - OpenSSH per-connection server daemon (10.0.0.1:44118). 
Sep 13 00:27:48.969298 sshd[1666]: Accepted publickey for core from 10.0.0.1 port 44118 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:27:48.971218 sshd-session[1666]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:27:48.973732 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 00:27:48.975671 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 13 00:27:48.976954 systemd[1]: Startup finished in 3.500s (kernel) + 6.430s (initrd) + 7.891s (userspace) = 17.822s. Sep 13 00:27:48.989762 (kubelet)[1673]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 13 00:27:48.990536 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 13 00:27:48.993022 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 13 00:27:49.006481 systemd-logind[1546]: New session 1 of user core. Sep 13 00:27:49.029682 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 13 00:27:49.035745 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 13 00:27:49.053915 (systemd)[1680]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 13 00:27:49.057542 systemd-logind[1546]: New session c1 of user core. Sep 13 00:27:49.260187 systemd[1680]: Queued start job for default target default.target. Sep 13 00:27:49.280790 systemd[1680]: Created slice app.slice - User Application Slice. Sep 13 00:27:49.280819 systemd[1680]: Reached target paths.target - Paths. Sep 13 00:27:49.280866 systemd[1680]: Reached target timers.target - Timers. Sep 13 00:27:49.282914 systemd[1680]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 13 00:27:49.297303 systemd[1680]: Listening on dbus.socket - D-Bus User Message Bus Socket. 
Sep 13 00:27:49.297497 systemd[1680]: Reached target sockets.target - Sockets. Sep 13 00:27:49.297545 systemd[1680]: Reached target basic.target - Basic System. Sep 13 00:27:49.297589 systemd[1680]: Reached target default.target - Main User Target. Sep 13 00:27:49.297629 systemd[1680]: Startup finished in 212ms. Sep 13 00:27:49.298349 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 13 00:27:49.305501 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 13 00:27:49.376038 systemd[1]: Started sshd@1-10.0.0.95:22-10.0.0.1:44126.service - OpenSSH per-connection server daemon (10.0.0.1:44126). Sep 13 00:27:49.435176 sshd[1696]: Accepted publickey for core from 10.0.0.1 port 44126 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:27:49.437094 sshd-session[1696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:27:49.442114 systemd-logind[1546]: New session 2 of user core. Sep 13 00:27:49.483702 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 13 00:27:49.542123 sshd[1698]: Connection closed by 10.0.0.1 port 44126 Sep 13 00:27:49.542811 sshd-session[1696]: pam_unix(sshd:session): session closed for user core Sep 13 00:27:49.554438 systemd[1]: sshd@1-10.0.0.95:22-10.0.0.1:44126.service: Deactivated successfully. Sep 13 00:27:49.556097 systemd[1]: session-2.scope: Deactivated successfully. Sep 13 00:27:49.556836 systemd-logind[1546]: Session 2 logged out. Waiting for processes to exit. Sep 13 00:27:49.559863 systemd[1]: Started sshd@2-10.0.0.95:22-10.0.0.1:44138.service - OpenSSH per-connection server daemon (10.0.0.1:44138). Sep 13 00:27:49.560601 systemd-logind[1546]: Removed session 2. 
Sep 13 00:27:49.610708 sshd[1705]: Accepted publickey for core from 10.0.0.1 port 44138 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:27:49.612478 sshd-session[1705]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:27:49.618735 systemd-logind[1546]: New session 3 of user core. Sep 13 00:27:49.634620 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 13 00:27:49.688131 sshd[1707]: Connection closed by 10.0.0.1 port 44138 Sep 13 00:27:49.687601 sshd-session[1705]: pam_unix(sshd:session): session closed for user core Sep 13 00:27:49.701523 systemd[1]: sshd@2-10.0.0.95:22-10.0.0.1:44138.service: Deactivated successfully. Sep 13 00:27:49.703998 systemd[1]: session-3.scope: Deactivated successfully. Sep 13 00:27:49.704905 systemd-logind[1546]: Session 3 logged out. Waiting for processes to exit. Sep 13 00:27:49.707628 kubelet[1673]: E0913 00:27:49.707547 1673 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 13 00:27:49.709606 systemd[1]: Started sshd@3-10.0.0.95:22-10.0.0.1:44162.service - OpenSSH per-connection server daemon (10.0.0.1:44162). Sep 13 00:27:49.710211 systemd-logind[1546]: Removed session 3. Sep 13 00:27:49.712560 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 13 00:27:49.712790 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 13 00:27:49.713243 systemd[1]: kubelet.service: Consumed 3.139s CPU time, 267.4M memory peak. 
Sep 13 00:27:49.756372 sshd[1713]: Accepted publickey for core from 10.0.0.1 port 44162 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:27:49.758374 sshd-session[1713]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:27:49.763837 systemd-logind[1546]: New session 4 of user core. Sep 13 00:27:49.773781 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 13 00:27:49.829031 sshd[1716]: Connection closed by 10.0.0.1 port 44162 Sep 13 00:27:49.829148 sshd-session[1713]: pam_unix(sshd:session): session closed for user core Sep 13 00:27:49.842832 systemd[1]: sshd@3-10.0.0.95:22-10.0.0.1:44162.service: Deactivated successfully. Sep 13 00:27:49.844833 systemd[1]: session-4.scope: Deactivated successfully. Sep 13 00:27:49.845757 systemd-logind[1546]: Session 4 logged out. Waiting for processes to exit. Sep 13 00:27:49.849319 systemd[1]: Started sshd@4-10.0.0.95:22-10.0.0.1:44178.service - OpenSSH per-connection server daemon (10.0.0.1:44178). Sep 13 00:27:49.850194 systemd-logind[1546]: Removed session 4. Sep 13 00:27:49.906088 sshd[1722]: Accepted publickey for core from 10.0.0.1 port 44178 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:27:49.907844 sshd-session[1722]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:27:49.912949 systemd-logind[1546]: New session 5 of user core. Sep 13 00:27:49.922497 systemd[1]: Started session-5.scope - Session 5 of User core. 
Sep 13 00:27:49.983603 sudo[1725]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 13 00:27:49.983973 sudo[1725]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 13 00:27:50.009457 sudo[1725]: pam_unix(sudo:session): session closed for user root Sep 13 00:27:50.011433 sshd[1724]: Connection closed by 10.0.0.1 port 44178 Sep 13 00:27:50.011779 sshd-session[1722]: pam_unix(sshd:session): session closed for user core Sep 13 00:27:50.025890 systemd[1]: sshd@4-10.0.0.95:22-10.0.0.1:44178.service: Deactivated successfully. Sep 13 00:27:50.028122 systemd[1]: session-5.scope: Deactivated successfully. Sep 13 00:27:50.028957 systemd-logind[1546]: Session 5 logged out. Waiting for processes to exit. Sep 13 00:27:50.032538 systemd[1]: Started sshd@5-10.0.0.95:22-10.0.0.1:50170.service - OpenSSH per-connection server daemon (10.0.0.1:50170). Sep 13 00:27:50.033157 systemd-logind[1546]: Removed session 5. Sep 13 00:27:50.082461 sshd[1731]: Accepted publickey for core from 10.0.0.1 port 50170 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:27:50.084140 sshd-session[1731]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:27:50.089623 systemd-logind[1546]: New session 6 of user core. Sep 13 00:27:50.103587 systemd[1]: Started session-6.scope - Session 6 of User core. 
Sep 13 00:27:50.159949 sudo[1735]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 13 00:27:50.160309 sudo[1735]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 13 00:27:50.651589 sudo[1735]: pam_unix(sudo:session): session closed for user root Sep 13 00:27:50.659866 sudo[1734]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 13 00:27:50.660299 sudo[1734]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 13 00:27:50.672536 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 13 00:27:50.726534 augenrules[1757]: No rules Sep 13 00:27:50.728301 systemd[1]: audit-rules.service: Deactivated successfully. Sep 13 00:27:50.728680 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 13 00:27:50.730057 sudo[1734]: pam_unix(sudo:session): session closed for user root Sep 13 00:27:50.731853 sshd[1733]: Connection closed by 10.0.0.1 port 50170 Sep 13 00:27:50.732189 sshd-session[1731]: pam_unix(sshd:session): session closed for user core Sep 13 00:27:50.741444 systemd[1]: sshd@5-10.0.0.95:22-10.0.0.1:50170.service: Deactivated successfully. Sep 13 00:27:50.743499 systemd[1]: session-6.scope: Deactivated successfully. Sep 13 00:27:50.744378 systemd-logind[1546]: Session 6 logged out. Waiting for processes to exit. Sep 13 00:27:50.747704 systemd[1]: Started sshd@6-10.0.0.95:22-10.0.0.1:50182.service - OpenSSH per-connection server daemon (10.0.0.1:50182). Sep 13 00:27:50.748471 systemd-logind[1546]: Removed session 6. Sep 13 00:27:50.790026 sshd[1766]: Accepted publickey for core from 10.0.0.1 port 50182 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:27:50.791613 sshd-session[1766]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:27:50.796187 systemd-logind[1546]: New session 7 of user core. 
Sep 13 00:27:50.805459 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 13 00:27:50.857531 sudo[1769]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 13 00:27:50.857862 sudo[1769]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 13 00:27:51.536055 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 13 00:27:51.565897 (dockerd)[1789]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 13 00:27:52.161851 dockerd[1789]: time="2025-09-13T00:27:52.161764738Z" level=info msg="Starting up" Sep 13 00:27:52.162813 dockerd[1789]: time="2025-09-13T00:27:52.162782887Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 13 00:27:52.286256 dockerd[1789]: time="2025-09-13T00:27:52.286178021Z" level=info msg="Loading containers: start." Sep 13 00:27:52.298375 kernel: Initializing XFRM netlink socket Sep 13 00:27:52.575882 systemd-networkd[1487]: docker0: Link UP Sep 13 00:27:52.582492 dockerd[1789]: time="2025-09-13T00:27:52.582405860Z" level=info msg="Loading containers: done." Sep 13 00:27:52.602426 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2710925147-merged.mount: Deactivated successfully. 
Sep 13 00:27:52.665027 dockerd[1789]: time="2025-09-13T00:27:52.664941994Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 13 00:27:52.665455 dockerd[1789]: time="2025-09-13T00:27:52.665399372Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Sep 13 00:27:52.665805 dockerd[1789]: time="2025-09-13T00:27:52.665761622Z" level=info msg="Initializing buildkit" Sep 13 00:27:52.709908 dockerd[1789]: time="2025-09-13T00:27:52.709718883Z" level=info msg="Completed buildkit initialization" Sep 13 00:27:52.719356 dockerd[1789]: time="2025-09-13T00:27:52.719288453Z" level=info msg="Daemon has completed initialization" Sep 13 00:27:52.719504 dockerd[1789]: time="2025-09-13T00:27:52.719384864Z" level=info msg="API listen on /run/docker.sock" Sep 13 00:27:52.719608 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 13 00:27:53.824186 containerd[1571]: time="2025-09-13T00:27:53.824116424Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\"" Sep 13 00:27:54.706468 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3875465063.mount: Deactivated successfully. 
Sep 13 00:27:55.753880 containerd[1571]: time="2025-09-13T00:27:55.753802674Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:27:55.754996 containerd[1571]: time="2025-09-13T00:27:55.754954294Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=28117124"
Sep 13 00:27:55.756413 containerd[1571]: time="2025-09-13T00:27:55.756360141Z" level=info msg="ImageCreate event name:\"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:27:55.762080 containerd[1571]: time="2025-09-13T00:27:55.761998237Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:27:55.763190 containerd[1571]: time="2025-09-13T00:27:55.763109511Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"28113723\" in 1.938940789s"
Sep 13 00:27:55.763190 containerd[1571]: time="2025-09-13T00:27:55.763177949Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\""
Sep 13 00:27:55.764138 containerd[1571]: time="2025-09-13T00:27:55.764033214Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\""
Sep 13 00:27:57.330177 containerd[1571]: time="2025-09-13T00:27:57.330048721Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:27:57.330957 containerd[1571]: time="2025-09-13T00:27:57.330893035Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=24716632"
Sep 13 00:27:57.332001 containerd[1571]: time="2025-09-13T00:27:57.331954967Z" level=info msg="ImageCreate event name:\"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:27:57.334803 containerd[1571]: time="2025-09-13T00:27:57.334747845Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:27:57.335904 containerd[1571]: time="2025-09-13T00:27:57.335847819Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"26351311\" in 1.57177968s"
Sep 13 00:27:57.335904 containerd[1571]: time="2025-09-13T00:27:57.335886611Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\""
Sep 13 00:27:57.336703 containerd[1571]: time="2025-09-13T00:27:57.336525981Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\""
Sep 13 00:27:58.746049 containerd[1571]: time="2025-09-13T00:27:58.745959174Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:27:58.747838 containerd[1571]: time="2025-09-13T00:27:58.747768699Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=18787698"
Sep 13 00:27:58.749365 containerd[1571]: time="2025-09-13T00:27:58.749312925Z" level=info msg="ImageCreate event name:\"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:27:58.752349 containerd[1571]: time="2025-09-13T00:27:58.752295169Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:27:58.753304 containerd[1571]: time="2025-09-13T00:27:58.753251923Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"20422395\" in 1.416686188s"
Sep 13 00:27:58.753363 containerd[1571]: time="2025-09-13T00:27:58.753304793Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\""
Sep 13 00:27:58.754039 containerd[1571]: time="2025-09-13T00:27:58.753806744Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\""
Sep 13 00:27:59.927046 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 13 00:27:59.928964 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 00:28:00.149457 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2113416022.mount: Deactivated successfully.
Sep 13 00:28:00.240672 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 00:28:00.246958 (kubelet)[2076]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 13 00:28:00.643536 kubelet[2076]: E0913 00:28:00.643176 2076 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 13 00:28:00.771478 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 13 00:28:00.771736 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 13 00:28:00.772642 systemd[1]: kubelet.service: Consumed 370ms CPU time, 110.7M memory peak.
Sep 13 00:28:02.858293 containerd[1571]: time="2025-09-13T00:28:02.858023231Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:02.860014 containerd[1571]: time="2025-09-13T00:28:02.859925028Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=30410252"
Sep 13 00:28:02.862446 containerd[1571]: time="2025-09-13T00:28:02.862367350Z" level=info msg="ImageCreate event name:\"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:02.865324 containerd[1571]: time="2025-09-13T00:28:02.865226262Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:02.866179 containerd[1571]: time="2025-09-13T00:28:02.866105732Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"30409271\" in 4.112221121s"
Sep 13 00:28:02.866179 containerd[1571]: time="2025-09-13T00:28:02.866167998Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\""
Sep 13 00:28:02.867216 containerd[1571]: time="2025-09-13T00:28:02.866890213Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 13 00:28:03.756940 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2121891263.mount: Deactivated successfully.
Sep 13 00:28:05.300937 containerd[1571]: time="2025-09-13T00:28:05.300844446Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:05.302468 containerd[1571]: time="2025-09-13T00:28:05.302359487Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241"
Sep 13 00:28:05.304146 containerd[1571]: time="2025-09-13T00:28:05.304101986Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:05.307257 containerd[1571]: time="2025-09-13T00:28:05.307209404Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:05.308325 containerd[1571]: time="2025-09-13T00:28:05.308254535Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 2.441320449s"
Sep 13 00:28:05.308325 containerd[1571]: time="2025-09-13T00:28:05.308312674Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 13 00:28:05.309176 containerd[1571]: time="2025-09-13T00:28:05.309147560Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 13 00:28:05.845964 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2098796915.mount: Deactivated successfully.
Sep 13 00:28:05.855721 containerd[1571]: time="2025-09-13T00:28:05.855666463Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 13 00:28:05.856618 containerd[1571]: time="2025-09-13T00:28:05.856554709Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Sep 13 00:28:05.857914 containerd[1571]: time="2025-09-13T00:28:05.857857242Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 13 00:28:05.860442 containerd[1571]: time="2025-09-13T00:28:05.860405222Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 13 00:28:05.861303 containerd[1571]: time="2025-09-13T00:28:05.861248904Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 552.058424ms"
Sep 13 00:28:05.861361 containerd[1571]: time="2025-09-13T00:28:05.861300030Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 13 00:28:05.862095 containerd[1571]: time="2025-09-13T00:28:05.861889886Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 13 00:28:06.492243 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1486939335.mount: Deactivated successfully.
Sep 13 00:28:08.842097 containerd[1571]: time="2025-09-13T00:28:08.842032532Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:08.842973 containerd[1571]: time="2025-09-13T00:28:08.842945675Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910709"
Sep 13 00:28:08.844487 containerd[1571]: time="2025-09-13T00:28:08.844453423Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:08.847417 containerd[1571]: time="2025-09-13T00:28:08.847372749Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:08.848717 containerd[1571]: time="2025-09-13T00:28:08.848640146Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.986712328s"
Sep 13 00:28:08.848717 containerd[1571]: time="2025-09-13T00:28:08.848701181Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Sep 13 00:28:10.927058 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 13 00:28:10.929038 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 00:28:11.247820 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 00:28:11.260703 (kubelet)[2230]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 13 00:28:11.398408 kubelet[2230]: E0913 00:28:11.398322 2230 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 13 00:28:11.403896 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 13 00:28:11.404128 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 13 00:28:11.404570 systemd[1]: kubelet.service: Consumed 451ms CPU time, 108.5M memory peak.
Sep 13 00:28:11.675525 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 00:28:11.675706 systemd[1]: kubelet.service: Consumed 451ms CPU time, 108.5M memory peak.
Sep 13 00:28:11.678794 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 00:28:11.713703 systemd[1]: Reload requested from client PID 2246 ('systemctl') (unit session-7.scope)...
Sep 13 00:28:11.713728 systemd[1]: Reloading...
Sep 13 00:28:11.805389 zram_generator::config[2295]: No configuration found.
Sep 13 00:28:12.331622 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 13 00:28:12.481869 systemd[1]: Reloading finished in 767 ms.
Sep 13 00:28:12.553006 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 13 00:28:12.553104 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 13 00:28:12.553401 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 00:28:12.553442 systemd[1]: kubelet.service: Consumed 175ms CPU time, 98.2M memory peak.
Sep 13 00:28:12.555103 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 13 00:28:12.732390 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 13 00:28:12.742637 (kubelet)[2337]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 13 00:28:12.895147 kubelet[2337]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 13 00:28:12.895147 kubelet[2337]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 13 00:28:12.895147 kubelet[2337]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 13 00:28:12.895147 kubelet[2337]: I0913 00:28:12.894388 2337 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 13 00:28:13.181129 kubelet[2337]: I0913 00:28:13.181077 2337 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 13 00:28:13.181129 kubelet[2337]: I0913 00:28:13.181114 2337 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 13 00:28:13.181423 kubelet[2337]: I0913 00:28:13.181402 2337 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 13 00:28:13.324094 kubelet[2337]: E0913 00:28:13.324035 2337 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.95:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.95:6443: connect: connection refused" logger="UnhandledError"
Sep 13 00:28:13.326136 kubelet[2337]: I0913 00:28:13.326102 2337 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 13 00:28:13.335379 kubelet[2337]: I0913 00:28:13.335314 2337 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 13 00:28:13.341906 kubelet[2337]: I0913 00:28:13.341881 2337 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 13 00:28:13.342499 kubelet[2337]: I0913 00:28:13.342473 2337 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 13 00:28:13.342661 kubelet[2337]: I0913 00:28:13.342623 2337 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 13 00:28:13.342884 kubelet[2337]: I0913 00:28:13.342653 2337 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 13 00:28:13.342996 kubelet[2337]: I0913 00:28:13.342891 2337 topology_manager.go:138] "Creating topology manager with none policy"
Sep 13 00:28:13.342996 kubelet[2337]: I0913 00:28:13.342901 2337 container_manager_linux.go:300] "Creating device plugin manager"
Sep 13 00:28:13.343093 kubelet[2337]: I0913 00:28:13.343077 2337 state_mem.go:36] "Initialized new in-memory state store"
Sep 13 00:28:13.348255 kubelet[2337]: I0913 00:28:13.348211 2337 kubelet.go:408] "Attempting to sync node with API server"
Sep 13 00:28:13.348255 kubelet[2337]: I0913 00:28:13.348256 2337 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 13 00:28:13.348397 kubelet[2337]: I0913 00:28:13.348313 2337 kubelet.go:314] "Adding apiserver pod source"
Sep 13 00:28:13.348397 kubelet[2337]: I0913 00:28:13.348375 2337 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 13 00:28:13.352635 kubelet[2337]: I0913 00:28:13.352601 2337 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Sep 13 00:28:13.353626 kubelet[2337]: I0913 00:28:13.353094 2337 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 13 00:28:13.353626 kubelet[2337]: W0913 00:28:13.353179 2337 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 13 00:28:13.353736 kubelet[2337]: W0913 00:28:13.353599 2337 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.95:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.95:6443: connect: connection refused
Sep 13 00:28:13.353736 kubelet[2337]: E0913 00:28:13.353684 2337 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.95:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.95:6443: connect: connection refused" logger="UnhandledError"
Sep 13 00:28:13.354016 kubelet[2337]: W0913 00:28:13.353966 2337 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.95:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.95:6443: connect: connection refused
Sep 13 00:28:13.354102 kubelet[2337]: E0913 00:28:13.354019 2337 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.95:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.95:6443: connect: connection refused" logger="UnhandledError"
Sep 13 00:28:13.355896 kubelet[2337]: I0913 00:28:13.355864 2337 server.go:1274] "Started kubelet"
Sep 13 00:28:13.356056 kubelet[2337]: I0913 00:28:13.355945 2337 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 13 00:28:13.357237 kubelet[2337]: I0913 00:28:13.356629 2337 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 13 00:28:13.357237 kubelet[2337]: I0913 00:28:13.356986 2337 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 13 00:28:13.358379 kubelet[2337]: I0913 00:28:13.357923 2337 server.go:449] "Adding debug handlers to kubelet server"
Sep 13 00:28:13.358379 kubelet[2337]: I0913 00:28:13.357991 2337 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 13 00:28:13.360436 kubelet[2337]: I0913 00:28:13.359703 2337 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 13 00:28:13.363586 kubelet[2337]: E0913 00:28:13.362617 2337 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 13 00:28:13.365554 kubelet[2337]: E0913 00:28:13.365505 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:13.365679 kubelet[2337]: I0913 00:28:13.365583 2337 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 13 00:28:13.366174 kubelet[2337]: I0913 00:28:13.366159 2337 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 13 00:28:13.366982 kubelet[2337]: W0913 00:28:13.366929 2337 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.95:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.95:6443: connect: connection refused
Sep 13 00:28:13.367045 kubelet[2337]: E0913 00:28:13.366993 2337 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.95:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.95:6443: connect: connection refused" logger="UnhandledError"
Sep 13 00:28:13.367045 kubelet[2337]: I0913 00:28:13.367032 2337 reconciler.go:26] "Reconciler: start to sync state"
Sep 13 00:28:13.367045 kubelet[2337]: E0913 00:28:13.364847 2337 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.95:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.95:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1864b005c1e77df1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-13 00:28:13.355834865 +0000 UTC m=+0.605368890,LastTimestamp:2025-09-13 00:28:13.355834865 +0000 UTC m=+0.605368890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Sep 13 00:28:13.367169 kubelet[2337]: E0913 00:28:13.367056 2337 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.95:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.95:6443: connect: connection refused" interval="200ms"
Sep 13 00:28:13.368660 kubelet[2337]: I0913 00:28:13.368620 2337 factory.go:221] Registration of the containerd container factory successfully
Sep 13 00:28:13.368660 kubelet[2337]: I0913 00:28:13.368641 2337 factory.go:221] Registration of the systemd container factory successfully
Sep 13 00:28:13.368744 kubelet[2337]: I0913 00:28:13.368720 2337 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 13 00:28:13.381100 kubelet[2337]: I0913 00:28:13.381067 2337 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 13 00:28:13.381100 kubelet[2337]: I0913 00:28:13.381087 2337 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 13 00:28:13.381219 kubelet[2337]: I0913 00:28:13.381108 2337 state_mem.go:36] "Initialized new in-memory state store"
Sep 13 00:28:13.388366 kubelet[2337]: I0913 00:28:13.388315 2337 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 13 00:28:13.389664 kubelet[2337]: I0913 00:28:13.389614 2337 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 13 00:28:13.389720 kubelet[2337]: I0913 00:28:13.389693 2337 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 13 00:28:13.389763 kubelet[2337]: I0913 00:28:13.389733 2337 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 13 00:28:13.389831 kubelet[2337]: E0913 00:28:13.389798 2337 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 13 00:28:13.466699 kubelet[2337]: E0913 00:28:13.466600 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:13.489925 kubelet[2337]: E0913 00:28:13.489883 2337 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 13 00:28:13.567228 kubelet[2337]: E0913 00:28:13.567161 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:13.567827 kubelet[2337]: E0913 00:28:13.567766 2337 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.95:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.95:6443: connect: connection refused" interval="400ms"
Sep 13 00:28:13.668268 kubelet[2337]: E0913 00:28:13.668198 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:13.690493 kubelet[2337]: E0913 00:28:13.690419 2337 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 13 00:28:13.768947 kubelet[2337]: E0913 00:28:13.768787 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:13.869623 kubelet[2337]: E0913 00:28:13.869568 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:13.968539 kubelet[2337]: E0913 00:28:13.968471 2337 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.95:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.95:6443: connect: connection refused" interval="800ms"
Sep 13 00:28:13.970521 kubelet[2337]: E0913 00:28:13.970489 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:14.071087 kubelet[2337]: E0913 00:28:14.070943 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:14.091194 kubelet[2337]: E0913 00:28:14.091156 2337 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 13 00:28:14.171819 kubelet[2337]: E0913 00:28:14.171756 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:14.272321 kubelet[2337]: E0913 00:28:14.272235 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:14.373008 kubelet[2337]: E0913 00:28:14.372850 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:14.440187 kubelet[2337]: W0913 00:28:14.440097 2337 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.95:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.95:6443: connect: connection refused
Sep 13 00:28:14.440187 kubelet[2337]: E0913 00:28:14.440184 2337 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.95:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.95:6443: connect: connection refused" logger="UnhandledError"
Sep 13 00:28:14.473761 kubelet[2337]: E0913 00:28:14.473701 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:14.493376 kubelet[2337]: W0913 00:28:14.493271 2337 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.95:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.95:6443: connect: connection refused
Sep 13 00:28:14.493376 kubelet[2337]: E0913 00:28:14.493314 2337 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.95:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.95:6443: connect: connection refused" logger="UnhandledError"
Sep 13 00:28:14.573938 kubelet[2337]: E0913 00:28:14.573880 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:14.674517 kubelet[2337]: E0913 00:28:14.674366 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:14.769292 kubelet[2337]: E0913 00:28:14.769191 2337 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.95:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.95:6443: connect: connection refused" interval="1.6s"
Sep 13 00:28:14.775351 kubelet[2337]: E0913 00:28:14.775294 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:14.876236 kubelet[2337]: E0913 00:28:14.876182 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:14.891439 kubelet[2337]: E0913 00:28:14.891384 2337 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 13 00:28:14.937106 kubelet[2337]: W0913 00:28:14.937041 2337 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.95:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.95:6443: connect: connection refused
Sep 13 00:28:14.937202 kubelet[2337]: E0913 00:28:14.937119 2337 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.95:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.95:6443: connect: connection refused" logger="UnhandledError"
Sep 13 00:28:14.976850 kubelet[2337]: E0913 00:28:14.976778 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Sep 13 00:28:15.012905 kubelet[2337]: W0913 00:28:15.012798 2337 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.95:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.95:6443: connect: connection refused
Sep 13 00:28:15.059067 kubelet[2337]: E0913 00:28:15.012912 2337 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.95:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.95:6443:
connect: connection refused" logger="UnhandledError" Sep 13 00:28:15.061442 kubelet[2337]: I0913 00:28:15.061394 2337 policy_none.go:49] "None policy: Start" Sep 13 00:28:15.062869 kubelet[2337]: I0913 00:28:15.062490 2337 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 13 00:28:15.062869 kubelet[2337]: I0913 00:28:15.062528 2337 state_mem.go:35] "Initializing new in-memory state store" Sep 13 00:28:15.077403 kubelet[2337]: E0913 00:28:15.077358 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:15.078868 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 13 00:28:15.093217 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 13 00:28:15.096853 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 13 00:28:15.104812 kubelet[2337]: I0913 00:28:15.104276 2337 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 13 00:28:15.104812 kubelet[2337]: I0913 00:28:15.104536 2337 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 13 00:28:15.104812 kubelet[2337]: I0913 00:28:15.104547 2337 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 13 00:28:15.104946 kubelet[2337]: I0913 00:28:15.104798 2337 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 13 00:28:15.106186 kubelet[2337]: E0913 00:28:15.106148 2337 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 13 00:28:15.206490 kubelet[2337]: I0913 00:28:15.205956 2337 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 13 00:28:15.206616 kubelet[2337]: E0913 00:28:15.206501 2337 kubelet_node_status.go:95] "Unable to register 
node with API server" err="Post \"https://10.0.0.95:6443/api/v1/nodes\": dial tcp 10.0.0.95:6443: connect: connection refused" node="localhost" Sep 13 00:28:15.408160 kubelet[2337]: I0913 00:28:15.408109 2337 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 13 00:28:15.408487 kubelet[2337]: E0913 00:28:15.408438 2337 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.95:6443/api/v1/nodes\": dial tcp 10.0.0.95:6443: connect: connection refused" node="localhost" Sep 13 00:28:15.425688 kubelet[2337]: E0913 00:28:15.425642 2337 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.95:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.95:6443: connect: connection refused" logger="UnhandledError" Sep 13 00:28:15.810272 kubelet[2337]: I0913 00:28:15.810231 2337 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 13 00:28:15.810807 kubelet[2337]: E0913 00:28:15.810746 2337 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.95:6443/api/v1/nodes\": dial tcp 10.0.0.95:6443: connect: connection refused" node="localhost" Sep 13 00:28:15.848526 kubelet[2337]: W0913 00:28:15.848439 2337 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.95:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.95:6443: connect: connection refused Sep 13 00:28:15.848526 kubelet[2337]: E0913 00:28:15.848516 2337 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.95:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 
10.0.0.95:6443: connect: connection refused" logger="UnhandledError" Sep 13 00:28:16.303484 kubelet[2337]: W0913 00:28:16.303366 2337 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.95:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.95:6443: connect: connection refused Sep 13 00:28:16.303484 kubelet[2337]: E0913 00:28:16.303442 2337 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.95:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.95:6443: connect: connection refused" logger="UnhandledError" Sep 13 00:28:16.347853 kubelet[2337]: W0913 00:28:16.347703 2337 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.95:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.95:6443: connect: connection refused Sep 13 00:28:16.347853 kubelet[2337]: E0913 00:28:16.347777 2337 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.95:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.95:6443: connect: connection refused" logger="UnhandledError" Sep 13 00:28:16.369853 kubelet[2337]: E0913 00:28:16.369739 2337 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.95:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.95:6443: connect: connection refused" interval="3.2s" Sep 13 00:28:16.505078 systemd[1]: Created slice kubepods-burstable-pod71d8bf7bd9b7c7432927bee9d50592b5.slice - libcontainer container kubepods-burstable-pod71d8bf7bd9b7c7432927bee9d50592b5.slice. 
Sep 13 00:28:16.536039 systemd[1]: Created slice kubepods-burstable-podfe5e332fba00ba0b5b33a25fe2e8fd7b.slice - libcontainer container kubepods-burstable-podfe5e332fba00ba0b5b33a25fe2e8fd7b.slice. Sep 13 00:28:16.540905 systemd[1]: Created slice kubepods-burstable-pod9982cd9780527eec9f0ce7118aada200.slice - libcontainer container kubepods-burstable-pod9982cd9780527eec9f0ce7118aada200.slice. Sep 13 00:28:16.586374 kubelet[2337]: I0913 00:28:16.586184 2337 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9982cd9780527eec9f0ce7118aada200-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"9982cd9780527eec9f0ce7118aada200\") " pod="kube-system/kube-apiserver-localhost" Sep 13 00:28:16.586374 kubelet[2337]: I0913 00:28:16.586244 2337 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9982cd9780527eec9f0ce7118aada200-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"9982cd9780527eec9f0ce7118aada200\") " pod="kube-system/kube-apiserver-localhost" Sep 13 00:28:16.586374 kubelet[2337]: I0913 00:28:16.586268 2337 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9982cd9780527eec9f0ce7118aada200-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"9982cd9780527eec9f0ce7118aada200\") " pod="kube-system/kube-apiserver-localhost" Sep 13 00:28:16.586374 kubelet[2337]: I0913 00:28:16.586298 2337 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 13 00:28:16.586374 
kubelet[2337]: I0913 00:28:16.586370 2337 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fe5e332fba00ba0b5b33a25fe2e8fd7b-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"fe5e332fba00ba0b5b33a25fe2e8fd7b\") " pod="kube-system/kube-scheduler-localhost" Sep 13 00:28:16.586652 kubelet[2337]: I0913 00:28:16.586439 2337 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 13 00:28:16.586652 kubelet[2337]: I0913 00:28:16.586523 2337 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 13 00:28:16.586652 kubelet[2337]: I0913 00:28:16.586591 2337 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 13 00:28:16.586652 kubelet[2337]: I0913 00:28:16.586634 2337 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" 
Sep 13 00:28:16.613045 kubelet[2337]: I0913 00:28:16.612925 2337 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 13 00:28:16.613545 kubelet[2337]: E0913 00:28:16.613488 2337 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.95:6443/api/v1/nodes\": dial tcp 10.0.0.95:6443: connect: connection refused" node="localhost" Sep 13 00:28:16.833240 containerd[1571]: time="2025-09-13T00:28:16.833180504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:71d8bf7bd9b7c7432927bee9d50592b5,Namespace:kube-system,Attempt:0,}" Sep 13 00:28:16.840062 containerd[1571]: time="2025-09-13T00:28:16.839253025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:fe5e332fba00ba0b5b33a25fe2e8fd7b,Namespace:kube-system,Attempt:0,}" Sep 13 00:28:16.844189 containerd[1571]: time="2025-09-13T00:28:16.844059683Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:9982cd9780527eec9f0ce7118aada200,Namespace:kube-system,Attempt:0,}" Sep 13 00:28:16.887252 containerd[1571]: time="2025-09-13T00:28:16.887198021Z" level=info msg="connecting to shim 742afbdda3b307a32fe8fe012c3c72b1e6084e336de476b95ac5edd93bbf5a1d" address="unix:///run/containerd/s/931599db2b8682944840d7b821b2069cff696630ac64f99aeab8f36956a8f529" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:28:16.920390 containerd[1571]: time="2025-09-13T00:28:16.910468315Z" level=info msg="connecting to shim 483b260b0d93301a4d8196e1ea2f85d31aabc1b24435637f45d49b9b2ebd9fae" address="unix:///run/containerd/s/aadb58b36275ca44e04f626faeb57c7fa36abbb05d9045efc127e6d79e5fdbe5" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:28:17.002306 containerd[1571]: time="2025-09-13T00:28:17.002234175Z" level=info msg="connecting to shim 848498cda0273f6545e747592cbff04e8ecd0a9362b06cb9e1918761c4408184" 
address="unix:///run/containerd/s/074213cf5bb42e54c579f8be3c124998dabb37a2fa0e01242b654025ecf1bfec" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:28:17.013649 systemd[1]: Started cri-containerd-483b260b0d93301a4d8196e1ea2f85d31aabc1b24435637f45d49b9b2ebd9fae.scope - libcontainer container 483b260b0d93301a4d8196e1ea2f85d31aabc1b24435637f45d49b9b2ebd9fae. Sep 13 00:28:17.015783 systemd[1]: Started cri-containerd-742afbdda3b307a32fe8fe012c3c72b1e6084e336de476b95ac5edd93bbf5a1d.scope - libcontainer container 742afbdda3b307a32fe8fe012c3c72b1e6084e336de476b95ac5edd93bbf5a1d. Sep 13 00:28:17.041482 systemd[1]: Started cri-containerd-848498cda0273f6545e747592cbff04e8ecd0a9362b06cb9e1918761c4408184.scope - libcontainer container 848498cda0273f6545e747592cbff04e8ecd0a9362b06cb9e1918761c4408184. Sep 13 00:28:17.219893 containerd[1571]: time="2025-09-13T00:28:17.219779429Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:fe5e332fba00ba0b5b33a25fe2e8fd7b,Namespace:kube-system,Attempt:0,} returns sandbox id \"483b260b0d93301a4d8196e1ea2f85d31aabc1b24435637f45d49b9b2ebd9fae\"" Sep 13 00:28:17.223682 containerd[1571]: time="2025-09-13T00:28:17.223611904Z" level=info msg="CreateContainer within sandbox \"483b260b0d93301a4d8196e1ea2f85d31aabc1b24435637f45d49b9b2ebd9fae\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 13 00:28:17.282102 containerd[1571]: time="2025-09-13T00:28:17.282032555Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:71d8bf7bd9b7c7432927bee9d50592b5,Namespace:kube-system,Attempt:0,} returns sandbox id \"742afbdda3b307a32fe8fe012c3c72b1e6084e336de476b95ac5edd93bbf5a1d\"" Sep 13 00:28:17.283773 containerd[1571]: time="2025-09-13T00:28:17.283730453Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:9982cd9780527eec9f0ce7118aada200,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"848498cda0273f6545e747592cbff04e8ecd0a9362b06cb9e1918761c4408184\"" Sep 13 00:28:17.285407 containerd[1571]: time="2025-09-13T00:28:17.285322528Z" level=info msg="CreateContainer within sandbox \"742afbdda3b307a32fe8fe012c3c72b1e6084e336de476b95ac5edd93bbf5a1d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 13 00:28:17.286314 containerd[1571]: time="2025-09-13T00:28:17.286288923Z" level=info msg="CreateContainer within sandbox \"848498cda0273f6545e747592cbff04e8ecd0a9362b06cb9e1918761c4408184\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 13 00:28:17.295112 containerd[1571]: time="2025-09-13T00:28:17.295053532Z" level=info msg="Container 17b0c124a0c41b6b437a8eddb464f34257f352668daee6005d13961d02485421: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:28:17.306067 containerd[1571]: time="2025-09-13T00:28:17.306010078Z" level=info msg="Container d11b67ae4765fe30a391eda87efaaf18216e2f05fbfb75d41b5373a0c46f9c34: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:28:17.309379 containerd[1571]: time="2025-09-13T00:28:17.309301785Z" level=info msg="Container aa39ba6f541f483c5f9f1b48a7c7285f084e0d48e0596b7cbeb52da8d3bb5f3f: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:28:17.312865 containerd[1571]: time="2025-09-13T00:28:17.312815087Z" level=info msg="CreateContainer within sandbox \"483b260b0d93301a4d8196e1ea2f85d31aabc1b24435637f45d49b9b2ebd9fae\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"17b0c124a0c41b6b437a8eddb464f34257f352668daee6005d13961d02485421\"" Sep 13 00:28:17.313520 containerd[1571]: time="2025-09-13T00:28:17.313478781Z" level=info msg="StartContainer for \"17b0c124a0c41b6b437a8eddb464f34257f352668daee6005d13961d02485421\"" Sep 13 00:28:17.314742 containerd[1571]: time="2025-09-13T00:28:17.314709492Z" level=info msg="connecting to shim 17b0c124a0c41b6b437a8eddb464f34257f352668daee6005d13961d02485421" 
address="unix:///run/containerd/s/aadb58b36275ca44e04f626faeb57c7fa36abbb05d9045efc127e6d79e5fdbe5" protocol=ttrpc version=3 Sep 13 00:28:17.316441 containerd[1571]: time="2025-09-13T00:28:17.316405227Z" level=info msg="CreateContainer within sandbox \"848498cda0273f6545e747592cbff04e8ecd0a9362b06cb9e1918761c4408184\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"d11b67ae4765fe30a391eda87efaaf18216e2f05fbfb75d41b5373a0c46f9c34\"" Sep 13 00:28:17.316750 containerd[1571]: time="2025-09-13T00:28:17.316727134Z" level=info msg="StartContainer for \"d11b67ae4765fe30a391eda87efaaf18216e2f05fbfb75d41b5373a0c46f9c34\"" Sep 13 00:28:17.317722 containerd[1571]: time="2025-09-13T00:28:17.317694771Z" level=info msg="connecting to shim d11b67ae4765fe30a391eda87efaaf18216e2f05fbfb75d41b5373a0c46f9c34" address="unix:///run/containerd/s/074213cf5bb42e54c579f8be3c124998dabb37a2fa0e01242b654025ecf1bfec" protocol=ttrpc version=3 Sep 13 00:28:17.324408 containerd[1571]: time="2025-09-13T00:28:17.322840065Z" level=info msg="CreateContainer within sandbox \"742afbdda3b307a32fe8fe012c3c72b1e6084e336de476b95ac5edd93bbf5a1d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"aa39ba6f541f483c5f9f1b48a7c7285f084e0d48e0596b7cbeb52da8d3bb5f3f\"" Sep 13 00:28:17.324408 containerd[1571]: time="2025-09-13T00:28:17.323255863Z" level=info msg="StartContainer for \"aa39ba6f541f483c5f9f1b48a7c7285f084e0d48e0596b7cbeb52da8d3bb5f3f\"" Sep 13 00:28:17.324535 containerd[1571]: time="2025-09-13T00:28:17.324512896Z" level=info msg="connecting to shim aa39ba6f541f483c5f9f1b48a7c7285f084e0d48e0596b7cbeb52da8d3bb5f3f" address="unix:///run/containerd/s/931599db2b8682944840d7b821b2069cff696630ac64f99aeab8f36956a8f529" protocol=ttrpc version=3 Sep 13 00:28:17.341483 systemd[1]: Started cri-containerd-17b0c124a0c41b6b437a8eddb464f34257f352668daee6005d13961d02485421.scope - libcontainer container 
17b0c124a0c41b6b437a8eddb464f34257f352668daee6005d13961d02485421. Sep 13 00:28:17.347119 systemd[1]: Started cri-containerd-aa39ba6f541f483c5f9f1b48a7c7285f084e0d48e0596b7cbeb52da8d3bb5f3f.scope - libcontainer container aa39ba6f541f483c5f9f1b48a7c7285f084e0d48e0596b7cbeb52da8d3bb5f3f. Sep 13 00:28:17.349092 systemd[1]: Started cri-containerd-d11b67ae4765fe30a391eda87efaaf18216e2f05fbfb75d41b5373a0c46f9c34.scope - libcontainer container d11b67ae4765fe30a391eda87efaaf18216e2f05fbfb75d41b5373a0c46f9c34. Sep 13 00:28:17.473418 containerd[1571]: time="2025-09-13T00:28:17.472741659Z" level=info msg="StartContainer for \"17b0c124a0c41b6b437a8eddb464f34257f352668daee6005d13961d02485421\" returns successfully" Sep 13 00:28:17.526134 containerd[1571]: time="2025-09-13T00:28:17.525695740Z" level=info msg="StartContainer for \"aa39ba6f541f483c5f9f1b48a7c7285f084e0d48e0596b7cbeb52da8d3bb5f3f\" returns successfully" Sep 13 00:28:17.526295 containerd[1571]: time="2025-09-13T00:28:17.526110666Z" level=info msg="StartContainer for \"d11b67ae4765fe30a391eda87efaaf18216e2f05fbfb75d41b5373a0c46f9c34\" returns successfully" Sep 13 00:28:17.545364 kubelet[2337]: W0913 00:28:17.543683 2337 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.95:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.95:6443: connect: connection refused Sep 13 00:28:17.545364 kubelet[2337]: E0913 00:28:17.543779 2337 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.95:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.95:6443: connect: connection refused" logger="UnhandledError" Sep 13 00:28:18.215150 kubelet[2337]: I0913 00:28:18.215095 2337 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 13 00:28:19.493840 
kubelet[2337]: I0913 00:28:19.493759 2337 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Sep 13 00:28:19.495488 kubelet[2337]: E0913 00:28:19.494132 2337 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 13 00:28:19.508328 kubelet[2337]: E0913 00:28:19.508266 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:19.608680 kubelet[2337]: E0913 00:28:19.608566 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:19.709267 kubelet[2337]: E0913 00:28:19.709205 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:19.809921 kubelet[2337]: E0913 00:28:19.809745 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:19.910183 kubelet[2337]: E0913 00:28:19.910139 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:20.011288 kubelet[2337]: E0913 00:28:20.011149 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:20.112473 kubelet[2337]: E0913 00:28:20.112296 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:20.213487 kubelet[2337]: E0913 00:28:20.213400 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:20.314426 kubelet[2337]: E0913 00:28:20.314367 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:20.415119 kubelet[2337]: E0913 00:28:20.414989 2337 kubelet_node_status.go:453] "Error getting the current 
node from lister" err="node \"localhost\" not found" Sep 13 00:28:20.516157 kubelet[2337]: E0913 00:28:20.516113 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:20.616895 kubelet[2337]: E0913 00:28:20.616833 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:20.717527 kubelet[2337]: E0913 00:28:20.717472 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:20.817774 kubelet[2337]: E0913 00:28:20.817714 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:20.918466 kubelet[2337]: E0913 00:28:20.918371 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:21.019151 kubelet[2337]: E0913 00:28:21.019000 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:21.119728 kubelet[2337]: E0913 00:28:21.119647 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:21.220800 kubelet[2337]: E0913 00:28:21.220650 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:21.322018 kubelet[2337]: E0913 00:28:21.321828 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:21.422443 kubelet[2337]: E0913 00:28:21.422390 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:21.522844 kubelet[2337]: E0913 00:28:21.522778 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:21.623709 kubelet[2337]: E0913 
00:28:21.623527 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:21.724631 kubelet[2337]: E0913 00:28:21.724535 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:21.817266 systemd[1]: Reload requested from client PID 2610 ('systemctl') (unit session-7.scope)... Sep 13 00:28:21.817286 systemd[1]: Reloading... Sep 13 00:28:21.825526 kubelet[2337]: E0913 00:28:21.825464 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:21.927434 kubelet[2337]: E0913 00:28:21.925972 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:21.954372 zram_generator::config[2656]: No configuration found. Sep 13 00:28:22.026995 kubelet[2337]: E0913 00:28:22.026944 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:22.058198 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 13 00:28:22.127567 kubelet[2337]: E0913 00:28:22.127511 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:22.211085 systemd[1]: Reloading finished in 393 ms. Sep 13 00:28:22.228902 kubelet[2337]: E0913 00:28:22.228844 2337 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 13 00:28:22.241854 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 00:28:22.257277 systemd[1]: kubelet.service: Deactivated successfully. Sep 13 00:28:22.257635 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 13 00:28:22.257706 systemd[1]: kubelet.service: Consumed 1.123s CPU time, 131.1M memory peak. Sep 13 00:28:22.260050 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 00:28:22.515776 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 00:28:22.520373 (kubelet)[2698]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 13 00:28:22.565664 kubelet[2698]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 13 00:28:22.565664 kubelet[2698]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 13 00:28:22.565664 kubelet[2698]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 13 00:28:22.566123 kubelet[2698]: I0913 00:28:22.565720 2698 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 13 00:28:22.572669 kubelet[2698]: I0913 00:28:22.572632 2698 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 13 00:28:22.572669 kubelet[2698]: I0913 00:28:22.572654 2698 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 13 00:28:22.580320 kubelet[2698]: I0913 00:28:22.572869 2698 server.go:934] "Client rotation is on, will bootstrap in background" Sep 13 00:28:22.580891 kubelet[2698]: I0913 00:28:22.580863 2698 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 13 00:28:22.582868 kubelet[2698]: I0913 00:28:22.582831 2698 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 13 00:28:22.586715 kubelet[2698]: I0913 00:28:22.586671 2698 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 13 00:28:22.591818 kubelet[2698]: I0913 00:28:22.591796 2698 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 13 00:28:22.591947 kubelet[2698]: I0913 00:28:22.591933 2698 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 13 00:28:22.592106 kubelet[2698]: I0913 00:28:22.592085 2698 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 13 00:28:22.592288 kubelet[2698]: I0913 00:28:22.592106 2698 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagef
s.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 13 00:28:22.592288 kubelet[2698]: I0913 00:28:22.592288 2698 topology_manager.go:138] "Creating topology manager with none policy" Sep 13 00:28:22.592475 kubelet[2698]: I0913 00:28:22.592328 2698 container_manager_linux.go:300] "Creating device plugin manager" Sep 13 00:28:22.592475 kubelet[2698]: I0913 00:28:22.592383 2698 state_mem.go:36] "Initialized new in-memory state store" Sep 13 00:28:22.592536 kubelet[2698]: I0913 00:28:22.592519 2698 kubelet.go:408] "Attempting to sync node with API server" Sep 13 00:28:22.592536 kubelet[2698]: I0913 00:28:22.592534 2698 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 13 00:28:22.592602 kubelet[2698]: I0913 00:28:22.592575 2698 kubelet.go:314] "Adding apiserver pod source" Sep 13 00:28:22.592602 kubelet[2698]: I0913 00:28:22.592589 2698 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 13 00:28:22.594604 kubelet[2698]: I0913 00:28:22.594577 2698 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Sep 13 00:28:22.595088 kubelet[2698]: I0913 00:28:22.595069 2698 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 13 00:28:22.596764 kubelet[2698]: I0913 00:28:22.596741 2698 server.go:1274] "Started kubelet" Sep 13 00:28:22.601816 kubelet[2698]: I0913 00:28:22.601616 2698 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 13 
00:28:22.605261 kubelet[2698]: I0913 00:28:22.605220 2698 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 13 00:28:22.605810 kubelet[2698]: I0913 00:28:22.605772 2698 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 13 00:28:22.606079 kubelet[2698]: I0913 00:28:22.606010 2698 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 13 00:28:22.606313 kubelet[2698]: I0913 00:28:22.606288 2698 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 13 00:28:22.606408 kubelet[2698]: I0913 00:28:22.606325 2698 server.go:449] "Adding debug handlers to kubelet server" Sep 13 00:28:22.606974 kubelet[2698]: I0913 00:28:22.606470 2698 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 13 00:28:22.606974 kubelet[2698]: I0913 00:28:22.606579 2698 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 13 00:28:22.606974 kubelet[2698]: I0913 00:28:22.606745 2698 reconciler.go:26] "Reconciler: start to sync state" Sep 13 00:28:22.607316 kubelet[2698]: I0913 00:28:22.607191 2698 factory.go:221] Registration of the systemd container factory successfully Sep 13 00:28:22.607316 kubelet[2698]: I0913 00:28:22.607290 2698 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 13 00:28:22.607515 kubelet[2698]: E0913 00:28:22.607495 2698 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 13 00:28:22.609500 kubelet[2698]: I0913 00:28:22.609478 2698 factory.go:221] Registration of the containerd container factory successfully Sep 13 00:28:22.616007 kubelet[2698]: I0913 00:28:22.615951 2698 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 13 00:28:22.617593 kubelet[2698]: I0913 00:28:22.617577 2698 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 13 00:28:22.617713 kubelet[2698]: I0913 00:28:22.617669 2698 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 13 00:28:22.618553 kubelet[2698]: I0913 00:28:22.618068 2698 kubelet.go:2321] "Starting kubelet main sync loop" Sep 13 00:28:22.618553 kubelet[2698]: E0913 00:28:22.618154 2698 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 13 00:28:22.647082 kubelet[2698]: I0913 00:28:22.647042 2698 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 13 00:28:22.647082 kubelet[2698]: I0913 00:28:22.647060 2698 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 13 00:28:22.647082 kubelet[2698]: I0913 00:28:22.647077 2698 state_mem.go:36] "Initialized new in-memory state store" Sep 13 00:28:22.647308 kubelet[2698]: I0913 00:28:22.647230 2698 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 13 00:28:22.647308 kubelet[2698]: I0913 00:28:22.647240 2698 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 13 00:28:22.647308 kubelet[2698]: I0913 00:28:22.647259 2698 policy_none.go:49] "None policy: Start" Sep 13 00:28:22.647931 kubelet[2698]: I0913 00:28:22.647895 2698 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 13 00:28:22.647931 kubelet[2698]: I0913 00:28:22.647931 2698 state_mem.go:35] "Initializing new in-memory state store" Sep 13 00:28:22.648154 kubelet[2698]: 
I0913 00:28:22.648125 2698 state_mem.go:75] "Updated machine memory state" Sep 13 00:28:22.653364 kubelet[2698]: I0913 00:28:22.652963 2698 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 13 00:28:22.653364 kubelet[2698]: I0913 00:28:22.653151 2698 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 13 00:28:22.653364 kubelet[2698]: I0913 00:28:22.653164 2698 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 13 00:28:22.653456 kubelet[2698]: I0913 00:28:22.653432 2698 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 13 00:28:22.757692 kubelet[2698]: I0913 00:28:22.757615 2698 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 13 00:28:22.808847 kubelet[2698]: I0913 00:28:22.808067 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9982cd9780527eec9f0ce7118aada200-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"9982cd9780527eec9f0ce7118aada200\") " pod="kube-system/kube-apiserver-localhost" Sep 13 00:28:22.808847 kubelet[2698]: I0913 00:28:22.808125 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 13 00:28:22.808847 kubelet[2698]: I0913 00:28:22.808146 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " 
pod="kube-system/kube-controller-manager-localhost" Sep 13 00:28:22.808847 kubelet[2698]: I0913 00:28:22.808175 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9982cd9780527eec9f0ce7118aada200-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"9982cd9780527eec9f0ce7118aada200\") " pod="kube-system/kube-apiserver-localhost" Sep 13 00:28:22.808847 kubelet[2698]: I0913 00:28:22.808192 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9982cd9780527eec9f0ce7118aada200-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"9982cd9780527eec9f0ce7118aada200\") " pod="kube-system/kube-apiserver-localhost" Sep 13 00:28:22.809243 kubelet[2698]: I0913 00:28:22.808250 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 13 00:28:22.809243 kubelet[2698]: I0913 00:28:22.808294 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " pod="kube-system/kube-controller-manager-localhost" Sep 13 00:28:22.809243 kubelet[2698]: I0913 00:28:22.808319 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/71d8bf7bd9b7c7432927bee9d50592b5-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"71d8bf7bd9b7c7432927bee9d50592b5\") " 
pod="kube-system/kube-controller-manager-localhost" Sep 13 00:28:22.809243 kubelet[2698]: I0913 00:28:22.808367 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fe5e332fba00ba0b5b33a25fe2e8fd7b-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"fe5e332fba00ba0b5b33a25fe2e8fd7b\") " pod="kube-system/kube-scheduler-localhost" Sep 13 00:28:22.978101 kubelet[2698]: I0913 00:28:22.977556 2698 kubelet_node_status.go:111] "Node was previously registered" node="localhost" Sep 13 00:28:22.978714 kubelet[2698]: I0913 00:28:22.978633 2698 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Sep 13 00:28:23.593928 kubelet[2698]: I0913 00:28:23.593861 2698 apiserver.go:52] "Watching apiserver" Sep 13 00:28:23.607224 kubelet[2698]: I0913 00:28:23.607151 2698 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 13 00:28:23.637849 kubelet[2698]: E0913 00:28:23.637791 2698 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Sep 13 00:28:23.652909 kubelet[2698]: I0913 00:28:23.652842 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.652766568 podStartE2EDuration="1.652766568s" podCreationTimestamp="2025-09-13 00:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:28:23.652271955 +0000 UTC m=+1.127302197" watchObservedRunningTime="2025-09-13 00:28:23.652766568 +0000 UTC m=+1.127796810" Sep 13 00:28:23.660507 kubelet[2698]: I0913 00:28:23.660438 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.660413897 
podStartE2EDuration="1.660413897s" podCreationTimestamp="2025-09-13 00:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:28:23.660170163 +0000 UTC m=+1.135200415" watchObservedRunningTime="2025-09-13 00:28:23.660413897 +0000 UTC m=+1.135444139" Sep 13 00:28:26.399045 kubelet[2698]: I0913 00:28:26.399005 2698 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 13 00:28:26.399601 containerd[1571]: time="2025-09-13T00:28:26.399441438Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 13 00:28:26.399881 kubelet[2698]: I0913 00:28:26.399626 2698 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 13 00:28:27.256652 kubelet[2698]: I0913 00:28:27.256576 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=5.256552011 podStartE2EDuration="5.256552011s" podCreationTimestamp="2025-09-13 00:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:28:23.669713083 +0000 UTC m=+1.144743325" watchObservedRunningTime="2025-09-13 00:28:27.256552011 +0000 UTC m=+4.731582253" Sep 13 00:28:27.265776 systemd[1]: Created slice kubepods-besteffort-pod21519400_0534_4ebd_a46d_d58c423fae1f.slice - libcontainer container kubepods-besteffort-pod21519400_0534_4ebd_a46d_d58c423fae1f.slice. 
Sep 13 00:28:27.333757 kubelet[2698]: I0913 00:28:27.333719 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rcfz\" (UniqueName: \"kubernetes.io/projected/21519400-0534-4ebd-a46d-d58c423fae1f-kube-api-access-8rcfz\") pod \"kube-proxy-mvhsm\" (UID: \"21519400-0534-4ebd-a46d-d58c423fae1f\") " pod="kube-system/kube-proxy-mvhsm" Sep 13 00:28:27.333886 kubelet[2698]: I0913 00:28:27.333760 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/21519400-0534-4ebd-a46d-d58c423fae1f-kube-proxy\") pod \"kube-proxy-mvhsm\" (UID: \"21519400-0534-4ebd-a46d-d58c423fae1f\") " pod="kube-system/kube-proxy-mvhsm" Sep 13 00:28:27.333886 kubelet[2698]: I0913 00:28:27.333784 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/21519400-0534-4ebd-a46d-d58c423fae1f-xtables-lock\") pod \"kube-proxy-mvhsm\" (UID: \"21519400-0534-4ebd-a46d-d58c423fae1f\") " pod="kube-system/kube-proxy-mvhsm" Sep 13 00:28:27.333886 kubelet[2698]: I0913 00:28:27.333804 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21519400-0534-4ebd-a46d-d58c423fae1f-lib-modules\") pod \"kube-proxy-mvhsm\" (UID: \"21519400-0534-4ebd-a46d-d58c423fae1f\") " pod="kube-system/kube-proxy-mvhsm" Sep 13 00:28:27.529027 systemd[1]: Created slice kubepods-besteffort-pod4cb5f543_ab50_40fd_9ca4_f26ddf68058e.slice - libcontainer container kubepods-besteffort-pod4cb5f543_ab50_40fd_9ca4_f26ddf68058e.slice. 
Sep 13 00:28:27.535003 kubelet[2698]: I0913 00:28:27.534961 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4cb5f543-ab50-40fd-9ca4-f26ddf68058e-var-lib-calico\") pod \"tigera-operator-58fc44c59b-gttgt\" (UID: \"4cb5f543-ab50-40fd-9ca4-f26ddf68058e\") " pod="tigera-operator/tigera-operator-58fc44c59b-gttgt" Sep 13 00:28:27.535003 kubelet[2698]: I0913 00:28:27.535005 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmghm\" (UniqueName: \"kubernetes.io/projected/4cb5f543-ab50-40fd-9ca4-f26ddf68058e-kube-api-access-pmghm\") pod \"tigera-operator-58fc44c59b-gttgt\" (UID: \"4cb5f543-ab50-40fd-9ca4-f26ddf68058e\") " pod="tigera-operator/tigera-operator-58fc44c59b-gttgt" Sep 13 00:28:27.576986 containerd[1571]: time="2025-09-13T00:28:27.576936789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mvhsm,Uid:21519400-0534-4ebd-a46d-d58c423fae1f,Namespace:kube-system,Attempt:0,}" Sep 13 00:28:27.611365 containerd[1571]: time="2025-09-13T00:28:27.611291219Z" level=info msg="connecting to shim 5419ca52cb53c71b3f55556c39b5ad98f542fa943d5aebdd84906dd79cbd8c7b" address="unix:///run/containerd/s/5aa8222d602084e983533a23cf875cdd4501ce89d81e76cc8c6249d5bd590dc3" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:28:27.648546 systemd[1]: Started cri-containerd-5419ca52cb53c71b3f55556c39b5ad98f542fa943d5aebdd84906dd79cbd8c7b.scope - libcontainer container 5419ca52cb53c71b3f55556c39b5ad98f542fa943d5aebdd84906dd79cbd8c7b. 
Sep 13 00:28:27.679078 containerd[1571]: time="2025-09-13T00:28:27.679034454Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mvhsm,Uid:21519400-0534-4ebd-a46d-d58c423fae1f,Namespace:kube-system,Attempt:0,} returns sandbox id \"5419ca52cb53c71b3f55556c39b5ad98f542fa943d5aebdd84906dd79cbd8c7b\"" Sep 13 00:28:27.681640 containerd[1571]: time="2025-09-13T00:28:27.681610144Z" level=info msg="CreateContainer within sandbox \"5419ca52cb53c71b3f55556c39b5ad98f542fa943d5aebdd84906dd79cbd8c7b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 13 00:28:27.696560 containerd[1571]: time="2025-09-13T00:28:27.696515901Z" level=info msg="Container ca7a354177c12915ea8ffb8d7491fc8062d1fdea6b7542c5a0ce48f012b9e6be: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:28:27.705444 containerd[1571]: time="2025-09-13T00:28:27.705407762Z" level=info msg="CreateContainer within sandbox \"5419ca52cb53c71b3f55556c39b5ad98f542fa943d5aebdd84906dd79cbd8c7b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ca7a354177c12915ea8ffb8d7491fc8062d1fdea6b7542c5a0ce48f012b9e6be\"" Sep 13 00:28:27.705969 containerd[1571]: time="2025-09-13T00:28:27.705942497Z" level=info msg="StartContainer for \"ca7a354177c12915ea8ffb8d7491fc8062d1fdea6b7542c5a0ce48f012b9e6be\"" Sep 13 00:28:27.707459 containerd[1571]: time="2025-09-13T00:28:27.707420062Z" level=info msg="connecting to shim ca7a354177c12915ea8ffb8d7491fc8062d1fdea6b7542c5a0ce48f012b9e6be" address="unix:///run/containerd/s/5aa8222d602084e983533a23cf875cdd4501ce89d81e76cc8c6249d5bd590dc3" protocol=ttrpc version=3 Sep 13 00:28:27.735489 systemd[1]: Started cri-containerd-ca7a354177c12915ea8ffb8d7491fc8062d1fdea6b7542c5a0ce48f012b9e6be.scope - libcontainer container ca7a354177c12915ea8ffb8d7491fc8062d1fdea6b7542c5a0ce48f012b9e6be. 
Sep 13 00:28:27.782269 containerd[1571]: time="2025-09-13T00:28:27.781848399Z" level=info msg="StartContainer for \"ca7a354177c12915ea8ffb8d7491fc8062d1fdea6b7542c5a0ce48f012b9e6be\" returns successfully" Sep 13 00:28:27.833444 containerd[1571]: time="2025-09-13T00:28:27.833389165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-gttgt,Uid:4cb5f543-ab50-40fd-9ca4-f26ddf68058e,Namespace:tigera-operator,Attempt:0,}" Sep 13 00:28:27.857210 containerd[1571]: time="2025-09-13T00:28:27.857153931Z" level=info msg="connecting to shim 101f49461ddae4500b98f50273f79b9fe5ff84f8621cc52528250eb228a8e5b7" address="unix:///run/containerd/s/9c53ddcdeff3c232d295a253ee609f963c46550cd9e01aec51595131a6106b78" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:28:27.884517 systemd[1]: Started cri-containerd-101f49461ddae4500b98f50273f79b9fe5ff84f8621cc52528250eb228a8e5b7.scope - libcontainer container 101f49461ddae4500b98f50273f79b9fe5ff84f8621cc52528250eb228a8e5b7. Sep 13 00:28:28.021075 containerd[1571]: time="2025-09-13T00:28:28.021009170Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-gttgt,Uid:4cb5f543-ab50-40fd-9ca4-f26ddf68058e,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"101f49461ddae4500b98f50273f79b9fe5ff84f8621cc52528250eb228a8e5b7\"" Sep 13 00:28:28.022698 containerd[1571]: time="2025-09-13T00:28:28.022657217Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 13 00:28:28.743035 kubelet[2698]: I0913 00:28:28.742951 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-mvhsm" podStartSLOduration=1.742927355 podStartE2EDuration="1.742927355s" podCreationTimestamp="2025-09-13 00:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:28:28.741396441 +0000 UTC m=+6.216426693" watchObservedRunningTime="2025-09-13 
00:28:28.742927355 +0000 UTC m=+6.217957597" Sep 13 00:28:29.155812 update_engine[1548]: I20250913 00:28:29.155625 1548 update_attempter.cc:509] Updating boot flags... Sep 13 00:28:29.965841 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3746811615.mount: Deactivated successfully. Sep 13 00:28:31.155926 containerd[1571]: time="2025-09-13T00:28:31.155028662Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:28:31.155926 containerd[1571]: time="2025-09-13T00:28:31.155879262Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 13 00:28:31.158456 containerd[1571]: time="2025-09-13T00:28:31.158402148Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:28:31.160663 containerd[1571]: time="2025-09-13T00:28:31.160604799Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:28:31.161282 containerd[1571]: time="2025-09-13T00:28:31.161221647Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 3.138515288s" Sep 13 00:28:31.161364 containerd[1571]: time="2025-09-13T00:28:31.161283604Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 13 00:28:31.163779 containerd[1571]: time="2025-09-13T00:28:31.163750284Z" level=info 
msg="CreateContainer within sandbox \"101f49461ddae4500b98f50273f79b9fe5ff84f8621cc52528250eb228a8e5b7\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 13 00:28:31.173607 containerd[1571]: time="2025-09-13T00:28:31.173547293Z" level=info msg="Container 99a48b91e2e85669ed3bb49b4a0cd40d7331776fbb53c84c0c3d7e8e10543466: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:28:31.181466 containerd[1571]: time="2025-09-13T00:28:31.181413195Z" level=info msg="CreateContainer within sandbox \"101f49461ddae4500b98f50273f79b9fe5ff84f8621cc52528250eb228a8e5b7\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"99a48b91e2e85669ed3bb49b4a0cd40d7331776fbb53c84c0c3d7e8e10543466\"" Sep 13 00:28:31.182001 containerd[1571]: time="2025-09-13T00:28:31.181975519Z" level=info msg="StartContainer for \"99a48b91e2e85669ed3bb49b4a0cd40d7331776fbb53c84c0c3d7e8e10543466\"" Sep 13 00:28:31.183050 containerd[1571]: time="2025-09-13T00:28:31.183022281Z" level=info msg="connecting to shim 99a48b91e2e85669ed3bb49b4a0cd40d7331776fbb53c84c0c3d7e8e10543466" address="unix:///run/containerd/s/9c53ddcdeff3c232d295a253ee609f963c46550cd9e01aec51595131a6106b78" protocol=ttrpc version=3 Sep 13 00:28:31.255519 systemd[1]: Started cri-containerd-99a48b91e2e85669ed3bb49b4a0cd40d7331776fbb53c84c0c3d7e8e10543466.scope - libcontainer container 99a48b91e2e85669ed3bb49b4a0cd40d7331776fbb53c84c0c3d7e8e10543466. 
Sep 13 00:28:31.288645 containerd[1571]: time="2025-09-13T00:28:31.288602762Z" level=info msg="StartContainer for \"99a48b91e2e85669ed3bb49b4a0cd40d7331776fbb53c84c0c3d7e8e10543466\" returns successfully" Sep 13 00:28:31.697258 kubelet[2698]: I0913 00:28:31.697181 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-gttgt" podStartSLOduration=1.557041383 podStartE2EDuration="4.697159035s" podCreationTimestamp="2025-09-13 00:28:27 +0000 UTC" firstStartedPulling="2025-09-13 00:28:28.022209888 +0000 UTC m=+5.497240130" lastFinishedPulling="2025-09-13 00:28:31.16232754 +0000 UTC m=+8.637357782" observedRunningTime="2025-09-13 00:28:31.696449201 +0000 UTC m=+9.171479453" watchObservedRunningTime="2025-09-13 00:28:31.697159035 +0000 UTC m=+9.172189287" Sep 13 00:28:36.844913 sudo[1769]: pam_unix(sudo:session): session closed for user root Sep 13 00:28:36.855004 sshd[1768]: Connection closed by 10.0.0.1 port 50182 Sep 13 00:28:36.857323 sshd-session[1766]: pam_unix(sshd:session): session closed for user core Sep 13 00:28:36.869055 systemd-logind[1546]: Session 7 logged out. Waiting for processes to exit. Sep 13 00:28:36.869978 systemd[1]: sshd@6-10.0.0.95:22-10.0.0.1:50182.service: Deactivated successfully. Sep 13 00:28:36.875139 systemd[1]: session-7.scope: Deactivated successfully. Sep 13 00:28:36.875965 systemd[1]: session-7.scope: Consumed 5.737s CPU time, 225.7M memory peak. Sep 13 00:28:36.879805 systemd-logind[1546]: Removed session 7. 
Sep 13 00:28:39.921894 kubelet[2698]: I0913 00:28:39.920365 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b55b1651-436c-4cf5-8d21-8b52204f2cd8-typha-certs\") pod \"calico-typha-746d5db55b-rblhk\" (UID: \"b55b1651-436c-4cf5-8d21-8b52204f2cd8\") " pod="calico-system/calico-typha-746d5db55b-rblhk" Sep 13 00:28:39.921894 kubelet[2698]: I0913 00:28:39.920414 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r968c\" (UniqueName: \"kubernetes.io/projected/b55b1651-436c-4cf5-8d21-8b52204f2cd8-kube-api-access-r968c\") pod \"calico-typha-746d5db55b-rblhk\" (UID: \"b55b1651-436c-4cf5-8d21-8b52204f2cd8\") " pod="calico-system/calico-typha-746d5db55b-rblhk" Sep 13 00:28:39.921894 kubelet[2698]: I0913 00:28:39.920442 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b55b1651-436c-4cf5-8d21-8b52204f2cd8-tigera-ca-bundle\") pod \"calico-typha-746d5db55b-rblhk\" (UID: \"b55b1651-436c-4cf5-8d21-8b52204f2cd8\") " pod="calico-system/calico-typha-746d5db55b-rblhk" Sep 13 00:28:39.921065 systemd[1]: Created slice kubepods-besteffort-podb55b1651_436c_4cf5_8d21_8b52204f2cd8.slice - libcontainer container kubepods-besteffort-podb55b1651_436c_4cf5_8d21_8b52204f2cd8.slice. Sep 13 00:28:40.226244 containerd[1571]: time="2025-09-13T00:28:40.226191140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-746d5db55b-rblhk,Uid:b55b1651-436c-4cf5-8d21-8b52204f2cd8,Namespace:calico-system,Attempt:0,}" Sep 13 00:28:40.343670 systemd[1]: Created slice kubepods-besteffort-pod7da0003f_d9b9_464d_ab15_bb4be177bd84.slice - libcontainer container kubepods-besteffort-pod7da0003f_d9b9_464d_ab15_bb4be177bd84.slice. 
Sep 13 00:28:40.346979 containerd[1571]: time="2025-09-13T00:28:40.346913045Z" level=info msg="connecting to shim 2b2209a3e7f9fed2f77b80cd4b0fb75fdf847b33945a70a5c7ef84fd2b007a6d" address="unix:///run/containerd/s/5161bf39d331f2c746c6725757ceee04c5864a41d5f392791bdc1b80ad9e1f2a" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:28:40.375587 systemd[1]: Started cri-containerd-2b2209a3e7f9fed2f77b80cd4b0fb75fdf847b33945a70a5c7ef84fd2b007a6d.scope - libcontainer container 2b2209a3e7f9fed2f77b80cd4b0fb75fdf847b33945a70a5c7ef84fd2b007a6d. Sep 13 00:28:40.426196 kubelet[2698]: I0913 00:28:40.425400 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/7da0003f-d9b9-464d-ab15-bb4be177bd84-cni-bin-dir\") pod \"calico-node-fhjbl\" (UID: \"7da0003f-d9b9-464d-ab15-bb4be177bd84\") " pod="calico-system/calico-node-fhjbl" Sep 13 00:28:40.426196 kubelet[2698]: I0913 00:28:40.425444 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/7da0003f-d9b9-464d-ab15-bb4be177bd84-policysync\") pod \"calico-node-fhjbl\" (UID: \"7da0003f-d9b9-464d-ab15-bb4be177bd84\") " pod="calico-system/calico-node-fhjbl" Sep 13 00:28:40.426196 kubelet[2698]: I0913 00:28:40.425463 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/7da0003f-d9b9-464d-ab15-bb4be177bd84-var-run-calico\") pod \"calico-node-fhjbl\" (UID: \"7da0003f-d9b9-464d-ab15-bb4be177bd84\") " pod="calico-system/calico-node-fhjbl" Sep 13 00:28:40.426196 kubelet[2698]: I0913 00:28:40.425480 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7da0003f-d9b9-464d-ab15-bb4be177bd84-tigera-ca-bundle\") pod \"calico-node-fhjbl\" (UID: 
\"7da0003f-d9b9-464d-ab15-bb4be177bd84\") " pod="calico-system/calico-node-fhjbl" Sep 13 00:28:40.426196 kubelet[2698]: I0913 00:28:40.425497 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7da0003f-d9b9-464d-ab15-bb4be177bd84-var-lib-calico\") pod \"calico-node-fhjbl\" (UID: \"7da0003f-d9b9-464d-ab15-bb4be177bd84\") " pod="calico-system/calico-node-fhjbl" Sep 13 00:28:40.426495 kubelet[2698]: I0913 00:28:40.425516 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7da0003f-d9b9-464d-ab15-bb4be177bd84-xtables-lock\") pod \"calico-node-fhjbl\" (UID: \"7da0003f-d9b9-464d-ab15-bb4be177bd84\") " pod="calico-system/calico-node-fhjbl" Sep 13 00:28:40.426495 kubelet[2698]: I0913 00:28:40.425546 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7da0003f-d9b9-464d-ab15-bb4be177bd84-lib-modules\") pod \"calico-node-fhjbl\" (UID: \"7da0003f-d9b9-464d-ab15-bb4be177bd84\") " pod="calico-system/calico-node-fhjbl" Sep 13 00:28:40.426495 kubelet[2698]: I0913 00:28:40.425562 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/7da0003f-d9b9-464d-ab15-bb4be177bd84-cni-net-dir\") pod \"calico-node-fhjbl\" (UID: \"7da0003f-d9b9-464d-ab15-bb4be177bd84\") " pod="calico-system/calico-node-fhjbl" Sep 13 00:28:40.426495 kubelet[2698]: I0913 00:28:40.425579 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/7da0003f-d9b9-464d-ab15-bb4be177bd84-flexvol-driver-host\") pod \"calico-node-fhjbl\" (UID: \"7da0003f-d9b9-464d-ab15-bb4be177bd84\") " 
pod="calico-system/calico-node-fhjbl" Sep 13 00:28:40.426495 kubelet[2698]: I0913 00:28:40.425599 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/7da0003f-d9b9-464d-ab15-bb4be177bd84-cni-log-dir\") pod \"calico-node-fhjbl\" (UID: \"7da0003f-d9b9-464d-ab15-bb4be177bd84\") " pod="calico-system/calico-node-fhjbl" Sep 13 00:28:40.426660 containerd[1571]: time="2025-09-13T00:28:40.426215662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-746d5db55b-rblhk,Uid:b55b1651-436c-4cf5-8d21-8b52204f2cd8,Namespace:calico-system,Attempt:0,} returns sandbox id \"2b2209a3e7f9fed2f77b80cd4b0fb75fdf847b33945a70a5c7ef84fd2b007a6d\"" Sep 13 00:28:40.426710 kubelet[2698]: I0913 00:28:40.425614 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/7da0003f-d9b9-464d-ab15-bb4be177bd84-node-certs\") pod \"calico-node-fhjbl\" (UID: \"7da0003f-d9b9-464d-ab15-bb4be177bd84\") " pod="calico-system/calico-node-fhjbl" Sep 13 00:28:40.426710 kubelet[2698]: I0913 00:28:40.425635 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz8cr\" (UniqueName: \"kubernetes.io/projected/7da0003f-d9b9-464d-ab15-bb4be177bd84-kube-api-access-dz8cr\") pod \"calico-node-fhjbl\" (UID: \"7da0003f-d9b9-464d-ab15-bb4be177bd84\") " pod="calico-system/calico-node-fhjbl" Sep 13 00:28:40.428209 containerd[1571]: time="2025-09-13T00:28:40.428177211Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 13 00:28:40.533672 kubelet[2698]: E0913 00:28:40.533411 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.533672 kubelet[2698]: W0913 00:28:40.533438 2698 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.533672 kubelet[2698]: E0913 00:28:40.533462 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.536799 kubelet[2698]: E0913 00:28:40.536743 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.536799 kubelet[2698]: W0913 00:28:40.536772 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.536799 kubelet[2698]: E0913 00:28:40.536797 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.586254 kubelet[2698]: E0913 00:28:40.586157 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwvxw" podUID="8759312a-5948-41bc-8173-0a88fa7fe6fe" Sep 13 00:28:40.618015 kubelet[2698]: E0913 00:28:40.617964 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.618015 kubelet[2698]: W0913 00:28:40.617997 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.618015 kubelet[2698]: E0913 00:28:40.618023 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.618320 kubelet[2698]: E0913 00:28:40.618289 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.618320 kubelet[2698]: W0913 00:28:40.618302 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.618320 kubelet[2698]: E0913 00:28:40.618314 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.618653 kubelet[2698]: E0913 00:28:40.618603 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.618653 kubelet[2698]: W0913 00:28:40.618639 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.618734 kubelet[2698]: E0913 00:28:40.618677 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.618960 kubelet[2698]: E0913 00:28:40.618941 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.618960 kubelet[2698]: W0913 00:28:40.618953 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.618960 kubelet[2698]: E0913 00:28:40.618962 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.619244 kubelet[2698]: E0913 00:28:40.619215 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.619244 kubelet[2698]: W0913 00:28:40.619228 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.619244 kubelet[2698]: E0913 00:28:40.619238 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.619538 kubelet[2698]: E0913 00:28:40.619482 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.619538 kubelet[2698]: W0913 00:28:40.619491 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.619538 kubelet[2698]: E0913 00:28:40.619500 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.619746 kubelet[2698]: E0913 00:28:40.619729 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.619746 kubelet[2698]: W0913 00:28:40.619744 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.619814 kubelet[2698]: E0913 00:28:40.619753 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.619983 kubelet[2698]: E0913 00:28:40.619967 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.619983 kubelet[2698]: W0913 00:28:40.619979 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.620035 kubelet[2698]: E0913 00:28:40.619988 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.620232 kubelet[2698]: E0913 00:28:40.620204 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.620232 kubelet[2698]: W0913 00:28:40.620224 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.620232 kubelet[2698]: E0913 00:28:40.620234 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.620686 kubelet[2698]: E0913 00:28:40.620563 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.620686 kubelet[2698]: W0913 00:28:40.620575 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.620686 kubelet[2698]: E0913 00:28:40.620588 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.620956 kubelet[2698]: E0913 00:28:40.620851 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.620956 kubelet[2698]: W0913 00:28:40.620864 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.620956 kubelet[2698]: E0913 00:28:40.620875 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.621138 kubelet[2698]: E0913 00:28:40.621112 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.621170 kubelet[2698]: W0913 00:28:40.621137 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.621194 kubelet[2698]: E0913 00:28:40.621168 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.621504 kubelet[2698]: E0913 00:28:40.621473 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.621580 kubelet[2698]: W0913 00:28:40.621548 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.621580 kubelet[2698]: E0913 00:28:40.621563 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.621807 kubelet[2698]: E0913 00:28:40.621790 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.621807 kubelet[2698]: W0913 00:28:40.621803 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.621875 kubelet[2698]: E0913 00:28:40.621816 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.622018 kubelet[2698]: E0913 00:28:40.622000 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.622018 kubelet[2698]: W0913 00:28:40.622013 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.622067 kubelet[2698]: E0913 00:28:40.622022 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.622214 kubelet[2698]: E0913 00:28:40.622197 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.622214 kubelet[2698]: W0913 00:28:40.622210 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.622260 kubelet[2698]: E0913 00:28:40.622220 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.622448 kubelet[2698]: E0913 00:28:40.622432 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.622448 kubelet[2698]: W0913 00:28:40.622444 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.622511 kubelet[2698]: E0913 00:28:40.622455 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.622654 kubelet[2698]: E0913 00:28:40.622634 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.622654 kubelet[2698]: W0913 00:28:40.622649 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.622718 kubelet[2698]: E0913 00:28:40.622659 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.622841 kubelet[2698]: E0913 00:28:40.622824 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.622841 kubelet[2698]: W0913 00:28:40.622835 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.622887 kubelet[2698]: E0913 00:28:40.622845 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.623030 kubelet[2698]: E0913 00:28:40.623013 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.623057 kubelet[2698]: W0913 00:28:40.623026 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.623057 kubelet[2698]: E0913 00:28:40.623041 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.627439 kubelet[2698]: E0913 00:28:40.627410 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.627439 kubelet[2698]: W0913 00:28:40.627426 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.627439 kubelet[2698]: E0913 00:28:40.627438 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.627551 kubelet[2698]: I0913 00:28:40.627470 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8759312a-5948-41bc-8173-0a88fa7fe6fe-kubelet-dir\") pod \"csi-node-driver-xwvxw\" (UID: \"8759312a-5948-41bc-8173-0a88fa7fe6fe\") " pod="calico-system/csi-node-driver-xwvxw" Sep 13 00:28:40.627731 kubelet[2698]: E0913 00:28:40.627704 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.627731 kubelet[2698]: W0913 00:28:40.627720 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.627797 kubelet[2698]: E0913 00:28:40.627737 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.627797 kubelet[2698]: I0913 00:28:40.627755 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8759312a-5948-41bc-8173-0a88fa7fe6fe-socket-dir\") pod \"csi-node-driver-xwvxw\" (UID: \"8759312a-5948-41bc-8173-0a88fa7fe6fe\") " pod="calico-system/csi-node-driver-xwvxw" Sep 13 00:28:40.628012 kubelet[2698]: E0913 00:28:40.627981 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.628012 kubelet[2698]: W0913 00:28:40.628000 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.628077 kubelet[2698]: E0913 00:28:40.628017 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.628077 kubelet[2698]: I0913 00:28:40.628050 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwhh2\" (UniqueName: \"kubernetes.io/projected/8759312a-5948-41bc-8173-0a88fa7fe6fe-kube-api-access-hwhh2\") pod \"csi-node-driver-xwvxw\" (UID: \"8759312a-5948-41bc-8173-0a88fa7fe6fe\") " pod="calico-system/csi-node-driver-xwvxw" Sep 13 00:28:40.628243 kubelet[2698]: E0913 00:28:40.628225 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.628243 kubelet[2698]: W0913 00:28:40.628240 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.628295 kubelet[2698]: E0913 00:28:40.628260 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.628492 kubelet[2698]: E0913 00:28:40.628475 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.628492 kubelet[2698]: W0913 00:28:40.628488 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.628564 kubelet[2698]: E0913 00:28:40.628502 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.628728 kubelet[2698]: E0913 00:28:40.628708 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.628728 kubelet[2698]: W0913 00:28:40.628720 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.628784 kubelet[2698]: E0913 00:28:40.628734 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.628942 kubelet[2698]: E0913 00:28:40.628926 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.628971 kubelet[2698]: W0913 00:28:40.628944 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.628971 kubelet[2698]: E0913 00:28:40.628958 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.629160 kubelet[2698]: E0913 00:28:40.629138 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.629160 kubelet[2698]: W0913 00:28:40.629154 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.629227 kubelet[2698]: E0913 00:28:40.629174 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.629227 kubelet[2698]: I0913 00:28:40.629195 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8759312a-5948-41bc-8173-0a88fa7fe6fe-registration-dir\") pod \"csi-node-driver-xwvxw\" (UID: \"8759312a-5948-41bc-8173-0a88fa7fe6fe\") " pod="calico-system/csi-node-driver-xwvxw" Sep 13 00:28:40.629432 kubelet[2698]: E0913 00:28:40.629413 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.629432 kubelet[2698]: W0913 00:28:40.629429 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.629478 kubelet[2698]: E0913 00:28:40.629447 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.629677 kubelet[2698]: E0913 00:28:40.629657 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.629677 kubelet[2698]: W0913 00:28:40.629669 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.629758 kubelet[2698]: E0913 00:28:40.629682 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.629891 kubelet[2698]: E0913 00:28:40.629875 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.629891 kubelet[2698]: W0913 00:28:40.629886 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.629944 kubelet[2698]: E0913 00:28:40.629900 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.630105 kubelet[2698]: E0913 00:28:40.630089 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.630105 kubelet[2698]: W0913 00:28:40.630100 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.630154 kubelet[2698]: E0913 00:28:40.630113 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.630361 kubelet[2698]: E0913 00:28:40.630322 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.630399 kubelet[2698]: W0913 00:28:40.630367 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.630399 kubelet[2698]: E0913 00:28:40.630378 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.630454 kubelet[2698]: I0913 00:28:40.630402 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/8759312a-5948-41bc-8173-0a88fa7fe6fe-varrun\") pod \"csi-node-driver-xwvxw\" (UID: \"8759312a-5948-41bc-8173-0a88fa7fe6fe\") " pod="calico-system/csi-node-driver-xwvxw" Sep 13 00:28:40.630661 kubelet[2698]: E0913 00:28:40.630638 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.630661 kubelet[2698]: W0913 00:28:40.630651 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.630661 kubelet[2698]: E0913 00:28:40.630662 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.630858 kubelet[2698]: E0913 00:28:40.630843 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.630858 kubelet[2698]: W0913 00:28:40.630854 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.630914 kubelet[2698]: E0913 00:28:40.630864 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.649947 containerd[1571]: time="2025-09-13T00:28:40.649895396Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fhjbl,Uid:7da0003f-d9b9-464d-ab15-bb4be177bd84,Namespace:calico-system,Attempt:0,}" Sep 13 00:28:40.688561 containerd[1571]: time="2025-09-13T00:28:40.688467445Z" level=info msg="connecting to shim 3b3edfc04d44b1d1479e536192778f93dfc75b0714dec0b2d65df308a492d165" address="unix:///run/containerd/s/4327f4a70424b9273dffc5ad31f2b3ec86e2b124ecbd383919075ca1a106986c" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:28:40.731536 kubelet[2698]: E0913 00:28:40.731474 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.731536 kubelet[2698]: W0913 00:28:40.731498 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.731536 kubelet[2698]: E0913 00:28:40.731531 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.731754 kubelet[2698]: E0913 00:28:40.731734 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.731754 kubelet[2698]: W0913 00:28:40.731742 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.731814 kubelet[2698]: E0913 00:28:40.731767 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:40.732062 kubelet[2698]: E0913 00:28:40.732010 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.732062 kubelet[2698]: W0913 00:28:40.732029 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.732062 kubelet[2698]: E0913 00:28:40.732047 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.732264 kubelet[2698]: E0913 00:28:40.732240 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:40.732264 kubelet[2698]: W0913 00:28:40.732252 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:40.732264 kubelet[2698]: E0913 00:28:40.732265 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:40.732537 systemd[1]: Started cri-containerd-3b3edfc04d44b1d1479e536192778f93dfc75b0714dec0b2d65df308a492d165.scope - libcontainer container 3b3edfc04d44b1d1479e536192778f93dfc75b0714dec0b2d65df308a492d165. 
Sep 13 00:28:40.732668 kubelet[2698]: E0913 00:28:40.732547 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.732668 kubelet[2698]: W0913 00:28:40.732556 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.732668 kubelet[2698]: E0913 00:28:40.732566 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.732813 kubelet[2698]: E0913 00:28:40.732801 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.732813 kubelet[2698]: W0913 00:28:40.732810 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.732875 kubelet[2698]: E0913 00:28:40.732820 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.733025 kubelet[2698]: E0913 00:28:40.733006 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.733025 kubelet[2698]: W0913 00:28:40.733016 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.733025 kubelet[2698]: E0913 00:28:40.733026 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.733228 kubelet[2698]: E0913 00:28:40.733212 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.733228 kubelet[2698]: W0913 00:28:40.733223 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.733299 kubelet[2698]: E0913 00:28:40.733281 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.733565 kubelet[2698]: E0913 00:28:40.733546 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.733565 kubelet[2698]: W0913 00:28:40.733559 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.733683 kubelet[2698]: E0913 00:28:40.733648 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.733743 kubelet[2698]: E0913 00:28:40.733726 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.733743 kubelet[2698]: W0913 00:28:40.733738 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.733797 kubelet[2698]: E0913 00:28:40.733758 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.733936 kubelet[2698]: E0913 00:28:40.733920 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.733936 kubelet[2698]: W0913 00:28:40.733932 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.733988 kubelet[2698]: E0913 00:28:40.733945 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.734133 kubelet[2698]: E0913 00:28:40.734117 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.734133 kubelet[2698]: W0913 00:28:40.734128 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.734189 kubelet[2698]: E0913 00:28:40.734145 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.734424 kubelet[2698]: E0913 00:28:40.734398 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.734424 kubelet[2698]: W0913 00:28:40.734415 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.734487 kubelet[2698]: E0913 00:28:40.734428 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.734887 kubelet[2698]: E0913 00:28:40.734655 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.734887 kubelet[2698]: W0913 00:28:40.734671 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.734887 kubelet[2698]: E0913 00:28:40.734693 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.735487 kubelet[2698]: E0913 00:28:40.735446 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.735487 kubelet[2698]: W0913 00:28:40.735475 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.735741 kubelet[2698]: E0913 00:28:40.735551 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.735779 kubelet[2698]: E0913 00:28:40.735754 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.735779 kubelet[2698]: W0913 00:28:40.735764 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.735837 kubelet[2698]: E0913 00:28:40.735823 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.736022 kubelet[2698]: E0913 00:28:40.736000 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.736022 kubelet[2698]: W0913 00:28:40.736016 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.736103 kubelet[2698]: E0913 00:28:40.736062 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.736271 kubelet[2698]: E0913 00:28:40.736253 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.736271 kubelet[2698]: W0913 00:28:40.736265 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.736369 kubelet[2698]: E0913 00:28:40.736281 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.736549 kubelet[2698]: E0913 00:28:40.736530 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.736549 kubelet[2698]: W0913 00:28:40.736543 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.736623 kubelet[2698]: E0913 00:28:40.736558 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.736854 kubelet[2698]: E0913 00:28:40.736836 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.736854 kubelet[2698]: W0913 00:28:40.736848 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.736929 kubelet[2698]: E0913 00:28:40.736862 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.737376 kubelet[2698]: E0913 00:28:40.737315 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.737376 kubelet[2698]: W0913 00:28:40.737372 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.737475 kubelet[2698]: E0913 00:28:40.737394 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.737860 kubelet[2698]: E0913 00:28:40.737826 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.737860 kubelet[2698]: W0913 00:28:40.737840 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.737979 kubelet[2698]: E0913 00:28:40.737959 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.738321 kubelet[2698]: E0913 00:28:40.738286 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.738321 kubelet[2698]: W0913 00:28:40.738300 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.738321 kubelet[2698]: E0913 00:28:40.738321 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.738778 kubelet[2698]: E0913 00:28:40.738717 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.738778 kubelet[2698]: W0913 00:28:40.738733 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.738891 kubelet[2698]: E0913 00:28:40.738744 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.739316 kubelet[2698]: E0913 00:28:40.739265 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.739316 kubelet[2698]: W0913 00:28:40.739302 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.739316 kubelet[2698]: E0913 00:28:40.739316 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.748719 kubelet[2698]: E0913 00:28:40.748679 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:40.748719 kubelet[2698]: W0913 00:28:40.748701 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:40.748719 kubelet[2698]: E0913 00:28:40.748727 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:40.768989 containerd[1571]: time="2025-09-13T00:28:40.768942924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fhjbl,Uid:7da0003f-d9b9-464d-ab15-bb4be177bd84,Namespace:calico-system,Attempt:0,} returns sandbox id \"3b3edfc04d44b1d1479e536192778f93dfc75b0714dec0b2d65df308a492d165\""
Sep 13 00:28:42.305789 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount491632291.mount: Deactivated successfully.
Sep 13 00:28:42.621678 kubelet[2698]: E0913 00:28:42.620970 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwvxw" podUID="8759312a-5948-41bc-8173-0a88fa7fe6fe"
Sep 13 00:28:43.332768 containerd[1571]: time="2025-09-13T00:28:43.332694890Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:43.333908 containerd[1571]: time="2025-09-13T00:28:43.333876216Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 13 00:28:43.335256 containerd[1571]: time="2025-09-13T00:28:43.335220909Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:43.337525 containerd[1571]: time="2025-09-13T00:28:43.337486788Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:43.338231 containerd[1571]: time="2025-09-13T00:28:43.338104942Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.909803909s"
Sep 13 00:28:43.338231 containerd[1571]: time="2025-09-13T00:28:43.338139617Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 13 00:28:43.341362 containerd[1571]: time="2025-09-13T00:28:43.341316652Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 13 00:28:43.353317 containerd[1571]: time="2025-09-13T00:28:43.353256635Z" level=info msg="CreateContainer within sandbox \"2b2209a3e7f9fed2f77b80cd4b0fb75fdf847b33945a70a5c7ef84fd2b007a6d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 13 00:28:43.362823 containerd[1571]: time="2025-09-13T00:28:43.362770086Z" level=info msg="Container 0b6b9523abacd1d924cdd702f031c1bdd43ea6df4f658f4f7fb192a6cddd6902: CDI devices from CRI Config.CDIDevices: []"
Sep 13 00:28:43.372191 containerd[1571]: time="2025-09-13T00:28:43.372147602Z" level=info msg="CreateContainer within sandbox \"2b2209a3e7f9fed2f77b80cd4b0fb75fdf847b33945a70a5c7ef84fd2b007a6d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"0b6b9523abacd1d924cdd702f031c1bdd43ea6df4f658f4f7fb192a6cddd6902\""
Sep 13 00:28:43.372608 containerd[1571]: time="2025-09-13T00:28:43.372576860Z" level=info msg="StartContainer for \"0b6b9523abacd1d924cdd702f031c1bdd43ea6df4f658f4f7fb192a6cddd6902\""
Sep 13 00:28:43.373787 containerd[1571]: time="2025-09-13T00:28:43.373762754Z" level=info msg="connecting to shim 0b6b9523abacd1d924cdd702f031c1bdd43ea6df4f658f4f7fb192a6cddd6902" address="unix:///run/containerd/s/5161bf39d331f2c746c6725757ceee04c5864a41d5f392791bdc1b80ad9e1f2a" protocol=ttrpc version=3
Sep 13 00:28:43.397500 systemd[1]: Started cri-containerd-0b6b9523abacd1d924cdd702f031c1bdd43ea6df4f658f4f7fb192a6cddd6902.scope - libcontainer container 0b6b9523abacd1d924cdd702f031c1bdd43ea6df4f658f4f7fb192a6cddd6902.
Sep 13 00:28:43.460456 containerd[1571]: time="2025-09-13T00:28:43.460382278Z" level=info msg="StartContainer for \"0b6b9523abacd1d924cdd702f031c1bdd43ea6df4f658f4f7fb192a6cddd6902\" returns successfully"
Sep 13 00:28:43.744626 kubelet[2698]: I0913 00:28:43.744275 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-746d5db55b-rblhk" podStartSLOduration=1.830826993 podStartE2EDuration="4.744129101s" podCreationTimestamp="2025-09-13 00:28:39 +0000 UTC" firstStartedPulling="2025-09-13 00:28:40.427813625 +0000 UTC m=+17.902843867" lastFinishedPulling="2025-09-13 00:28:43.341115713 +0000 UTC m=+20.816145975" observedRunningTime="2025-09-13 00:28:43.73801347 +0000 UTC m=+21.213043712" watchObservedRunningTime="2025-09-13 00:28:43.744129101 +0000 UTC m=+21.219159353"
Sep 13 00:28:43.756559 kubelet[2698]: E0913 00:28:43.753426 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.756559 kubelet[2698]: W0913 00:28:43.753910 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.759731 kubelet[2698]: E0913 00:28:43.758612 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.759731 kubelet[2698]: E0913 00:28:43.759481 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.759731 kubelet[2698]: W0913 00:28:43.759520 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.759731 kubelet[2698]: E0913 00:28:43.759539 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.761048 kubelet[2698]: E0913 00:28:43.760459 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.761048 kubelet[2698]: W0913 00:28:43.760484 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.761048 kubelet[2698]: E0913 00:28:43.760498 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.762002 kubelet[2698]: E0913 00:28:43.761395 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.762002 kubelet[2698]: W0913 00:28:43.761414 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.762002 kubelet[2698]: E0913 00:28:43.761447 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.762002 kubelet[2698]: E0913 00:28:43.761844 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.762002 kubelet[2698]: W0913 00:28:43.761856 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.762002 kubelet[2698]: E0913 00:28:43.761867 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.768226 kubelet[2698]: E0913 00:28:43.766038 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.768226 kubelet[2698]: W0913 00:28:43.766077 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.768226 kubelet[2698]: E0913 00:28:43.766107 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.768226 kubelet[2698]: E0913 00:28:43.766572 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.768226 kubelet[2698]: W0913 00:28:43.766585 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.768226 kubelet[2698]: E0913 00:28:43.766625 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.785030 kubelet[2698]: E0913 00:28:43.784520 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.785030 kubelet[2698]: W0913 00:28:43.784554 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.785030 kubelet[2698]: E0913 00:28:43.784620 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.790511 kubelet[2698]: E0913 00:28:43.789126 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.790511 kubelet[2698]: W0913 00:28:43.789156 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.790511 kubelet[2698]: E0913 00:28:43.790013 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.791657 kubelet[2698]: E0913 00:28:43.791206 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.791657 kubelet[2698]: W0913 00:28:43.791226 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.791657 kubelet[2698]: E0913 00:28:43.791239 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.793205 kubelet[2698]: E0913 00:28:43.792546 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.793205 kubelet[2698]: W0913 00:28:43.792737 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.794147 kubelet[2698]: E0913 00:28:43.793665 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.797705 kubelet[2698]: E0913 00:28:43.797207 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.797705 kubelet[2698]: W0913 00:28:43.797375 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.797705 kubelet[2698]: E0913 00:28:43.797399 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.798740 kubelet[2698]: E0913 00:28:43.798030 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.798740 kubelet[2698]: W0913 00:28:43.798200 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.799756 kubelet[2698]: E0913 00:28:43.799278 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.800923 kubelet[2698]: E0913 00:28:43.800504 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.801705 kubelet[2698]: W0913 00:28:43.801544 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.802712 kubelet[2698]: E0913 00:28:43.801793 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.802712 kubelet[2698]: E0913 00:28:43.802038 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.802712 kubelet[2698]: W0913 00:28:43.802046 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.802712 kubelet[2698]: E0913 00:28:43.802055 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.802712 kubelet[2698]: E0913 00:28:43.802488 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.802712 kubelet[2698]: W0913 00:28:43.802499 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.802712 kubelet[2698]: E0913 00:28:43.802509 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.806362 kubelet[2698]: E0913 00:28:43.804216 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.806362 kubelet[2698]: W0913 00:28:43.804487 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.806362 kubelet[2698]: E0913 00:28:43.805566 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.806362 kubelet[2698]: E0913 00:28:43.805874 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.806362 kubelet[2698]: W0913 00:28:43.805930 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.806362 kubelet[2698]: E0913 00:28:43.805981 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.806362 kubelet[2698]: E0913 00:28:43.806179 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.806362 kubelet[2698]: W0913 00:28:43.806192 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.806362 kubelet[2698]: E0913 00:28:43.806330 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.806723 kubelet[2698]: E0913 00:28:43.806428 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.806723 kubelet[2698]: W0913 00:28:43.806437 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.806723 kubelet[2698]: E0913 00:28:43.806448 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.806723 kubelet[2698]: E0913 00:28:43.806681 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.806723 kubelet[2698]: W0913 00:28:43.806691 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.806723 kubelet[2698]: E0913 00:28:43.806702 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.806973 kubelet[2698]: E0913 00:28:43.806918 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.806973 kubelet[2698]: W0913 00:28:43.806936 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.806973 kubelet[2698]: E0913 00:28:43.806947 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.808365 kubelet[2698]: E0913 00:28:43.807850 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.808365 kubelet[2698]: W0913 00:28:43.808142 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.808365 kubelet[2698]: E0913 00:28:43.808197 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.808540 kubelet[2698]: E0913 00:28:43.808386 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.808540 kubelet[2698]: W0913 00:28:43.808395 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.808540 kubelet[2698]: E0913 00:28:43.808446 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:43.811526 kubelet[2698]: E0913 00:28:43.810417 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:43.811526 kubelet[2698]: W0913 00:28:43.810435 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:43.811526 kubelet[2698]: E0913 00:28:43.810453 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 13 00:28:43.811526 kubelet[2698]: E0913 00:28:43.810802 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:43.811526 kubelet[2698]: W0913 00:28:43.810812 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:43.811526 kubelet[2698]: E0913 00:28:43.810888 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:43.811526 kubelet[2698]: E0913 00:28:43.811119 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:43.811526 kubelet[2698]: W0913 00:28:43.811129 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:43.811526 kubelet[2698]: E0913 00:28:43.811221 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:43.811900 kubelet[2698]: E0913 00:28:43.811826 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:43.811900 kubelet[2698]: W0913 00:28:43.811836 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:43.811977 kubelet[2698]: E0913 00:28:43.811929 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:43.812809 kubelet[2698]: E0913 00:28:43.812489 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:43.812809 kubelet[2698]: W0913 00:28:43.812507 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:43.812809 kubelet[2698]: E0913 00:28:43.812521 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:43.813877 kubelet[2698]: E0913 00:28:43.813835 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:43.813877 kubelet[2698]: W0913 00:28:43.813851 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:43.813981 kubelet[2698]: E0913 00:28:43.813905 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:43.816013 kubelet[2698]: E0913 00:28:43.815975 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:43.816013 kubelet[2698]: W0913 00:28:43.815992 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:43.816013 kubelet[2698]: E0913 00:28:43.816002 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:43.816629 kubelet[2698]: E0913 00:28:43.816592 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:43.816629 kubelet[2698]: W0913 00:28:43.816607 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:43.816629 kubelet[2698]: E0913 00:28:43.816618 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:43.818316 kubelet[2698]: E0913 00:28:43.818273 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:43.818316 kubelet[2698]: W0913 00:28:43.818289 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:43.818316 kubelet[2698]: E0913 00:28:43.818300 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.630614 kubelet[2698]: E0913 00:28:44.627833 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwvxw" podUID="8759312a-5948-41bc-8173-0a88fa7fe6fe" Sep 13 00:28:44.706558 kubelet[2698]: I0913 00:28:44.704845 2698 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 13 00:28:44.730895 kubelet[2698]: E0913 00:28:44.730422 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.730895 kubelet[2698]: W0913 00:28:44.730468 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.730895 kubelet[2698]: E0913 00:28:44.730500 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.732088 kubelet[2698]: E0913 00:28:44.731916 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.732088 kubelet[2698]: W0913 00:28:44.731939 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.732088 kubelet[2698]: E0913 00:28:44.731960 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.734699 kubelet[2698]: E0913 00:28:44.733707 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.734699 kubelet[2698]: W0913 00:28:44.734531 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.735234 kubelet[2698]: E0913 00:28:44.734861 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.735234 kubelet[2698]: E0913 00:28:44.735090 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.735234 kubelet[2698]: W0913 00:28:44.735098 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.735234 kubelet[2698]: E0913 00:28:44.735108 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.737110 kubelet[2698]: E0913 00:28:44.737083 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.737503 kubelet[2698]: W0913 00:28:44.737265 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.737503 kubelet[2698]: E0913 00:28:44.737289 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.737876 kubelet[2698]: E0913 00:28:44.737822 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.737876 kubelet[2698]: W0913 00:28:44.737835 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.737876 kubelet[2698]: E0913 00:28:44.737846 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.738523 kubelet[2698]: E0913 00:28:44.738003 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.738523 kubelet[2698]: W0913 00:28:44.738011 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.738523 kubelet[2698]: E0913 00:28:44.738021 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.738523 kubelet[2698]: E0913 00:28:44.738178 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.738523 kubelet[2698]: W0913 00:28:44.738185 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.738523 kubelet[2698]: E0913 00:28:44.738194 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.738793 kubelet[2698]: E0913 00:28:44.738735 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.738793 kubelet[2698]: W0913 00:28:44.738749 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.738793 kubelet[2698]: E0913 00:28:44.738761 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.741803 kubelet[2698]: E0913 00:28:44.741743 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.741803 kubelet[2698]: W0913 00:28:44.741796 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.741946 kubelet[2698]: E0913 00:28:44.741818 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.743946 kubelet[2698]: E0913 00:28:44.743865 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.743946 kubelet[2698]: W0913 00:28:44.743888 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.743946 kubelet[2698]: E0913 00:28:44.743922 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.744442 kubelet[2698]: E0913 00:28:44.744253 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.744442 kubelet[2698]: W0913 00:28:44.744264 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.744442 kubelet[2698]: E0913 00:28:44.744279 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.744886 kubelet[2698]: E0913 00:28:44.744627 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.744886 kubelet[2698]: W0913 00:28:44.744639 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.744886 kubelet[2698]: E0913 00:28:44.744654 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.745973 kubelet[2698]: E0913 00:28:44.744931 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.745973 kubelet[2698]: W0913 00:28:44.744943 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.745973 kubelet[2698]: E0913 00:28:44.744955 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.745973 kubelet[2698]: E0913 00:28:44.745230 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.745973 kubelet[2698]: W0913 00:28:44.745240 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.745973 kubelet[2698]: E0913 00:28:44.745273 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.827372 kubelet[2698]: E0913 00:28:44.826218 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.827372 kubelet[2698]: W0913 00:28:44.826254 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.827372 kubelet[2698]: E0913 00:28:44.826283 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.828108 kubelet[2698]: E0913 00:28:44.828070 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.828108 kubelet[2698]: W0913 00:28:44.828089 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.828325 kubelet[2698]: E0913 00:28:44.828261 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.828823 kubelet[2698]: E0913 00:28:44.828791 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.829363 kubelet[2698]: W0913 00:28:44.828937 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.829549 kubelet[2698]: E0913 00:28:44.829471 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.830125 kubelet[2698]: E0913 00:28:44.830030 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.830125 kubelet[2698]: W0913 00:28:44.830044 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.830125 kubelet[2698]: E0913 00:28:44.830072 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.832915 kubelet[2698]: E0913 00:28:44.832889 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.833146 kubelet[2698]: W0913 00:28:44.833030 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.833600 kubelet[2698]: E0913 00:28:44.833494 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.833985 kubelet[2698]: E0913 00:28:44.833901 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.833985 kubelet[2698]: W0913 00:28:44.833916 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.833985 kubelet[2698]: E0913 00:28:44.833935 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.834764 kubelet[2698]: E0913 00:28:44.834644 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.834764 kubelet[2698]: W0913 00:28:44.834659 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.834916 kubelet[2698]: E0913 00:28:44.834896 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.835118 kubelet[2698]: E0913 00:28:44.835082 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.835118 kubelet[2698]: W0913 00:28:44.835098 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.835316 kubelet[2698]: E0913 00:28:44.835297 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.836049 kubelet[2698]: E0913 00:28:44.836032 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.836224 kubelet[2698]: W0913 00:28:44.836206 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.837502 kubelet[2698]: E0913 00:28:44.837464 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.838037 kubelet[2698]: E0913 00:28:44.838004 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.838037 kubelet[2698]: W0913 00:28:44.838020 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.838644 kubelet[2698]: E0913 00:28:44.838300 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.838896 kubelet[2698]: E0913 00:28:44.838881 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.839411 kubelet[2698]: W0913 00:28:44.838977 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.839818 kubelet[2698]: E0913 00:28:44.839799 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.841706 kubelet[2698]: E0913 00:28:44.841682 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.841824 kubelet[2698]: W0913 00:28:44.841786 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.842099 kubelet[2698]: E0913 00:28:44.842058 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.844366 kubelet[2698]: E0913 00:28:44.844057 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.844366 kubelet[2698]: W0913 00:28:44.844092 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.844366 kubelet[2698]: E0913 00:28:44.844209 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:28:44.845406 kubelet[2698]: E0913 00:28:44.845296 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.845592 kubelet[2698]: W0913 00:28:44.845561 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.846548 kubelet[2698]: E0913 00:28:44.845708 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:28:44.847314 kubelet[2698]: E0913 00:28:44.846863 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:28:44.847314 kubelet[2698]: W0913 00:28:44.846879 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:28:44.847314 kubelet[2698]: E0913 00:28:44.846896 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 13 00:28:44.848150 kubelet[2698]: E0913 00:28:44.847578 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:44.848150 kubelet[2698]: W0913 00:28:44.847613 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:44.848150 kubelet[2698]: E0913 00:28:44.847648 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:44.848745 kubelet[2698]: E0913 00:28:44.848727 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:44.848857 kubelet[2698]: W0913 00:28:44.848839 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:44.849859 kubelet[2698]: E0913 00:28:44.849581 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:44.851325 kubelet[2698]: E0913 00:28:44.850712 2698 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:28:44.851325 kubelet[2698]: W0913 00:28:44.850759 2698 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:28:44.851325 kubelet[2698]: E0913 00:28:44.850775 2698 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:28:45.055298 containerd[1571]: time="2025-09-13T00:28:45.055139115Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:45.057718 containerd[1571]: time="2025-09-13T00:28:45.057354427Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660"
Sep 13 00:28:45.059491 containerd[1571]: time="2025-09-13T00:28:45.058997490Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:45.067630 containerd[1571]: time="2025-09-13T00:28:45.067574439Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:45.068316 containerd[1571]: time="2025-09-13T00:28:45.068288293Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.726792103s"
Sep 13 00:28:45.068410 containerd[1571]: time="2025-09-13T00:28:45.068394173Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\""
Sep 13 00:28:45.076422 containerd[1571]: time="2025-09-13T00:28:45.075872182Z" level=info msg="CreateContainer within sandbox \"3b3edfc04d44b1d1479e536192778f93dfc75b0714dec0b2d65df308a492d165\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Sep 13 00:28:45.128024 containerd[1571]: time="2025-09-13T00:28:45.125683447Z" level=info msg="Container 0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f: CDI devices from CRI Config.CDIDevices: []"
Sep 13 00:28:45.153715 containerd[1571]: time="2025-09-13T00:28:45.153592365Z" level=info msg="CreateContainer within sandbox \"3b3edfc04d44b1d1479e536192778f93dfc75b0714dec0b2d65df308a492d165\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f\""
Sep 13 00:28:45.156360 containerd[1571]: time="2025-09-13T00:28:45.156237155Z" level=info msg="StartContainer for \"0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f\""
Sep 13 00:28:45.158128 containerd[1571]: time="2025-09-13T00:28:45.158062953Z" level=info msg="connecting to shim 0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f" address="unix:///run/containerd/s/4327f4a70424b9273dffc5ad31f2b3ec86e2b124ecbd383919075ca1a106986c" protocol=ttrpc version=3
Sep 13 00:28:45.232594 systemd[1]: Started cri-containerd-0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f.scope - libcontainer container 0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f.
Sep 13 00:28:45.331699 containerd[1571]: time="2025-09-13T00:28:45.330768517Z" level=info msg="StartContainer for \"0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f\" returns successfully"
Sep 13 00:28:45.341129 systemd[1]: cri-containerd-0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f.scope: Deactivated successfully.
Sep 13 00:28:45.346672 containerd[1571]: time="2025-09-13T00:28:45.341317979Z" level=info msg="received exit event container_id:\"0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f\" id:\"0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f\" pid:3434 exited_at:{seconds:1757723325 nanos:340533121}"
Sep 13 00:28:45.346672 containerd[1571]: time="2025-09-13T00:28:45.342732482Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f\" id:\"0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f\" pid:3434 exited_at:{seconds:1757723325 nanos:340533121}"
Sep 13 00:28:45.459884 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0f3b081d98d9e8f6a328f56adda635a0b9a99f7342c08dd1e8986fe771745a6f-rootfs.mount: Deactivated successfully.
Sep 13 00:28:46.619239 kubelet[2698]: E0913 00:28:46.619179 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwvxw" podUID="8759312a-5948-41bc-8173-0a88fa7fe6fe"
Sep 13 00:28:46.717616 containerd[1571]: time="2025-09-13T00:28:46.717426630Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\""
Sep 13 00:28:48.618729 kubelet[2698]: E0913 00:28:48.618603 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwvxw" podUID="8759312a-5948-41bc-8173-0a88fa7fe6fe"
Sep 13 00:28:50.619623 kubelet[2698]: E0913 00:28:50.619540 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwvxw" podUID="8759312a-5948-41bc-8173-0a88fa7fe6fe"
Sep 13 00:28:51.228707 containerd[1571]: time="2025-09-13T00:28:51.228566172Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:51.231892 containerd[1571]: time="2025-09-13T00:28:51.231480212Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613"
Sep 13 00:28:51.234125 containerd[1571]: time="2025-09-13T00:28:51.234056036Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:51.240359 containerd[1571]: time="2025-09-13T00:28:51.240173520Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:28:51.241194 containerd[1571]: time="2025-09-13T00:28:51.241077570Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 4.523602859s"
Sep 13 00:28:51.241194 containerd[1571]: time="2025-09-13T00:28:51.241127955Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\""
Sep 13 00:28:51.249884 containerd[1571]: time="2025-09-13T00:28:51.249731384Z" level=info msg="CreateContainer within sandbox \"3b3edfc04d44b1d1479e536192778f93dfc75b0714dec0b2d65df308a492d165\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Sep 13 00:28:51.282860 containerd[1571]: time="2025-09-13T00:28:51.282716003Z" level=info msg="Container b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8: CDI devices from CRI Config.CDIDevices: []"
Sep 13 00:28:51.305759 containerd[1571]: time="2025-09-13T00:28:51.305550103Z" level=info msg="CreateContainer within sandbox \"3b3edfc04d44b1d1479e536192778f93dfc75b0714dec0b2d65df308a492d165\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8\""
Sep 13 00:28:51.306956 containerd[1571]: time="2025-09-13T00:28:51.306525908Z" level=info msg="StartContainer for \"b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8\""
Sep 13 00:28:51.308617 containerd[1571]: time="2025-09-13T00:28:51.308380757Z" level=info msg="connecting to shim b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8" address="unix:///run/containerd/s/4327f4a70424b9273dffc5ad31f2b3ec86e2b124ecbd383919075ca1a106986c" protocol=ttrpc version=3
Sep 13 00:28:51.353579 systemd[1]: Started cri-containerd-b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8.scope - libcontainer container b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8.
Sep 13 00:28:51.446642 containerd[1571]: time="2025-09-13T00:28:51.446530403Z" level=info msg="StartContainer for \"b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8\" returns successfully"
Sep 13 00:28:52.619703 kubelet[2698]: E0913 00:28:52.619614 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwvxw" podUID="8759312a-5948-41bc-8173-0a88fa7fe6fe"
Sep 13 00:28:52.894766 containerd[1571]: time="2025-09-13T00:28:52.894598934Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 13 00:28:52.899152 systemd[1]: cri-containerd-b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8.scope: Deactivated successfully.
Sep 13 00:28:52.901390 containerd[1571]: time="2025-09-13T00:28:52.901293592Z" level=info msg="received exit event container_id:\"b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8\" id:\"b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8\" pid:3493 exited_at:{seconds:1757723332 nanos:900695968}"
Sep 13 00:28:52.901507 containerd[1571]: time="2025-09-13T00:28:52.901432502Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8\" id:\"b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8\" pid:3493 exited_at:{seconds:1757723332 nanos:900695968}"
Sep 13 00:28:52.901651 systemd[1]: cri-containerd-b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8.scope: Consumed 721ms CPU time, 180.5M memory peak, 3.9M read from disk, 171.3M written to disk.
Sep 13 00:28:52.928361 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b0d7b87d1f5acb849ea332c7e4f1652a1655458e94fd5454deaa6ed14c49d0f8-rootfs.mount: Deactivated successfully.
Sep 13 00:28:52.985480 kubelet[2698]: I0913 00:28:52.985120 2698 kubelet_node_status.go:488] "Fast updating node status as it just became ready"
Sep 13 00:28:53.210704 systemd[1]: Created slice kubepods-besteffort-pod257fb627_9843_427c_b592_c2472be0c288.slice - libcontainer container kubepods-besteffort-pod257fb627_9843_427c_b592_c2472be0c288.slice.
Sep 13 00:28:53.220063 systemd[1]: Created slice kubepods-besteffort-pod6df7a665_f2da_401b_9fe7_13fb91ea1673.slice - libcontainer container kubepods-besteffort-pod6df7a665_f2da_401b_9fe7_13fb91ea1673.slice.
Sep 13 00:28:53.228043 systemd[1]: Created slice kubepods-burstable-podc041bc9d_e4f0_4799_959f_fb73ab87bb7d.slice - libcontainer container kubepods-burstable-podc041bc9d_e4f0_4799_959f_fb73ab87bb7d.slice.
Sep 13 00:28:53.237954 systemd[1]: Created slice kubepods-burstable-pod4acb061a_e88e_4499_9ff6_e87ab4883853.slice - libcontainer container kubepods-burstable-pod4acb061a_e88e_4499_9ff6_e87ab4883853.slice.
Sep 13 00:28:53.246882 systemd[1]: Created slice kubepods-besteffort-pod316678ae_ae53_4c3e_b02b_987dc2ed4041.slice - libcontainer container kubepods-besteffort-pod316678ae_ae53_4c3e_b02b_987dc2ed4041.slice.
Sep 13 00:28:53.252990 kubelet[2698]: I0913 00:28:53.252473 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/257fb627-9843-427c-b592-c2472be0c288-tigera-ca-bundle\") pod \"calico-kube-controllers-56848bfb48-n8vcn\" (UID: \"257fb627-9843-427c-b592-c2472be0c288\") " pod="calico-system/calico-kube-controllers-56848bfb48-n8vcn"
Sep 13 00:28:53.252990 kubelet[2698]: I0913 00:28:53.252949 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4acb061a-e88e-4499-9ff6-e87ab4883853-config-volume\") pod \"coredns-7c65d6cfc9-hbctv\" (UID: \"4acb061a-e88e-4499-9ff6-e87ab4883853\") " pod="kube-system/coredns-7c65d6cfc9-hbctv"
Sep 13 00:28:53.252990 kubelet[2698]: I0913 00:28:53.252979 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/316678ae-ae53-4c3e-b02b-987dc2ed4041-goldmane-ca-bundle\") pod \"goldmane-7988f88666-gx8np\" (UID: \"316678ae-ae53-4c3e-b02b-987dc2ed4041\") " pod="calico-system/goldmane-7988f88666-gx8np"
Sep 13 00:28:53.252990 kubelet[2698]: I0913 00:28:53.253003 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6df7a665-f2da-401b-9fe7-13fb91ea1673-calico-apiserver-certs\") pod \"calico-apiserver-c5df58fcc-6mngl\" (UID: \"6df7a665-f2da-401b-9fe7-13fb91ea1673\") " pod="calico-apiserver/calico-apiserver-c5df58fcc-6mngl"
Sep 13 00:28:53.253890 kubelet[2698]: I0913 00:28:53.253025 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8n5j\" (UniqueName: \"kubernetes.io/projected/4acb061a-e88e-4499-9ff6-e87ab4883853-kube-api-access-d8n5j\") pod \"coredns-7c65d6cfc9-hbctv\" (UID: \"4acb061a-e88e-4499-9ff6-e87ab4883853\") " pod="kube-system/coredns-7c65d6cfc9-hbctv"
Sep 13 00:28:53.253890 kubelet[2698]: I0913 00:28:53.253050 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hgpl\" (UniqueName: \"kubernetes.io/projected/6df7a665-f2da-401b-9fe7-13fb91ea1673-kube-api-access-6hgpl\") pod \"calico-apiserver-c5df58fcc-6mngl\" (UID: \"6df7a665-f2da-401b-9fe7-13fb91ea1673\") " pod="calico-apiserver/calico-apiserver-c5df58fcc-6mngl"
Sep 13 00:28:53.253890 kubelet[2698]: I0913 00:28:53.253081 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4vtd\" (UniqueName: \"kubernetes.io/projected/51ca438a-e2b6-4381-8e60-f5629fa0787e-kube-api-access-x4vtd\") pod \"calico-apiserver-c5df58fcc-l7vmt\" (UID: \"51ca438a-e2b6-4381-8e60-f5629fa0787e\") " pod="calico-apiserver/calico-apiserver-c5df58fcc-l7vmt"
Sep 13 00:28:53.253890 kubelet[2698]: I0913 00:28:53.253118 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sshcl\" (UniqueName: \"kubernetes.io/projected/257fb627-9843-427c-b592-c2472be0c288-kube-api-access-sshcl\") pod \"calico-kube-controllers-56848bfb48-n8vcn\" (UID: \"257fb627-9843-427c-b592-c2472be0c288\") " pod="calico-system/calico-kube-controllers-56848bfb48-n8vcn"
Sep 13 00:28:53.253890 kubelet[2698]: I0913 00:28:53.253144 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/51ca438a-e2b6-4381-8e60-f5629fa0787e-calico-apiserver-certs\") pod \"calico-apiserver-c5df58fcc-l7vmt\" (UID: \"51ca438a-e2b6-4381-8e60-f5629fa0787e\") " pod="calico-apiserver/calico-apiserver-c5df58fcc-l7vmt"
Sep 13 00:28:53.254045 kubelet[2698]: I0913 00:28:53.253164 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/316678ae-ae53-4c3e-b02b-987dc2ed4041-goldmane-key-pair\") pod \"goldmane-7988f88666-gx8np\" (UID: \"316678ae-ae53-4c3e-b02b-987dc2ed4041\") " pod="calico-system/goldmane-7988f88666-gx8np"
Sep 13 00:28:53.254045 kubelet[2698]: I0913 00:28:53.253182 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebbca511-d600-4da7-8c41-e873d3aeebe8-whisker-ca-bundle\") pod \"whisker-56d74c46fd-2xkjd\" (UID: \"ebbca511-d600-4da7-8c41-e873d3aeebe8\") " pod="calico-system/whisker-56d74c46fd-2xkjd"
Sep 13 00:28:53.254045 kubelet[2698]: I0913 00:28:53.253227 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c041bc9d-e4f0-4799-959f-fb73ab87bb7d-config-volume\") pod \"coredns-7c65d6cfc9-gz9hh\" (UID: \"c041bc9d-e4f0-4799-959f-fb73ab87bb7d\") " pod="kube-system/coredns-7c65d6cfc9-gz9hh"
Sep 13 00:28:53.254045 kubelet[2698]: I0913 00:28:53.253250 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mmtp\" (UniqueName: \"kubernetes.io/projected/c041bc9d-e4f0-4799-959f-fb73ab87bb7d-kube-api-access-2mmtp\") pod \"coredns-7c65d6cfc9-gz9hh\" (UID: \"c041bc9d-e4f0-4799-959f-fb73ab87bb7d\") " pod="kube-system/coredns-7c65d6cfc9-gz9hh"
Sep 13 00:28:53.254045 kubelet[2698]: I0913 00:28:53.253375 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ebbca511-d600-4da7-8c41-e873d3aeebe8-whisker-backend-key-pair\") pod \"whisker-56d74c46fd-2xkjd\" (UID: \"ebbca511-d600-4da7-8c41-e873d3aeebe8\") " pod="calico-system/whisker-56d74c46fd-2xkjd"
Sep 13 00:28:53.254183 kubelet[2698]: I0913 00:28:53.253399 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9sqg\" (UniqueName: \"kubernetes.io/projected/ebbca511-d600-4da7-8c41-e873d3aeebe8-kube-api-access-s9sqg\") pod \"whisker-56d74c46fd-2xkjd\" (UID: \"ebbca511-d600-4da7-8c41-e873d3aeebe8\") " pod="calico-system/whisker-56d74c46fd-2xkjd"
Sep 13 00:28:53.254183 kubelet[2698]: I0913 00:28:53.253643 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/316678ae-ae53-4c3e-b02b-987dc2ed4041-config\") pod \"goldmane-7988f88666-gx8np\" (UID: \"316678ae-ae53-4c3e-b02b-987dc2ed4041\") " pod="calico-system/goldmane-7988f88666-gx8np"
Sep 13 00:28:53.254183 kubelet[2698]: I0913 00:28:53.253677 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnvcq\" (UniqueName: \"kubernetes.io/projected/316678ae-ae53-4c3e-b02b-987dc2ed4041-kube-api-access-tnvcq\") pod \"goldmane-7988f88666-gx8np\" (UID: \"316678ae-ae53-4c3e-b02b-987dc2ed4041\") " pod="calico-system/goldmane-7988f88666-gx8np"
Sep 13 00:28:53.255150 systemd[1]: Created slice kubepods-besteffort-podebbca511_d600_4da7_8c41_e873d3aeebe8.slice - libcontainer container kubepods-besteffort-podebbca511_d600_4da7_8c41_e873d3aeebe8.slice.
Sep 13 00:28:53.261782 systemd[1]: Created slice kubepods-besteffort-pod51ca438a_e2b6_4381_8e60_f5629fa0787e.slice - libcontainer container kubepods-besteffort-pod51ca438a_e2b6_4381_8e60_f5629fa0787e.slice.
Sep 13 00:28:53.516611 containerd[1571]: time="2025-09-13T00:28:53.516450868Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56848bfb48-n8vcn,Uid:257fb627-9843-427c-b592-c2472be0c288,Namespace:calico-system,Attempt:0,}"
Sep 13 00:28:53.525584 containerd[1571]: time="2025-09-13T00:28:53.525542260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5df58fcc-6mngl,Uid:6df7a665-f2da-401b-9fe7-13fb91ea1673,Namespace:calico-apiserver,Attempt:0,}"
Sep 13 00:28:53.533476 containerd[1571]: time="2025-09-13T00:28:53.533387438Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-gz9hh,Uid:c041bc9d-e4f0-4799-959f-fb73ab87bb7d,Namespace:kube-system,Attempt:0,}"
Sep 13 00:28:53.544467 containerd[1571]: time="2025-09-13T00:28:53.544429096Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hbctv,Uid:4acb061a-e88e-4499-9ff6-e87ab4883853,Namespace:kube-system,Attempt:0,}"
Sep 13 00:28:53.551201 containerd[1571]: time="2025-09-13T00:28:53.551158256Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-gx8np,Uid:316678ae-ae53-4c3e-b02b-987dc2ed4041,Namespace:calico-system,Attempt:0,}"
Sep 13 00:28:53.562364 containerd[1571]: time="2025-09-13T00:28:53.562029805Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-56d74c46fd-2xkjd,Uid:ebbca511-d600-4da7-8c41-e873d3aeebe8,Namespace:calico-system,Attempt:0,}"
Sep 13 00:28:53.566310 containerd[1571]: time="2025-09-13T00:28:53.566268072Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5df58fcc-l7vmt,Uid:51ca438a-e2b6-4381-8e60-f5629fa0787e,Namespace:calico-apiserver,Attempt:0,}"
Sep 13 00:28:53.637423 containerd[1571]: time="2025-09-13T00:28:53.637328343Z" level=error msg="Failed to destroy network for sandbox \"f46840c23e55aa11821c522de0d9c4d61f20bc484c9721c897d57e4023ad0b92\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.653692 containerd[1571]: time="2025-09-13T00:28:53.653637904Z" level=error msg="Failed to destroy network for sandbox \"63558eb1483c922d957b062ffee9a99891fafb11210dbd4297907b8af83940d8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.681567 containerd[1571]: time="2025-09-13T00:28:53.681489503Z" level=error msg="Failed to destroy network for sandbox \"c78f5d9e536b3bcd8bd602649d5455aea31b7adaf18e3ca3da8798748ae4520d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.700743 containerd[1571]: time="2025-09-13T00:28:53.700674280Z" level=error msg="Failed to destroy network for sandbox \"ca91724d42e6e7ffd17a6324b13043cf8ab82078f0bc45e0e1dfc43052793db8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.724530 containerd[1571]: time="2025-09-13T00:28:53.716194217Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56848bfb48-n8vcn,Uid:257fb627-9843-427c-b592-c2472be0c288,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f46840c23e55aa11821c522de0d9c4d61f20bc484c9721c897d57e4023ad0b92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.725188 containerd[1571]: time="2025-09-13T00:28:53.716217971Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5df58fcc-l7vmt,Uid:51ca438a-e2b6-4381-8e60-f5629fa0787e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca91724d42e6e7ffd17a6324b13043cf8ab82078f0bc45e0e1dfc43052793db8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.725188 containerd[1571]: time="2025-09-13T00:28:53.716454847Z" level=error msg="Failed to destroy network for sandbox \"2c3aa981989d664a70a10e9c5c9e20ea9caa0653df5b8b7dff09bcf74933820e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.726456 containerd[1571]: time="2025-09-13T00:28:53.716237578Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-gx8np,Uid:316678ae-ae53-4c3e-b02b-987dc2ed4041,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c78f5d9e536b3bcd8bd602649d5455aea31b7adaf18e3ca3da8798748ae4520d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.726597 containerd[1571]: time="2025-09-13T00:28:53.716267805Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5df58fcc-6mngl,Uid:6df7a665-f2da-401b-9fe7-13fb91ea1673,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"63558eb1483c922d957b062ffee9a99891fafb11210dbd4297907b8af83940d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.726597 containerd[1571]: time="2025-09-13T00:28:53.716327126Z" level=error msg="Failed to destroy network for sandbox \"c81e2566b01c6a5141bc2962a57c43495b6d968fd36863524d95549e3396e57f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.727527 containerd[1571]: time="2025-09-13T00:28:53.727489261Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-56d74c46fd-2xkjd,Uid:ebbca511-d600-4da7-8c41-e873d3aeebe8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c3aa981989d664a70a10e9c5c9e20ea9caa0653df5b8b7dff09bcf74933820e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.728688 containerd[1571]: time="2025-09-13T00:28:53.728648470Z" level=error msg="Failed to destroy network for sandbox \"7099a7cf0e864c86ff5bb69971cb19e51debdfcf57ed5445cd1f1d5f7bcea166\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.729583 containerd[1571]: time="2025-09-13T00:28:53.729493068Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-gz9hh,Uid:c041bc9d-e4f0-4799-959f-fb73ab87bb7d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c81e2566b01c6a5141bc2962a57c43495b6d968fd36863524d95549e3396e57f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.730590 containerd[1571]: time="2025-09-13T00:28:53.730532182Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hbctv,Uid:4acb061a-e88e-4499-9ff6-e87ab4883853,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7099a7cf0e864c86ff5bb69971cb19e51debdfcf57ed5445cd1f1d5f7bcea166\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.743363 kubelet[2698]: E0913 00:28:53.743260 2698 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca91724d42e6e7ffd17a6324b13043cf8ab82078f0bc45e0e1dfc43052793db8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.743363 kubelet[2698]: E0913 00:28:53.743319 2698 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63558eb1483c922d957b062ffee9a99891fafb11210dbd4297907b8af83940d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.743363 kubelet[2698]: E0913 00:28:53.743241 2698 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c78f5d9e536b3bcd8bd602649d5455aea31b7adaf18e3ca3da8798748ae4520d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.743869 kubelet[2698]: E0913 00:28:53.743364 2698 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c3aa981989d664a70a10e9c5c9e20ea9caa0653df5b8b7dff09bcf74933820e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.743869 kubelet[2698]: E0913 00:28:53.743375 2698 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c81e2566b01c6a5141bc2962a57c43495b6d968fd36863524d95549e3396e57f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.743869 kubelet[2698]: E0913 00:28:53.743237 2698 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f46840c23e55aa11821c522de0d9c4d61f20bc484c9721c897d57e4023ad0b92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.743869 kubelet[2698]: E0913 00:28:53.743545 2698 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7099a7cf0e864c86ff5bb69971cb19e51debdfcf57ed5445cd1f1d5f7bcea166\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 13 00:28:53.744306 kubelet[2698]: E0913 00:28:53.744048 2698 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca91724d42e6e7ffd17a6324b13043cf8ab82078f0bc45e0e1dfc43052793db8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c5df58fcc-l7vmt"
Sep 13 00:28:53.744306 kubelet[2698]: E0913 00:28:53.744060 2698 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c3aa981989d664a70a10e9c5c9e20ea9caa0653df5b8b7dff09bcf74933820e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-56d74c46fd-2xkjd"
Sep 13 00:28:53.744306 kubelet[2698]: E0913 00:28:53.744077 2698 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca91724d42e6e7ffd17a6324b13043cf8ab82078f0bc45e0e1dfc43052793db8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c5df58fcc-l7vmt"
Sep 13 00:28:53.744306 kubelet[2698]: E0913 00:28:53.744085 2698 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c3aa981989d664a70a10e9c5c9e20ea9caa0653df5b8b7dff09bcf74933820e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-56d74c46fd-2xkjd"
Sep 13 00:28:53.744483 kubelet[2698]: E0913 00:28:53.744100 2698 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63558eb1483c922d957b062ffee9a99891fafb11210dbd4297907b8af83940d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c5df58fcc-6mngl"
Sep 13 00:28:53.744483 kubelet[2698]: E0913 00:28:53.744129 2698 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c78f5d9e536b3bcd8bd602649d5455aea31b7adaf18e3ca3da8798748ae4520d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-gx8np"
Sep 13 00:28:53.744483 kubelet[2698]: E0913 00:28:53.744139 2698 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63558eb1483c922d957b062ffee9a99891fafb11210dbd4297907b8af83940d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c5df58fcc-6mngl"
Sep 13 00:28:53.744483 kubelet[2698]: E0913 00:28:53.744152 2698 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c78f5d9e536b3bcd8bd602649d5455aea31b7adaf18e3ca3da8798748ae4520d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-gx8np"
Sep 13 00:28:53.744594 kubelet[2698]: E0913 00:28:53.744162 2698 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f46840c23e55aa11821c522de0d9c4d61f20bc484c9721c897d57e4023ad0b92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-56848bfb48-n8vcn"
Sep 13 00:28:53.744594 kubelet[2698]: E0913 00:28:53.744174 2698 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7099a7cf0e864c86ff5bb69971cb19e51debdfcf57ed5445cd1f1d5f7bcea166\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-hbctv"
Sep 13 00:28:53.744594 kubelet[2698]: E0913 00:28:53.744181 2698 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f46840c23e55aa11821c522de0d9c4d61f20bc484c9721c897d57e4023ad0b92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-56848bfb48-n8vcn"
Sep 13 00:28:53.744682 kubelet[2698]: E0913 00:28:53.744179 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-c5df58fcc-6mngl_calico-apiserver(6df7a665-f2da-401b-9fe7-13fb91ea1673)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-c5df58fcc-6mngl_calico-apiserver(6df7a665-f2da-401b-9fe7-13fb91ea1673)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"63558eb1483c922d957b062ffee9a99891fafb11210dbd4297907b8af83940d8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-c5df58fcc-6mngl"
podUID="6df7a665-f2da-401b-9fe7-13fb91ea1673" Sep 13 00:28:53.744682 kubelet[2698]: E0913 00:28:53.744192 2698 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7099a7cf0e864c86ff5bb69971cb19e51debdfcf57ed5445cd1f1d5f7bcea166\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-hbctv" Sep 13 00:28:53.744682 kubelet[2698]: E0913 00:28:53.744101 2698 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c81e2566b01c6a5141bc2962a57c43495b6d968fd36863524d95549e3396e57f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-gz9hh" Sep 13 00:28:53.744806 kubelet[2698]: E0913 00:28:53.744225 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-hbctv_kube-system(4acb061a-e88e-4499-9ff6-e87ab4883853)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-hbctv_kube-system(4acb061a-e88e-4499-9ff6-e87ab4883853)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7099a7cf0e864c86ff5bb69971cb19e51debdfcf57ed5445cd1f1d5f7bcea166\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-hbctv" podUID="4acb061a-e88e-4499-9ff6-e87ab4883853" Sep 13 00:28:53.744806 kubelet[2698]: E0913 00:28:53.744228 2698 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed 
to setup network for sandbox \"c81e2566b01c6a5141bc2962a57c43495b6d968fd36863524d95549e3396e57f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-gz9hh" Sep 13 00:28:53.744806 kubelet[2698]: E0913 00:28:53.744129 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-c5df58fcc-l7vmt_calico-apiserver(51ca438a-e2b6-4381-8e60-f5629fa0787e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-c5df58fcc-l7vmt_calico-apiserver(51ca438a-e2b6-4381-8e60-f5629fa0787e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ca91724d42e6e7ffd17a6324b13043cf8ab82078f0bc45e0e1dfc43052793db8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-c5df58fcc-l7vmt" podUID="51ca438a-e2b6-4381-8e60-f5629fa0787e" Sep 13 00:28:53.744919 kubelet[2698]: E0913 00:28:53.744129 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-56d74c46fd-2xkjd_calico-system(ebbca511-d600-4da7-8c41-e873d3aeebe8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-56d74c46fd-2xkjd_calico-system(ebbca511-d600-4da7-8c41-e873d3aeebe8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2c3aa981989d664a70a10e9c5c9e20ea9caa0653df5b8b7dff09bcf74933820e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-56d74c46fd-2xkjd" podUID="ebbca511-d600-4da7-8c41-e873d3aeebe8" Sep 13 00:28:53.744919 kubelet[2698]: 
E0913 00:28:53.744213 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-gx8np_calico-system(316678ae-ae53-4c3e-b02b-987dc2ed4041)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-gx8np_calico-system(316678ae-ae53-4c3e-b02b-987dc2ed4041)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c78f5d9e536b3bcd8bd602649d5455aea31b7adaf18e3ca3da8798748ae4520d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-gx8np" podUID="316678ae-ae53-4c3e-b02b-987dc2ed4041" Sep 13 00:28:53.745011 kubelet[2698]: E0913 00:28:53.744208 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-56848bfb48-n8vcn_calico-system(257fb627-9843-427c-b592-c2472be0c288)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-56848bfb48-n8vcn_calico-system(257fb627-9843-427c-b592-c2472be0c288)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f46840c23e55aa11821c522de0d9c4d61f20bc484c9721c897d57e4023ad0b92\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-56848bfb48-n8vcn" podUID="257fb627-9843-427c-b592-c2472be0c288" Sep 13 00:28:53.745011 kubelet[2698]: E0913 00:28:53.744282 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-gz9hh_kube-system(c041bc9d-e4f0-4799-959f-fb73ab87bb7d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-7c65d6cfc9-gz9hh_kube-system(c041bc9d-e4f0-4799-959f-fb73ab87bb7d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c81e2566b01c6a5141bc2962a57c43495b6d968fd36863524d95549e3396e57f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-gz9hh" podUID="c041bc9d-e4f0-4799-959f-fb73ab87bb7d" Sep 13 00:28:54.177948 containerd[1571]: time="2025-09-13T00:28:54.177877736Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 13 00:28:54.628472 systemd[1]: Created slice kubepods-besteffort-pod8759312a_5948_41bc_8173_0a88fa7fe6fe.slice - libcontainer container kubepods-besteffort-pod8759312a_5948_41bc_8173_0a88fa7fe6fe.slice. Sep 13 00:28:54.633103 containerd[1571]: time="2025-09-13T00:28:54.633029089Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xwvxw,Uid:8759312a-5948-41bc-8173-0a88fa7fe6fe,Namespace:calico-system,Attempt:0,}" Sep 13 00:28:54.712174 containerd[1571]: time="2025-09-13T00:28:54.712113934Z" level=error msg="Failed to destroy network for sandbox \"984d91a8550ca5154e4c094604df07dd114ab5ae4000c4f4cd3937e60edc8026\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:28:54.713946 containerd[1571]: time="2025-09-13T00:28:54.713838586Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xwvxw,Uid:8759312a-5948-41bc-8173-0a88fa7fe6fe,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"984d91a8550ca5154e4c094604df07dd114ab5ae4000c4f4cd3937e60edc8026\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:28:54.714207 kubelet[2698]: E0913 00:28:54.714144 2698 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"984d91a8550ca5154e4c094604df07dd114ab5ae4000c4f4cd3937e60edc8026\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:28:54.714309 kubelet[2698]: E0913 00:28:54.714217 2698 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"984d91a8550ca5154e4c094604df07dd114ab5ae4000c4f4cd3937e60edc8026\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xwvxw" Sep 13 00:28:54.714309 kubelet[2698]: E0913 00:28:54.714250 2698 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"984d91a8550ca5154e4c094604df07dd114ab5ae4000c4f4cd3937e60edc8026\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xwvxw" Sep 13 00:28:54.714511 kubelet[2698]: E0913 00:28:54.714296 2698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-xwvxw_calico-system(8759312a-5948-41bc-8173-0a88fa7fe6fe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-xwvxw_calico-system(8759312a-5948-41bc-8173-0a88fa7fe6fe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"984d91a8550ca5154e4c094604df07dd114ab5ae4000c4f4cd3937e60edc8026\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-xwvxw" podUID="8759312a-5948-41bc-8173-0a88fa7fe6fe" Sep 13 00:28:54.715216 systemd[1]: run-netns-cni\x2d729651c7\x2dbdac\x2d58b9\x2dbe56\x2d3b0d0cee281e.mount: Deactivated successfully. Sep 13 00:29:00.401579 kubelet[2698]: I0913 00:29:00.400238 2698 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 13 00:29:00.869520 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4114858226.mount: Deactivated successfully. Sep 13 00:29:01.857121 containerd[1571]: time="2025-09-13T00:29:01.857024894Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:01.859828 containerd[1571]: time="2025-09-13T00:29:01.859763939Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 13 00:29:01.861013 containerd[1571]: time="2025-09-13T00:29:01.860896537Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:01.863236 containerd[1571]: time="2025-09-13T00:29:01.863191407Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:01.863843 containerd[1571]: time="2025-09-13T00:29:01.863779822Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 7.685830462s" Sep 13 00:29:01.863843 containerd[1571]: time="2025-09-13T00:29:01.863832321Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 13 00:29:01.880154 containerd[1571]: time="2025-09-13T00:29:01.880084306Z" level=info msg="CreateContainer within sandbox \"3b3edfc04d44b1d1479e536192778f93dfc75b0714dec0b2d65df308a492d165\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 13 00:29:02.142911 containerd[1571]: time="2025-09-13T00:29:02.142713947Z" level=info msg="Container e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:29:02.167547 containerd[1571]: time="2025-09-13T00:29:02.167436218Z" level=info msg="CreateContainer within sandbox \"3b3edfc04d44b1d1479e536192778f93dfc75b0714dec0b2d65df308a492d165\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66\"" Sep 13 00:29:02.169417 containerd[1571]: time="2025-09-13T00:29:02.168228306Z" level=info msg="StartContainer for \"e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66\"" Sep 13 00:29:02.169859 containerd[1571]: time="2025-09-13T00:29:02.169832848Z" level=info msg="connecting to shim e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66" address="unix:///run/containerd/s/4327f4a70424b9273dffc5ad31f2b3ec86e2b124ecbd383919075ca1a106986c" protocol=ttrpc version=3 Sep 13 00:29:02.217557 systemd[1]: Started cri-containerd-e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66.scope - libcontainer container e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66. 
Sep 13 00:29:02.278503 containerd[1571]: time="2025-09-13T00:29:02.278456233Z" level=info msg="StartContainer for \"e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66\" returns successfully" Sep 13 00:29:02.367095 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 13 00:29:02.367804 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 13 00:29:02.622260 kubelet[2698]: I0913 00:29:02.621491 2698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9sqg\" (UniqueName: \"kubernetes.io/projected/ebbca511-d600-4da7-8c41-e873d3aeebe8-kube-api-access-s9sqg\") pod \"ebbca511-d600-4da7-8c41-e873d3aeebe8\" (UID: \"ebbca511-d600-4da7-8c41-e873d3aeebe8\") " Sep 13 00:29:02.622260 kubelet[2698]: I0913 00:29:02.621542 2698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ebbca511-d600-4da7-8c41-e873d3aeebe8-whisker-backend-key-pair\") pod \"ebbca511-d600-4da7-8c41-e873d3aeebe8\" (UID: \"ebbca511-d600-4da7-8c41-e873d3aeebe8\") " Sep 13 00:29:02.622260 kubelet[2698]: I0913 00:29:02.621582 2698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebbca511-d600-4da7-8c41-e873d3aeebe8-whisker-ca-bundle\") pod \"ebbca511-d600-4da7-8c41-e873d3aeebe8\" (UID: \"ebbca511-d600-4da7-8c41-e873d3aeebe8\") " Sep 13 00:29:02.622260 kubelet[2698]: I0913 00:29:02.622153 2698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebbca511-d600-4da7-8c41-e873d3aeebe8-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "ebbca511-d600-4da7-8c41-e873d3aeebe8" (UID: "ebbca511-d600-4da7-8c41-e873d3aeebe8"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 13 00:29:02.626221 kubelet[2698]: I0913 00:29:02.626184 2698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebbca511-d600-4da7-8c41-e873d3aeebe8-kube-api-access-s9sqg" (OuterVolumeSpecName: "kube-api-access-s9sqg") pod "ebbca511-d600-4da7-8c41-e873d3aeebe8" (UID: "ebbca511-d600-4da7-8c41-e873d3aeebe8"). InnerVolumeSpecName "kube-api-access-s9sqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 13 00:29:02.629024 kubelet[2698]: I0913 00:29:02.628975 2698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbca511-d600-4da7-8c41-e873d3aeebe8-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "ebbca511-d600-4da7-8c41-e873d3aeebe8" (UID: "ebbca511-d600-4da7-8c41-e873d3aeebe8"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 13 00:29:02.722457 kubelet[2698]: I0913 00:29:02.722390 2698 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebbca511-d600-4da7-8c41-e873d3aeebe8-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 13 00:29:02.722457 kubelet[2698]: I0913 00:29:02.722432 2698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9sqg\" (UniqueName: \"kubernetes.io/projected/ebbca511-d600-4da7-8c41-e873d3aeebe8-kube-api-access-s9sqg\") on node \"localhost\" DevicePath \"\"" Sep 13 00:29:02.722457 kubelet[2698]: I0913 00:29:02.722442 2698 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ebbca511-d600-4da7-8c41-e873d3aeebe8-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 13 00:29:02.871739 systemd[1]: 
var-lib-kubelet-pods-ebbca511\x2dd600\x2d4da7\x2d8c41\x2de873d3aeebe8-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ds9sqg.mount: Deactivated successfully. Sep 13 00:29:02.871868 systemd[1]: var-lib-kubelet-pods-ebbca511\x2dd600\x2d4da7\x2d8c41\x2de873d3aeebe8-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 13 00:29:03.216841 systemd[1]: Removed slice kubepods-besteffort-podebbca511_d600_4da7_8c41_e873d3aeebe8.slice - libcontainer container kubepods-besteffort-podebbca511_d600_4da7_8c41_e873d3aeebe8.slice. Sep 13 00:29:03.236681 kubelet[2698]: I0913 00:29:03.236614 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-fhjbl" podStartSLOduration=2.142408702 podStartE2EDuration="23.236592913s" podCreationTimestamp="2025-09-13 00:28:40 +0000 UTC" firstStartedPulling="2025-09-13 00:28:40.770497114 +0000 UTC m=+18.245527357" lastFinishedPulling="2025-09-13 00:29:01.864681326 +0000 UTC m=+39.339711568" observedRunningTime="2025-09-13 00:29:03.235057629 +0000 UTC m=+40.710087871" watchObservedRunningTime="2025-09-13 00:29:03.236592913 +0000 UTC m=+40.711623155" Sep 13 00:29:03.332958 systemd[1]: Created slice kubepods-besteffort-podaf76ecae_d4d9_4bf5_b157_643b893bc664.slice - libcontainer container kubepods-besteffort-podaf76ecae_d4d9_4bf5_b157_643b893bc664.slice. 
Sep 13 00:29:03.427751 kubelet[2698]: I0913 00:29:03.427557 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af76ecae-d4d9-4bf5-b157-643b893bc664-whisker-ca-bundle\") pod \"whisker-647bb8d697-72k8l\" (UID: \"af76ecae-d4d9-4bf5-b157-643b893bc664\") " pod="calico-system/whisker-647bb8d697-72k8l" Sep 13 00:29:03.427751 kubelet[2698]: I0913 00:29:03.427630 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh4fc\" (UniqueName: \"kubernetes.io/projected/af76ecae-d4d9-4bf5-b157-643b893bc664-kube-api-access-vh4fc\") pod \"whisker-647bb8d697-72k8l\" (UID: \"af76ecae-d4d9-4bf5-b157-643b893bc664\") " pod="calico-system/whisker-647bb8d697-72k8l" Sep 13 00:29:03.427751 kubelet[2698]: I0913 00:29:03.427650 2698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/af76ecae-d4d9-4bf5-b157-643b893bc664-whisker-backend-key-pair\") pod \"whisker-647bb8d697-72k8l\" (UID: \"af76ecae-d4d9-4bf5-b157-643b893bc664\") " pod="calico-system/whisker-647bb8d697-72k8l" Sep 13 00:29:03.458089 containerd[1571]: time="2025-09-13T00:29:03.455964638Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66\" id:\"0931de36efff374109cae7516467356e223a4523131bb9436ee110366044da82\" pid:3884 exit_status:1 exited_at:{seconds:1757723343 nanos:455353391}" Sep 13 00:29:03.642050 containerd[1571]: time="2025-09-13T00:29:03.640219598Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-647bb8d697-72k8l,Uid:af76ecae-d4d9-4bf5-b157-643b893bc664,Namespace:calico-system,Attempt:0,}" Sep 13 00:29:04.585047 containerd[1571]: time="2025-09-13T00:29:04.584443962Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66\" id:\"7de659a206b3201b7f3cd38effe133b547bd050ce4ec5a399f16273d965f811a\" pid:4025 exit_status:1 exited_at:{seconds:1757723344 nanos:583839888}" Sep 13 00:29:04.620860 containerd[1571]: time="2025-09-13T00:29:04.620719701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-gx8np,Uid:316678ae-ae53-4c3e-b02b-987dc2ed4041,Namespace:calico-system,Attempt:0,}" Sep 13 00:29:05.921203 kubelet[2698]: I0913 00:29:05.919251 2698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebbca511-d600-4da7-8c41-e873d3aeebe8" path="/var/lib/kubelet/pods/ebbca511-d600-4da7-8c41-e873d3aeebe8/volumes" Sep 13 00:29:05.921985 kubelet[2698]: E0913 00:29:05.921425 2698 kubelet.go:2512] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.303s" Sep 13 00:29:05.924108 containerd[1571]: time="2025-09-13T00:29:05.923733189Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5df58fcc-6mngl,Uid:6df7a665-f2da-401b-9fe7-13fb91ea1673,Namespace:calico-apiserver,Attempt:0,}" Sep 13 00:29:05.924108 containerd[1571]: time="2025-09-13T00:29:05.923921132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5df58fcc-l7vmt,Uid:51ca438a-e2b6-4381-8e60-f5629fa0787e,Namespace:calico-apiserver,Attempt:0,}" Sep 13 00:29:05.926499 containerd[1571]: time="2025-09-13T00:29:05.925867436Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-gz9hh,Uid:c041bc9d-e4f0-4799-959f-fb73ab87bb7d,Namespace:kube-system,Attempt:0,}" Sep 13 00:29:05.926499 containerd[1571]: time="2025-09-13T00:29:05.926081919Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hbctv,Uid:4acb061a-e88e-4499-9ff6-e87ab4883853,Namespace:kube-system,Attempt:0,}" Sep 13 00:29:05.961030 systemd-networkd[1487]: cali112c41821bc: Link UP Sep 13 00:29:05.967912 
systemd-networkd[1487]: cali112c41821bc: Gained carrier Sep 13 00:29:06.038505 systemd-networkd[1487]: vxlan.calico: Link UP Sep 13 00:29:06.038518 systemd-networkd[1487]: vxlan.calico: Gained carrier Sep 13 00:29:06.106619 containerd[1571]: 2025-09-13 00:29:03.773 [INFO][3898] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 13 00:29:06.106619 containerd[1571]: 2025-09-13 00:29:03.987 [INFO][3898] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--647bb8d697--72k8l-eth0 whisker-647bb8d697- calico-system af76ecae-d4d9-4bf5-b157-643b893bc664 908 0 2025-09-13 00:29:03 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:647bb8d697 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-647bb8d697-72k8l eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali112c41821bc [] [] }} ContainerID="d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" Namespace="calico-system" Pod="whisker-647bb8d697-72k8l" WorkloadEndpoint="localhost-k8s-whisker--647bb8d697--72k8l-" Sep 13 00:29:06.106619 containerd[1571]: 2025-09-13 00:29:03.987 [INFO][3898] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" Namespace="calico-system" Pod="whisker-647bb8d697-72k8l" WorkloadEndpoint="localhost-k8s-whisker--647bb8d697--72k8l-eth0" Sep 13 00:29:06.106619 containerd[1571]: 2025-09-13 00:29:04.565 [INFO][4002] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" HandleID="k8s-pod-network.d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" Workload="localhost-k8s-whisker--647bb8d697--72k8l-eth0" Sep 13 00:29:06.106872 containerd[1571]: 2025-09-13 00:29:04.566 
[INFO][4002] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" HandleID="k8s-pod-network.d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" Workload="localhost-k8s-whisker--647bb8d697--72k8l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004c6fc0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-647bb8d697-72k8l", "timestamp":"2025-09-13 00:29:04.565532971 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 00:29:06.106872 containerd[1571]: 2025-09-13 00:29:04.566 [INFO][4002] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:29:06.106872 containerd[1571]: 2025-09-13 00:29:04.567 [INFO][4002] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 00:29:06.106872 containerd[1571]: 2025-09-13 00:29:04.567 [INFO][4002] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 13 00:29:06.106872 containerd[1571]: 2025-09-13 00:29:04.583 [INFO][4002] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" host="localhost" Sep 13 00:29:06.106872 containerd[1571]: 2025-09-13 00:29:04.603 [INFO][4002] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 13 00:29:06.106872 containerd[1571]: 2025-09-13 00:29:04.737 [INFO][4002] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 13 00:29:06.106872 containerd[1571]: 2025-09-13 00:29:04.743 [INFO][4002] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:06.106872 containerd[1571]: 2025-09-13 00:29:04.747 [INFO][4002] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:06.106872 containerd[1571]: 2025-09-13 00:29:04.748 [INFO][4002] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" host="localhost" Sep 13 00:29:06.107148 containerd[1571]: 2025-09-13 00:29:04.750 [INFO][4002] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21 Sep 13 00:29:06.107148 containerd[1571]: 2025-09-13 00:29:04.935 [INFO][4002] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" host="localhost" Sep 13 00:29:06.107148 containerd[1571]: 2025-09-13 00:29:04.964 [INFO][4002] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" host="localhost" Sep 13 00:29:06.107148 containerd[1571]: 2025-09-13 00:29:04.965 [INFO][4002] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" host="localhost" Sep 13 00:29:06.107148 containerd[1571]: 2025-09-13 00:29:04.965 [INFO][4002] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:29:06.107148 containerd[1571]: 2025-09-13 00:29:04.965 [INFO][4002] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" HandleID="k8s-pod-network.d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" Workload="localhost-k8s-whisker--647bb8d697--72k8l-eth0" Sep 13 00:29:06.110022 containerd[1571]: 2025-09-13 00:29:04.983 [INFO][3898] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" Namespace="calico-system" Pod="whisker-647bb8d697-72k8l" WorkloadEndpoint="localhost-k8s-whisker--647bb8d697--72k8l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--647bb8d697--72k8l-eth0", GenerateName:"whisker-647bb8d697-", Namespace:"calico-system", SelfLink:"", UID:"af76ecae-d4d9-4bf5-b157-643b893bc664", ResourceVersion:"908", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 29, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"647bb8d697", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-647bb8d697-72k8l", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali112c41821bc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:06.110022 containerd[1571]: 2025-09-13 00:29:04.983 [INFO][3898] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" Namespace="calico-system" Pod="whisker-647bb8d697-72k8l" WorkloadEndpoint="localhost-k8s-whisker--647bb8d697--72k8l-eth0" Sep 13 00:29:06.110127 containerd[1571]: 2025-09-13 00:29:04.983 [INFO][3898] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali112c41821bc ContainerID="d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" Namespace="calico-system" Pod="whisker-647bb8d697-72k8l" WorkloadEndpoint="localhost-k8s-whisker--647bb8d697--72k8l-eth0" Sep 13 00:29:06.110127 containerd[1571]: 2025-09-13 00:29:05.966 [INFO][3898] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" Namespace="calico-system" Pod="whisker-647bb8d697-72k8l" WorkloadEndpoint="localhost-k8s-whisker--647bb8d697--72k8l-eth0" Sep 13 00:29:06.110178 containerd[1571]: 2025-09-13 00:29:05.969 [INFO][3898] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" Namespace="calico-system" Pod="whisker-647bb8d697-72k8l" 
WorkloadEndpoint="localhost-k8s-whisker--647bb8d697--72k8l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--647bb8d697--72k8l-eth0", GenerateName:"whisker-647bb8d697-", Namespace:"calico-system", SelfLink:"", UID:"af76ecae-d4d9-4bf5-b157-643b893bc664", ResourceVersion:"908", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 29, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"647bb8d697", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21", Pod:"whisker-647bb8d697-72k8l", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali112c41821bc", MAC:"a6:cc:0e:7e:30:23", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:06.110240 containerd[1571]: 2025-09-13 00:29:06.101 [INFO][3898] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" Namespace="calico-system" Pod="whisker-647bb8d697-72k8l" WorkloadEndpoint="localhost-k8s-whisker--647bb8d697--72k8l-eth0" Sep 13 00:29:06.630374 containerd[1571]: time="2025-09-13T00:29:06.630294050Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-56848bfb48-n8vcn,Uid:257fb627-9843-427c-b592-c2472be0c288,Namespace:calico-system,Attempt:0,}" Sep 13 00:29:07.118478 systemd-networkd[1487]: cali3efe442a2d5: Link UP Sep 13 00:29:07.122452 systemd-networkd[1487]: cali3efe442a2d5: Gained carrier Sep 13 00:29:07.174315 containerd[1571]: 2025-09-13 00:29:06.618 [INFO][4122] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0 coredns-7c65d6cfc9- kube-system 4acb061a-e88e-4499-9ff6-e87ab4883853 832 0 2025-09-13 00:28:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-hbctv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3efe442a2d5 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hbctv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hbctv-" Sep 13 00:29:07.174315 containerd[1571]: 2025-09-13 00:29:06.618 [INFO][4122] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hbctv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0" Sep 13 00:29:07.174315 containerd[1571]: 2025-09-13 00:29:06.902 [INFO][4195] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" HandleID="k8s-pod-network.57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" Workload="localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0" Sep 13 00:29:07.175381 containerd[1571]: 2025-09-13 00:29:06.902 [INFO][4195] ipam/ipam_plugin.go 
265: Auto assigning IP ContainerID="57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" HandleID="k8s-pod-network.57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" Workload="localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004ecc0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-hbctv", "timestamp":"2025-09-13 00:29:06.901907818 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 00:29:07.175381 containerd[1571]: 2025-09-13 00:29:06.902 [INFO][4195] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:29:07.175381 containerd[1571]: 2025-09-13 00:29:06.906 [INFO][4195] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:29:07.175381 containerd[1571]: 2025-09-13 00:29:06.906 [INFO][4195] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 13 00:29:07.175381 containerd[1571]: 2025-09-13 00:29:06.937 [INFO][4195] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" host="localhost" Sep 13 00:29:07.175381 containerd[1571]: 2025-09-13 00:29:06.955 [INFO][4195] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 13 00:29:07.175381 containerd[1571]: 2025-09-13 00:29:06.975 [INFO][4195] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 13 00:29:07.175381 containerd[1571]: 2025-09-13 00:29:06.983 [INFO][4195] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:07.175381 containerd[1571]: 2025-09-13 00:29:07.002 [INFO][4195] ipam/ipam.go 235: Affinity is confirmed and block has been loaded 
cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:07.175381 containerd[1571]: 2025-09-13 00:29:07.002 [INFO][4195] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" host="localhost" Sep 13 00:29:07.175645 containerd[1571]: 2025-09-13 00:29:07.010 [INFO][4195] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273 Sep 13 00:29:07.175645 containerd[1571]: 2025-09-13 00:29:07.043 [INFO][4195] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" host="localhost" Sep 13 00:29:07.175645 containerd[1571]: 2025-09-13 00:29:07.084 [INFO][4195] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" host="localhost" Sep 13 00:29:07.175645 containerd[1571]: 2025-09-13 00:29:07.084 [INFO][4195] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" host="localhost" Sep 13 00:29:07.175645 containerd[1571]: 2025-09-13 00:29:07.084 [INFO][4195] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
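The IPAM records above show Calico drawing sequential addresses from the host-affine block `192.168.88.128/26`: the whisker pod received `.129` and this coredns pod `.130` (with `.131` and `.132` claimed further down). A minimal sketch of that block arithmetic using Python's standard `ipaddress` module — this is only an illustration of the CIDR math, not Calico's actual allocator:

```python
import ipaddress

# The /26 block seen in the log: 64 addresses, 192.168.88.128-191.
block = ipaddress.ip_network("192.168.88.128/26")

# hosts() yields the usable addresses .129 .. .190; the log shows the
# first four being handed out in order to successive pods.
assigned = [str(ip) for ip in list(block.hosts())[:4]]
print(block.num_addresses, assigned)
```

Running this prints `64` and the same four addresses the log attributes to the whisker, coredns, calico-apiserver, and calico-kube-controllers pods.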
Sep 13 00:29:07.175645 containerd[1571]: 2025-09-13 00:29:07.084 [INFO][4195] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" HandleID="k8s-pod-network.57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" Workload="localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0" Sep 13 00:29:07.175956 containerd[1571]: 2025-09-13 00:29:07.098 [INFO][4122] cni-plugin/k8s.go 418: Populated endpoint ContainerID="57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hbctv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"4acb061a-e88e-4499-9ff6-e87ab4883853", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-hbctv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3efe442a2d5", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:07.176175 containerd[1571]: 2025-09-13 00:29:07.099 [INFO][4122] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hbctv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0" Sep 13 00:29:07.176175 containerd[1571]: 2025-09-13 00:29:07.099 [INFO][4122] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3efe442a2d5 ContainerID="57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hbctv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0" Sep 13 00:29:07.176175 containerd[1571]: 2025-09-13 00:29:07.114 [INFO][4122] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hbctv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0" Sep 13 00:29:07.176291 containerd[1571]: 2025-09-13 00:29:07.115 [INFO][4122] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hbctv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"4acb061a-e88e-4499-9ff6-e87ab4883853", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273", Pod:"coredns-7c65d6cfc9-hbctv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3efe442a2d5", MAC:"c2:c5:5e:20:52:e2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:07.176291 containerd[1571]: 2025-09-13 00:29:07.162 [INFO][4122] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hbctv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--hbctv-eth0" Sep 13 00:29:07.213094 containerd[1571]: time="2025-09-13T00:29:07.212946944Z" level=info msg="connecting to shim d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21" address="unix:///run/containerd/s/02071faaefa4b8cdf39e4a5031be3351515ecb60873b058e44012e40b76449a8" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:29:07.265197 containerd[1571]: time="2025-09-13T00:29:07.264260653Z" level=info msg="connecting to shim 57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273" address="unix:///run/containerd/s/ece961f732a959d4d8e2a19e3f09c81f791eff65a110267d289232318cec6276" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:29:07.333578 systemd-networkd[1487]: cali6860e577cc3: Link UP Sep 13 00:29:07.340826 systemd-networkd[1487]: cali6860e577cc3: Gained carrier Sep 13 00:29:07.373949 systemd[1]: Started cri-containerd-57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273.scope - libcontainer container 57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273. Sep 13 00:29:07.384326 systemd[1]: Started cri-containerd-d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21.scope - libcontainer container d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21. 
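Each containerd line above wraps an inner Calico plugin record of the form `<timestamp> [LEVEL][pid] file.go line: message`. A hypothetical helper (the pattern and field names are my own, not part of any Calico tooling) for splitting one such inner record into its components:

```python
import re

# Assumed layout of the inner Calico CNI log record, inferred from the
# lines above: "2025-09-13 00:29:07.084 [INFO][4195] ipam/ipam_plugin.go 374: msg"
PATTERN = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) "
    r"\[(?P<level>[A-Z]+)\]\[(?P<pid>\d+)\] "
    r"(?P<src>[\w/.-]+ \d+): (?P<msg>.*)"
)

line = ("2025-09-13 00:29:07.084 [INFO][4195] "
        "ipam/ipam_plugin.go 374: Released host-wide IPAM lock.")
m = PATTERN.match(line)
print(m.group("level"), m.group("src"), "-", m.group("msg"))
```

This makes it easier to grep a boot log like this one for, say, all `ipam/` records from a single plugin invocation (the `[4195]` identifier).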
Sep 13 00:29:07.410837 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 13 00:29:07.441660 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:06.743 [INFO][4155] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0 calico-apiserver-c5df58fcc- calico-apiserver 51ca438a-e2b6-4381-8e60-f5629fa0787e 835 0 2025-09-13 00:28:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:c5df58fcc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-c5df58fcc-l7vmt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6860e577cc3 [] [] }} ContainerID="bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-l7vmt" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-" Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:06.744 [INFO][4155] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-l7vmt" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0" Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:06.981 [INFO][4244] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" HandleID="k8s-pod-network.bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" 
Workload="localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0" Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:06.981 [INFO][4244] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" HandleID="k8s-pod-network.bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" Workload="localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003bcfa0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-c5df58fcc-l7vmt", "timestamp":"2025-09-13 00:29:06.981421488 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:06.981 [INFO][4244] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.085 [INFO][4244] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
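The repeating triple "About to acquire / Acquired / Released host-wide IPAM lock" around each assignment shows address allocation being serialized per host, so concurrent CNI ADD calls (here `[4195]`, `[4244]`, `[4250]`) cannot claim the same IP. A toy sketch of that pattern with a plain mutex — an assumption-level analogy, not Calico's implementation:

```python
import threading

# Host-wide lock analogous to the one in the log: only one "assignment"
# may inspect and update the claimed set at a time.
ipam_lock = threading.Lock()
claimed = []

def assign(last_octet):
    with ipam_lock:              # "Acquired host-wide IPAM lock."
        claimed.append(last_octet)
                                 # lock released on block exit

threads = [threading.Thread(target=assign, args=(o,)) for o in (129, 130, 131)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(claimed))
```

Whatever order the threads run in, the lock guarantees each octet is recorded exactly once, mirroring why each pod in the log gets a distinct address.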
Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.092 [INFO][4244] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.147 [INFO][4244] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" host="localhost" Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.209 [INFO][4244] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.238 [INFO][4244] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.253 [INFO][4244] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.265 [INFO][4244] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.266 [INFO][4244] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" host="localhost" Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.269 [INFO][4244] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0 Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.291 [INFO][4244] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" host="localhost" Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.319 [INFO][4244] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" host="localhost" Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.319 [INFO][4244] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" host="localhost" Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.319 [INFO][4244] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:29:07.443527 containerd[1571]: 2025-09-13 00:29:07.319 [INFO][4244] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" HandleID="k8s-pod-network.bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" Workload="localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0" Sep 13 00:29:07.444173 containerd[1571]: 2025-09-13 00:29:07.326 [INFO][4155] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-l7vmt" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0", GenerateName:"calico-apiserver-c5df58fcc-", Namespace:"calico-apiserver", SelfLink:"", UID:"51ca438a-e2b6-4381-8e60-f5629fa0787e", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c5df58fcc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-c5df58fcc-l7vmt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6860e577cc3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:07.444173 containerd[1571]: 2025-09-13 00:29:07.326 [INFO][4155] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-l7vmt" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0" Sep 13 00:29:07.444173 containerd[1571]: 2025-09-13 00:29:07.327 [INFO][4155] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6860e577cc3 ContainerID="bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-l7vmt" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0" Sep 13 00:29:07.444173 containerd[1571]: 2025-09-13 00:29:07.360 [INFO][4155] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-l7vmt" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0" Sep 13 00:29:07.444173 containerd[1571]: 2025-09-13 00:29:07.362 [INFO][4155] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-l7vmt" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0", GenerateName:"calico-apiserver-c5df58fcc-", Namespace:"calico-apiserver", SelfLink:"", UID:"51ca438a-e2b6-4381-8e60-f5629fa0787e", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c5df58fcc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0", Pod:"calico-apiserver-c5df58fcc-l7vmt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6860e577cc3", MAC:"4a:04:4b:b5:9c:10", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:07.444173 containerd[1571]: 2025-09-13 00:29:07.428 [INFO][4155] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-l7vmt" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--l7vmt-eth0" Sep 13 00:29:07.543365 systemd-networkd[1487]: calibf58091b572: Link UP Sep 13 00:29:07.543850 systemd-networkd[1487]: calibf58091b572: Gained carrier Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:06.757 [INFO][4201] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0 calico-kube-controllers-56848bfb48- calico-system 257fb627-9843-427c-b592-c2472be0c288 824 0 2025-09-13 00:28:40 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:56848bfb48 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-56848bfb48-n8vcn eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calibf58091b572 [] [] }} ContainerID="818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" Namespace="calico-system" Pod="calico-kube-controllers-56848bfb48-n8vcn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-" Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:06.757 [INFO][4201] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" Namespace="calico-system" Pod="calico-kube-controllers-56848bfb48-n8vcn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0" Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:06.979 [INFO][4250] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" HandleID="k8s-pod-network.818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" Workload="localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0" Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:06.981 [INFO][4250] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" HandleID="k8s-pod-network.818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" Workload="localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00032c0f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-56848bfb48-n8vcn", "timestamp":"2025-09-13 00:29:06.979586043 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:06.981 [INFO][4250] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.320 [INFO][4250] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
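The Go struct dumps earlier in the log print endpoint ports in hexadecimal (`Port:0x35`, `Port:0x23c1`). Decoding them confirms these are the standard CoreDNS ports:

```python
# Hex port values as printed in the WorkloadEndpointPort dumps above.
ports = {"dns": 0x35, "dns-tcp": 0x35, "metrics": 0x23c1}

# 0x35 == 53 (DNS over UDP/TCP); 0x23c1 == 9153 (CoreDNS Prometheus metrics).
print({name: int(value) for name, value in ports.items()})
```

So the three ports on the coredns endpoint are 53/UDP, 53/TCP, and 9153/TCP.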
Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.320 [INFO][4250] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.353 [INFO][4250] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" host="localhost" Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.393 [INFO][4250] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.437 [INFO][4250] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.449 [INFO][4250] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.463 [INFO][4250] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.463 [INFO][4250] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" host="localhost" Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.475 [INFO][4250] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81 Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.489 [INFO][4250] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" host="localhost" Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.515 [INFO][4250] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" host="localhost" Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.515 [INFO][4250] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" host="localhost" Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.515 [INFO][4250] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:29:07.607233 containerd[1571]: 2025-09-13 00:29:07.515 [INFO][4250] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" HandleID="k8s-pod-network.818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" Workload="localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0" Sep 13 00:29:07.608015 containerd[1571]: 2025-09-13 00:29:07.523 [INFO][4201] cni-plugin/k8s.go 418: Populated endpoint ContainerID="818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" Namespace="calico-system" Pod="calico-kube-controllers-56848bfb48-n8vcn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0", GenerateName:"calico-kube-controllers-56848bfb48-", Namespace:"calico-system", SelfLink:"", UID:"257fb627-9843-427c-b592-c2472be0c288", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"56848bfb48", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-56848bfb48-n8vcn", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibf58091b572", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:07.608015 containerd[1571]: 2025-09-13 00:29:07.523 [INFO][4201] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" Namespace="calico-system" Pod="calico-kube-controllers-56848bfb48-n8vcn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0" Sep 13 00:29:07.608015 containerd[1571]: 2025-09-13 00:29:07.523 [INFO][4201] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibf58091b572 ContainerID="818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" Namespace="calico-system" Pod="calico-kube-controllers-56848bfb48-n8vcn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0" Sep 13 00:29:07.608015 containerd[1571]: 2025-09-13 00:29:07.544 [INFO][4201] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" Namespace="calico-system" Pod="calico-kube-controllers-56848bfb48-n8vcn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0" Sep 13 00:29:07.608015 containerd[1571]: 
2025-09-13 00:29:07.545 [INFO][4201] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" Namespace="calico-system" Pod="calico-kube-controllers-56848bfb48-n8vcn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0", GenerateName:"calico-kube-controllers-56848bfb48-", Namespace:"calico-system", SelfLink:"", UID:"257fb627-9843-427c-b592-c2472be0c288", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"56848bfb48", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81", Pod:"calico-kube-controllers-56848bfb48-n8vcn", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibf58091b572", MAC:"7a:40:26:a7:df:83", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:07.608015 containerd[1571]: 
2025-09-13 00:29:07.588 [INFO][4201] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" Namespace="calico-system" Pod="calico-kube-controllers-56848bfb48-n8vcn" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--56848bfb48--n8vcn-eth0" Sep 13 00:29:07.615776 containerd[1571]: time="2025-09-13T00:29:07.615727899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hbctv,Uid:4acb061a-e88e-4499-9ff6-e87ab4883853,Namespace:kube-system,Attempt:0,} returns sandbox id \"57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273\"" Sep 13 00:29:07.636677 containerd[1571]: time="2025-09-13T00:29:07.636144790Z" level=info msg="CreateContainer within sandbox \"57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 13 00:29:07.668502 containerd[1571]: time="2025-09-13T00:29:07.646907878Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-647bb8d697-72k8l,Uid:af76ecae-d4d9-4bf5-b157-643b893bc664,Namespace:calico-system,Attempt:0,} returns sandbox id \"d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21\"" Sep 13 00:29:07.668502 containerd[1571]: time="2025-09-13T00:29:07.649091376Z" level=info msg="connecting to shim bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0" address="unix:///run/containerd/s/76a4d7e7e5ef08316031d05a7018e1174dbf067f513138322fb1bc65b78b3f88" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:29:07.673730 containerd[1571]: time="2025-09-13T00:29:07.673616250Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 13 00:29:07.676556 systemd-networkd[1487]: cali112c41821bc: Gained IPv6LL Sep 13 00:29:07.734188 containerd[1571]: time="2025-09-13T00:29:07.734136433Z" level=info msg="Container 1631075732493792996b1e354a82ea4089be5497c2793ee1377fbb7937d7ae7d: CDI devices from CRI 
Config.CDIDevices: []" Sep 13 00:29:07.794205 containerd[1571]: time="2025-09-13T00:29:07.794164351Z" level=info msg="CreateContainer within sandbox \"57d110e68d10031440283dcfa9cb7a93caa5be69f8cf194745efcc5d3248b273\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1631075732493792996b1e354a82ea4089be5497c2793ee1377fbb7937d7ae7d\"" Sep 13 00:29:07.795173 systemd-networkd[1487]: cali9bbdba69a7d: Link UP Sep 13 00:29:07.798066 containerd[1571]: time="2025-09-13T00:29:07.796858700Z" level=info msg="StartContainer for \"1631075732493792996b1e354a82ea4089be5497c2793ee1377fbb7937d7ae7d\"" Sep 13 00:29:07.798066 containerd[1571]: time="2025-09-13T00:29:07.797746366Z" level=info msg="connecting to shim 1631075732493792996b1e354a82ea4089be5497c2793ee1377fbb7937d7ae7d" address="unix:///run/containerd/s/ece961f732a959d4d8e2a19e3f09c81f791eff65a110267d289232318cec6276" protocol=ttrpc version=3 Sep 13 00:29:07.802534 systemd-networkd[1487]: cali9bbdba69a7d: Gained carrier Sep 13 00:29:07.816166 containerd[1571]: time="2025-09-13T00:29:07.816089764Z" level=info msg="connecting to shim 818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81" address="unix:///run/containerd/s/983c7c640d269843062a983cc260cf7bb6a468611e39f2aa0b1b27231bb51ebd" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:29:07.828750 systemd[1]: Started cri-containerd-bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0.scope - libcontainer container bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0. Sep 13 00:29:07.888617 systemd[1]: Started cri-containerd-1631075732493792996b1e354a82ea4089be5497c2793ee1377fbb7937d7ae7d.scope - libcontainer container 1631075732493792996b1e354a82ea4089be5497c2793ee1377fbb7937d7ae7d. 
Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:06.761 [INFO][4146] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0 calico-apiserver-c5df58fcc- calico-apiserver 6df7a665-f2da-401b-9fe7-13fb91ea1673 834 0 2025-09-13 00:28:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:c5df58fcc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-c5df58fcc-6mngl eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9bbdba69a7d [] [] }} ContainerID="4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-6mngl" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--6mngl-" Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:06.762 [INFO][4146] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-6mngl" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0" Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:06.980 [INFO][4267] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" HandleID="k8s-pod-network.4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" Workload="localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0" Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:06.982 [INFO][4267] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" 
HandleID="k8s-pod-network.4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" Workload="localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00034f290), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-c5df58fcc-6mngl", "timestamp":"2025-09-13 00:29:06.980404569 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:06.982 [INFO][4267] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.515 [INFO][4267] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.516 [INFO][4267] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.551 [INFO][4267] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" host="localhost" Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.592 [INFO][4267] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.630 [INFO][4267] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.644 [INFO][4267] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.664 [INFO][4267] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:07.921784 containerd[1571]: 
2025-09-13 00:29:07.665 [INFO][4267] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" host="localhost" Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.679 [INFO][4267] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.698 [INFO][4267] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" host="localhost" Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.748 [INFO][4267] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" host="localhost" Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.748 [INFO][4267] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" host="localhost" Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.748 [INFO][4267] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 13 00:29:07.921784 containerd[1571]: 2025-09-13 00:29:07.748 [INFO][4267] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" HandleID="k8s-pod-network.4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" Workload="localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0" Sep 13 00:29:07.922575 containerd[1571]: 2025-09-13 00:29:07.782 [INFO][4146] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-6mngl" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0", GenerateName:"calico-apiserver-c5df58fcc-", Namespace:"calico-apiserver", SelfLink:"", UID:"6df7a665-f2da-401b-9fe7-13fb91ea1673", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c5df58fcc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-c5df58fcc-6mngl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9bbdba69a7d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:07.922575 containerd[1571]: 2025-09-13 00:29:07.786 [INFO][4146] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-6mngl" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0" Sep 13 00:29:07.922575 containerd[1571]: 2025-09-13 00:29:07.786 [INFO][4146] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9bbdba69a7d ContainerID="4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-6mngl" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0" Sep 13 00:29:07.922575 containerd[1571]: 2025-09-13 00:29:07.803 [INFO][4146] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-6mngl" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0" Sep 13 00:29:07.922575 containerd[1571]: 2025-09-13 00:29:07.804 [INFO][4146] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-6mngl" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0", GenerateName:"calico-apiserver-c5df58fcc-", 
Namespace:"calico-apiserver", SelfLink:"", UID:"6df7a665-f2da-401b-9fe7-13fb91ea1673", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c5df58fcc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad", Pod:"calico-apiserver-c5df58fcc-6mngl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9bbdba69a7d", MAC:"ca:56:98:60:20:de", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:07.922575 containerd[1571]: 2025-09-13 00:29:07.894 [INFO][4146] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" Namespace="calico-apiserver" Pod="calico-apiserver-c5df58fcc-6mngl" WorkloadEndpoint="localhost-k8s-calico--apiserver--c5df58fcc--6mngl-eth0" Sep 13 00:29:07.930360 systemd[1]: Started cri-containerd-818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81.scope - libcontainer container 818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81. 
Sep 13 00:29:07.934328 systemd-networkd[1487]: vxlan.calico: Gained IPv6LL Sep 13 00:29:07.956936 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 13 00:29:08.058036 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 13 00:29:08.108356 containerd[1571]: time="2025-09-13T00:29:08.108239224Z" level=info msg="connecting to shim 4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad" address="unix:///run/containerd/s/5b1aa5b23549233a76901f9365762b0c9af1fe215dc3ac472bc39b0e961c6f0a" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:29:08.144466 systemd-networkd[1487]: calif812ad2f0df: Link UP Sep 13 00:29:08.158653 systemd-networkd[1487]: calif812ad2f0df: Gained carrier Sep 13 00:29:08.179809 containerd[1571]: time="2025-09-13T00:29:08.179772154Z" level=info msg="StartContainer for \"1631075732493792996b1e354a82ea4089be5497c2793ee1377fbb7937d7ae7d\" returns successfully" Sep 13 00:29:08.194657 containerd[1571]: time="2025-09-13T00:29:08.194598899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5df58fcc-l7vmt,Uid:51ca438a-e2b6-4381-8e60-f5629fa0787e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0\"" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:06.723 [INFO][4134] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--7988f88666--gx8np-eth0 goldmane-7988f88666- calico-system 316678ae-ae53-4c3e-b02b-987dc2ed4041 828 0 2025-09-13 00:28:39 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-7988f88666-gx8np eth0 goldmane [] [] 
[kns.calico-system ksa.calico-system.goldmane] calif812ad2f0df [] [] }} ContainerID="511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" Namespace="calico-system" Pod="goldmane-7988f88666-gx8np" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--gx8np-" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:06.724 [INFO][4134] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" Namespace="calico-system" Pod="goldmane-7988f88666-gx8np" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--gx8np-eth0" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:07.009 [INFO][4251] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" HandleID="k8s-pod-network.511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" Workload="localhost-k8s-goldmane--7988f88666--gx8np-eth0" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:07.010 [INFO][4251] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" HandleID="k8s-pod-network.511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" Workload="localhost-k8s-goldmane--7988f88666--gx8np-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000304c10), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-7988f88666-gx8np", "timestamp":"2025-09-13 00:29:07.009818225 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:07.010 [INFO][4251] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:07.753 [INFO][4251] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:07.753 [INFO][4251] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:07.897 [INFO][4251] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" host="localhost" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:07.928 [INFO][4251] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:07.964 [INFO][4251] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:07.979 [INFO][4251] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:07.986 [INFO][4251] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:07.987 [INFO][4251] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" host="localhost" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:08.009 [INFO][4251] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:08.034 [INFO][4251] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" host="localhost" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:08.074 [INFO][4251] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" host="localhost" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:08.074 [INFO][4251] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" host="localhost" Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:08.074 [INFO][4251] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:29:08.270047 containerd[1571]: 2025-09-13 00:29:08.074 [INFO][4251] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" HandleID="k8s-pod-network.511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" Workload="localhost-k8s-goldmane--7988f88666--gx8np-eth0" Sep 13 00:29:08.270723 containerd[1571]: 2025-09-13 00:29:08.107 [INFO][4134] cni-plugin/k8s.go 418: Populated endpoint ContainerID="511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" Namespace="calico-system" Pod="goldmane-7988f88666-gx8np" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--gx8np-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--gx8np-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"316678ae-ae53-4c3e-b02b-987dc2ed4041", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-7988f88666-gx8np", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif812ad2f0df", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:08.270723 containerd[1571]: 2025-09-13 00:29:08.108 [INFO][4134] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" Namespace="calico-system" Pod="goldmane-7988f88666-gx8np" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--gx8np-eth0" Sep 13 00:29:08.270723 containerd[1571]: 2025-09-13 00:29:08.108 [INFO][4134] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif812ad2f0df ContainerID="511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" Namespace="calico-system" Pod="goldmane-7988f88666-gx8np" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--gx8np-eth0" Sep 13 00:29:08.270723 containerd[1571]: 2025-09-13 00:29:08.135 [INFO][4134] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" Namespace="calico-system" Pod="goldmane-7988f88666-gx8np" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--gx8np-eth0" Sep 13 00:29:08.270723 containerd[1571]: 2025-09-13 00:29:08.172 [INFO][4134] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" Namespace="calico-system" Pod="goldmane-7988f88666-gx8np" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--gx8np-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--gx8np-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"316678ae-ae53-4c3e-b02b-987dc2ed4041", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e", Pod:"goldmane-7988f88666-gx8np", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif812ad2f0df", MAC:"4e:ba:11:2c:cb:61", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:08.270723 containerd[1571]: 2025-09-13 00:29:08.199 [INFO][4134] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" Namespace="calico-system" Pod="goldmane-7988f88666-gx8np" 
WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--gx8np-eth0" Sep 13 00:29:08.307416 systemd[1]: Started cri-containerd-4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad.scope - libcontainer container 4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad. Sep 13 00:29:08.353432 containerd[1571]: time="2025-09-13T00:29:08.353391754Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-56848bfb48-n8vcn,Uid:257fb627-9843-427c-b592-c2472be0c288,Namespace:calico-system,Attempt:0,} returns sandbox id \"818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81\"" Sep 13 00:29:08.416309 systemd-networkd[1487]: cali1dd7cfe29fa: Link UP Sep 13 00:29:08.423136 systemd-networkd[1487]: cali1dd7cfe29fa: Gained carrier Sep 13 00:29:08.451625 systemd-networkd[1487]: cali3efe442a2d5: Gained IPv6LL Sep 13 00:29:08.572062 kubelet[2698]: I0913 00:29:08.555645 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-hbctv" podStartSLOduration=41.552326028 podStartE2EDuration="41.552326028s" podCreationTimestamp="2025-09-13 00:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:29:08.464225515 +0000 UTC m=+45.939255757" watchObservedRunningTime="2025-09-13 00:29:08.552326028 +0000 UTC m=+46.027356260" Sep 13 00:29:08.572678 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:06.908 [INFO][4177] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0 coredns-7c65d6cfc9- kube-system c041bc9d-e4f0-4799-959f-fb73ab87bb7d 831 0 2025-09-13 00:28:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-gz9hh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1dd7cfe29fa [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gz9hh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gz9hh-" Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:06.909 [INFO][4177] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gz9hh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0" Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:07.035 [INFO][4280] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" HandleID="k8s-pod-network.c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" Workload="localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0" Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:07.037 [INFO][4280] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" HandleID="k8s-pod-network.c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" Workload="localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00032f2f0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-gz9hh", "timestamp":"2025-09-13 00:29:07.034855971 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 
13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:07.037 [INFO][4280] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.074 [INFO][4280] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.075 [INFO][4280] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.147 [INFO][4280] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" host="localhost" Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.190 [INFO][4280] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.238 [INFO][4280] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.266 [INFO][4280] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.299 [INFO][4280] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.299 [INFO][4280] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" host="localhost" Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.319 [INFO][4280] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.342 [INFO][4280] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 
handle="k8s-pod-network.c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" host="localhost" Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.391 [INFO][4280] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" host="localhost" Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.391 [INFO][4280] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" host="localhost" Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.391 [INFO][4280] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:29:08.574650 containerd[1571]: 2025-09-13 00:29:08.392 [INFO][4280] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" HandleID="k8s-pod-network.c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" Workload="localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0" Sep 13 00:29:08.578627 containerd[1571]: 2025-09-13 00:29:08.400 [INFO][4177] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gz9hh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"c041bc9d-e4f0-4799-959f-fb73ab87bb7d", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-gz9hh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1dd7cfe29fa", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:08.578627 containerd[1571]: 2025-09-13 00:29:08.403 [INFO][4177] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gz9hh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0" Sep 13 00:29:08.578627 containerd[1571]: 2025-09-13 00:29:08.403 [INFO][4177] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1dd7cfe29fa ContainerID="c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gz9hh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0" Sep 13 
00:29:08.578627 containerd[1571]: 2025-09-13 00:29:08.442 [INFO][4177] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gz9hh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0" Sep 13 00:29:08.578627 containerd[1571]: 2025-09-13 00:29:08.492 [INFO][4177] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gz9hh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"c041bc9d-e4f0-4799-959f-fb73ab87bb7d", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d", Pod:"coredns-7c65d6cfc9-gz9hh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1dd7cfe29fa", 
MAC:"5a:a5:8e:1e:dc:5f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:08.578627 containerd[1571]: 2025-09-13 00:29:08.550 [INFO][4177] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gz9hh" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--gz9hh-eth0" Sep 13 00:29:08.586046 systemd-networkd[1487]: cali6860e577cc3: Gained IPv6LL Sep 13 00:29:08.609995 containerd[1571]: time="2025-09-13T00:29:08.609389685Z" level=info msg="connecting to shim 511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e" address="unix:///run/containerd/s/72565944cc1c43bf0fba727d0c00394cf16c83c8e544d6a6f0d9d6d5b5cc577b" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:29:08.700013 systemd-networkd[1487]: calibf58091b572: Gained IPv6LL Sep 13 00:29:08.714970 systemd[1]: Started cri-containerd-511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e.scope - libcontainer container 511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e. 
Sep 13 00:29:08.767954 containerd[1571]: time="2025-09-13T00:29:08.767767531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5df58fcc-6mngl,Uid:6df7a665-f2da-401b-9fe7-13fb91ea1673,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad\"" Sep 13 00:29:08.787679 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 13 00:29:08.927079 containerd[1571]: time="2025-09-13T00:29:08.927009009Z" level=info msg="connecting to shim c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d" address="unix:///run/containerd/s/9625b335dbcddf545f241aa42937f3bb94f504a28e322dfeb4467a22aab7454b" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:29:08.940550 containerd[1571]: time="2025-09-13T00:29:08.940473116Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-gx8np,Uid:316678ae-ae53-4c3e-b02b-987dc2ed4041,Namespace:calico-system,Attempt:0,} returns sandbox id \"511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e\"" Sep 13 00:29:09.008820 systemd[1]: Started cri-containerd-c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d.scope - libcontainer container c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d. 
Sep 13 00:29:09.071748 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 13 00:29:09.156310 containerd[1571]: time="2025-09-13T00:29:09.156189311Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-gz9hh,Uid:c041bc9d-e4f0-4799-959f-fb73ab87bb7d,Namespace:kube-system,Attempt:0,} returns sandbox id \"c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d\"" Sep 13 00:29:09.165924 containerd[1571]: time="2025-09-13T00:29:09.165153027Z" level=info msg="CreateContainer within sandbox \"c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 13 00:29:09.232811 containerd[1571]: time="2025-09-13T00:29:09.232707461Z" level=info msg="Container 825dafd338296142b3036b76208f1ca25af1df72302d78041d5858c7c7d021f9: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:29:09.260555 containerd[1571]: time="2025-09-13T00:29:09.260364921Z" level=info msg="CreateContainer within sandbox \"c28c4b03b3589dd781411160b153c913636eda5aac8ea706c14f2f209a136f7d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"825dafd338296142b3036b76208f1ca25af1df72302d78041d5858c7c7d021f9\"" Sep 13 00:29:09.262830 containerd[1571]: time="2025-09-13T00:29:09.262761029Z" level=info msg="StartContainer for \"825dafd338296142b3036b76208f1ca25af1df72302d78041d5858c7c7d021f9\"" Sep 13 00:29:09.267097 containerd[1571]: time="2025-09-13T00:29:09.266987182Z" level=info msg="connecting to shim 825dafd338296142b3036b76208f1ca25af1df72302d78041d5858c7c7d021f9" address="unix:///run/containerd/s/9625b335dbcddf545f241aa42937f3bb94f504a28e322dfeb4467a22aab7454b" protocol=ttrpc version=3 Sep 13 00:29:09.281445 systemd-networkd[1487]: cali9bbdba69a7d: Gained IPv6LL Sep 13 00:29:09.348827 systemd[1]: Started cri-containerd-825dafd338296142b3036b76208f1ca25af1df72302d78041d5858c7c7d021f9.scope - libcontainer container 
825dafd338296142b3036b76208f1ca25af1df72302d78041d5858c7c7d021f9. Sep 13 00:29:09.488309 containerd[1571]: time="2025-09-13T00:29:09.488149584Z" level=info msg="StartContainer for \"825dafd338296142b3036b76208f1ca25af1df72302d78041d5858c7c7d021f9\" returns successfully" Sep 13 00:29:09.627642 containerd[1571]: time="2025-09-13T00:29:09.625938284Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xwvxw,Uid:8759312a-5948-41bc-8173-0a88fa7fe6fe,Namespace:calico-system,Attempt:0,}" Sep 13 00:29:09.977693 systemd-networkd[1487]: cali1dd7cfe29fa: Gained IPv6LL Sep 13 00:29:09.984684 systemd-networkd[1487]: calid9c48ba4d4a: Link UP Sep 13 00:29:09.985503 systemd-networkd[1487]: calid9c48ba4d4a: Gained carrier Sep 13 00:29:09.994417 containerd[1571]: time="2025-09-13T00:29:09.994299982Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:09.997252 containerd[1571]: time="2025-09-13T00:29:09.996871178Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 13 00:29:10.007555 containerd[1571]: time="2025-09-13T00:29:10.003080054Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:10.016177 containerd[1571]: time="2025-09-13T00:29:10.014358545Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.763 [INFO][4752] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--xwvxw-eth0 csi-node-driver- calico-system 8759312a-5948-41bc-8173-0a88fa7fe6fe 717 0 
2025-09-13 00:28:40 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-xwvxw eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid9c48ba4d4a [] [] }} ContainerID="0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" Namespace="calico-system" Pod="csi-node-driver-xwvxw" WorkloadEndpoint="localhost-k8s-csi--node--driver--xwvxw-" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.764 [INFO][4752] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" Namespace="calico-system" Pod="csi-node-driver-xwvxw" WorkloadEndpoint="localhost-k8s-csi--node--driver--xwvxw-eth0" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.830 [INFO][4765] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" HandleID="k8s-pod-network.0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" Workload="localhost-k8s-csi--node--driver--xwvxw-eth0" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.830 [INFO][4765] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" HandleID="k8s-pod-network.0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" Workload="localhost-k8s-csi--node--driver--xwvxw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e490), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-xwvxw", "timestamp":"2025-09-13 00:29:09.830390407 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.830 [INFO][4765] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.830 [INFO][4765] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.830 [INFO][4765] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.854 [INFO][4765] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" host="localhost" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.878 [INFO][4765] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.892 [INFO][4765] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.899 [INFO][4765] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.905 [INFO][4765] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.905 [INFO][4765] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" host="localhost" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.909 [INFO][4765] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c Sep 13 00:29:10.017273 
containerd[1571]: 2025-09-13 00:29:09.928 [INFO][4765] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" host="localhost" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.954 [INFO][4765] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" host="localhost" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.954 [INFO][4765] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" host="localhost" Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.954 [INFO][4765] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:29:10.017273 containerd[1571]: 2025-09-13 00:29:09.954 [INFO][4765] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" HandleID="k8s-pod-network.0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" Workload="localhost-k8s-csi--node--driver--xwvxw-eth0" Sep 13 00:29:10.018245 containerd[1571]: 2025-09-13 00:29:09.974 [INFO][4752] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" Namespace="calico-system" Pod="csi-node-driver-xwvxw" WorkloadEndpoint="localhost-k8s-csi--node--driver--xwvxw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--xwvxw-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8759312a-5948-41bc-8173-0a88fa7fe6fe", ResourceVersion:"717", Generation:0, CreationTimestamp:time.Date(2025, 
time.September, 13, 0, 28, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-xwvxw", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid9c48ba4d4a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:10.018245 containerd[1571]: 2025-09-13 00:29:09.976 [INFO][4752] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" Namespace="calico-system" Pod="csi-node-driver-xwvxw" WorkloadEndpoint="localhost-k8s-csi--node--driver--xwvxw-eth0" Sep 13 00:29:10.018245 containerd[1571]: 2025-09-13 00:29:09.976 [INFO][4752] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid9c48ba4d4a ContainerID="0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" Namespace="calico-system" Pod="csi-node-driver-xwvxw" WorkloadEndpoint="localhost-k8s-csi--node--driver--xwvxw-eth0" Sep 13 00:29:10.018245 containerd[1571]: 2025-09-13 00:29:09.983 [INFO][4752] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" Namespace="calico-system" Pod="csi-node-driver-xwvxw" WorkloadEndpoint="localhost-k8s-csi--node--driver--xwvxw-eth0" Sep 13 00:29:10.018245 containerd[1571]: 2025-09-13 00:29:09.983 [INFO][4752] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" Namespace="calico-system" Pod="csi-node-driver-xwvxw" WorkloadEndpoint="localhost-k8s-csi--node--driver--xwvxw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--xwvxw-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8759312a-5948-41bc-8173-0a88fa7fe6fe", ResourceVersion:"717", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 28, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c", Pod:"csi-node-driver-xwvxw", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid9c48ba4d4a", 
MAC:"1a:9b:99:87:85:2c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:29:10.018245 containerd[1571]: 2025-09-13 00:29:10.009 [INFO][4752] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" Namespace="calico-system" Pod="csi-node-driver-xwvxw" WorkloadEndpoint="localhost-k8s-csi--node--driver--xwvxw-eth0" Sep 13 00:29:10.018245 containerd[1571]: time="2025-09-13T00:29:10.016073935Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 2.342371613s" Sep 13 00:29:10.018245 containerd[1571]: time="2025-09-13T00:29:10.017445790Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 13 00:29:10.021368 containerd[1571]: time="2025-09-13T00:29:10.020479755Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 13 00:29:10.022226 containerd[1571]: time="2025-09-13T00:29:10.022196638Z" level=info msg="CreateContainer within sandbox \"d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 13 00:29:10.040652 systemd-networkd[1487]: calif812ad2f0df: Gained IPv6LL Sep 13 00:29:10.053459 containerd[1571]: time="2025-09-13T00:29:10.052560545Z" level=info msg="Container a0d3fe6268ef133607317373336a1511b88ebebc1e88feed090517eca03b3329: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:29:10.096488 containerd[1571]: time="2025-09-13T00:29:10.096381583Z" level=info msg="CreateContainer within sandbox 
\"d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"a0d3fe6268ef133607317373336a1511b88ebebc1e88feed090517eca03b3329\"" Sep 13 00:29:10.097948 containerd[1571]: time="2025-09-13T00:29:10.097446362Z" level=info msg="StartContainer for \"a0d3fe6268ef133607317373336a1511b88ebebc1e88feed090517eca03b3329\"" Sep 13 00:29:10.109015 containerd[1571]: time="2025-09-13T00:29:10.108908157Z" level=info msg="connecting to shim a0d3fe6268ef133607317373336a1511b88ebebc1e88feed090517eca03b3329" address="unix:///run/containerd/s/02071faaefa4b8cdf39e4a5031be3351515ecb60873b058e44012e40b76449a8" protocol=ttrpc version=3 Sep 13 00:29:10.161622 containerd[1571]: time="2025-09-13T00:29:10.160864737Z" level=info msg="connecting to shim 0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c" address="unix:///run/containerd/s/24309bed4f1b5b2157ca8ceed9c8760593405e1a3f821bb4607604f5b0b71afd" namespace=k8s.io protocol=ttrpc version=3 Sep 13 00:29:10.180538 systemd[1]: Started cri-containerd-a0d3fe6268ef133607317373336a1511b88ebebc1e88feed090517eca03b3329.scope - libcontainer container a0d3fe6268ef133607317373336a1511b88ebebc1e88feed090517eca03b3329. Sep 13 00:29:10.269150 systemd[1]: Started cri-containerd-0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c.scope - libcontainer container 0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c. 
Sep 13 00:29:10.316524 systemd-resolved[1406]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 13 00:29:10.402160 kubelet[2698]: I0913 00:29:10.400974 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-gz9hh" podStartSLOduration=43.400948875 podStartE2EDuration="43.400948875s" podCreationTimestamp="2025-09-13 00:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:29:10.399856614 +0000 UTC m=+47.874886877" watchObservedRunningTime="2025-09-13 00:29:10.400948875 +0000 UTC m=+47.875979127" Sep 13 00:29:10.403447 containerd[1571]: time="2025-09-13T00:29:10.402887545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xwvxw,Uid:8759312a-5948-41bc-8173-0a88fa7fe6fe,Namespace:calico-system,Attempt:0,} returns sandbox id \"0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c\"" Sep 13 00:29:10.404765 containerd[1571]: time="2025-09-13T00:29:10.404724252Z" level=info msg="StartContainer for \"a0d3fe6268ef133607317373336a1511b88ebebc1e88feed090517eca03b3329\" returns successfully" Sep 13 00:29:11.322000 systemd-networkd[1487]: calid9c48ba4d4a: Gained IPv6LL Sep 13 00:29:12.827135 containerd[1571]: time="2025-09-13T00:29:12.827027161Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66\" id:\"51baad7909c97b1c24a927bfe135fefae710531bf292668abcf866c241f99318\" pid:4888 exited_at:{seconds:1757723352 nanos:825669903}" Sep 13 00:29:15.567429 systemd[1]: Started sshd@7-10.0.0.95:22-10.0.0.1:36142.service - OpenSSH per-connection server daemon (10.0.0.1:36142). 
Sep 13 00:29:15.854450 sshd[4908]: Accepted publickey for core from 10.0.0.1 port 36142 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:29:15.857131 sshd-session[4908]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:29:15.874096 systemd-logind[1546]: New session 8 of user core. Sep 13 00:29:15.890115 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 13 00:29:16.324880 containerd[1571]: time="2025-09-13T00:29:16.320519353Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:16.324880 containerd[1571]: time="2025-09-13T00:29:16.324617214Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 13 00:29:16.327791 containerd[1571]: time="2025-09-13T00:29:16.327672357Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:16.344364 containerd[1571]: time="2025-09-13T00:29:16.344060843Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:16.345006 containerd[1571]: time="2025-09-13T00:29:16.344979967Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 6.324463714s" Sep 13 00:29:16.345098 containerd[1571]: time="2025-09-13T00:29:16.345079655Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" 
returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 13 00:29:16.380768 containerd[1571]: time="2025-09-13T00:29:16.380279043Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 13 00:29:16.406156 containerd[1571]: time="2025-09-13T00:29:16.404395321Z" level=info msg="CreateContainer within sandbox \"bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 13 00:29:16.457057 containerd[1571]: time="2025-09-13T00:29:16.455185809Z" level=info msg="Container 2a46d5668886c54ae259e9349a2958d23890b6805f2bc1b2e9783806c7d2c1e7: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:29:16.516774 sshd[4910]: Connection closed by 10.0.0.1 port 36142 Sep 13 00:29:16.514887 sshd-session[4908]: pam_unix(sshd:session): session closed for user core Sep 13 00:29:16.533283 systemd-logind[1546]: Session 8 logged out. Waiting for processes to exit. Sep 13 00:29:16.535037 systemd[1]: sshd@7-10.0.0.95:22-10.0.0.1:36142.service: Deactivated successfully. Sep 13 00:29:16.538612 systemd[1]: session-8.scope: Deactivated successfully. Sep 13 00:29:16.546979 systemd-logind[1546]: Removed session 8. 
Sep 13 00:29:16.591796 containerd[1571]: time="2025-09-13T00:29:16.588956413Z" level=info msg="CreateContainer within sandbox \"bce5616ac7ccf641559b03ad48153d125249fe0a6e2bc69d7357a91cf530f4a0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2a46d5668886c54ae259e9349a2958d23890b6805f2bc1b2e9783806c7d2c1e7\"" Sep 13 00:29:16.611435 containerd[1571]: time="2025-09-13T00:29:16.611301286Z" level=info msg="StartContainer for \"2a46d5668886c54ae259e9349a2958d23890b6805f2bc1b2e9783806c7d2c1e7\"" Sep 13 00:29:16.614416 containerd[1571]: time="2025-09-13T00:29:16.612983093Z" level=info msg="connecting to shim 2a46d5668886c54ae259e9349a2958d23890b6805f2bc1b2e9783806c7d2c1e7" address="unix:///run/containerd/s/76a4d7e7e5ef08316031d05a7018e1174dbf067f513138322fb1bc65b78b3f88" protocol=ttrpc version=3 Sep 13 00:29:16.703750 systemd[1]: Started cri-containerd-2a46d5668886c54ae259e9349a2958d23890b6805f2bc1b2e9783806c7d2c1e7.scope - libcontainer container 2a46d5668886c54ae259e9349a2958d23890b6805f2bc1b2e9783806c7d2c1e7. 
Sep 13 00:29:16.869142 containerd[1571]: time="2025-09-13T00:29:16.868333633Z" level=info msg="StartContainer for \"2a46d5668886c54ae259e9349a2958d23890b6805f2bc1b2e9783806c7d2c1e7\" returns successfully" Sep 13 00:29:17.524948 kubelet[2698]: I0913 00:29:17.524734 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-c5df58fcc-l7vmt" podStartSLOduration=32.344688017 podStartE2EDuration="40.522857701s" podCreationTimestamp="2025-09-13 00:28:37 +0000 UTC" firstStartedPulling="2025-09-13 00:29:08.20175856 +0000 UTC m=+45.676788802" lastFinishedPulling="2025-09-13 00:29:16.379928244 +0000 UTC m=+53.854958486" observedRunningTime="2025-09-13 00:29:17.520623068 +0000 UTC m=+54.995653310" watchObservedRunningTime="2025-09-13 00:29:17.522857701 +0000 UTC m=+54.997887943" Sep 13 00:29:21.532043 systemd[1]: Started sshd@8-10.0.0.95:22-10.0.0.1:47022.service - OpenSSH per-connection server daemon (10.0.0.1:47022). Sep 13 00:29:21.686304 sshd[4982]: Accepted publickey for core from 10.0.0.1 port 47022 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:29:21.690276 sshd-session[4982]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:29:21.708793 systemd-logind[1546]: New session 9 of user core. Sep 13 00:29:21.724760 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 13 00:29:22.244273 containerd[1571]: time="2025-09-13T00:29:22.244097614Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:22.245983 containerd[1571]: time="2025-09-13T00:29:22.245701977Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 13 00:29:22.248808 containerd[1571]: time="2025-09-13T00:29:22.248718359Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:22.256179 containerd[1571]: time="2025-09-13T00:29:22.254851759Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:22.260361 containerd[1571]: time="2025-09-13T00:29:22.260123657Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 5.879785743s" Sep 13 00:29:22.260361 containerd[1571]: time="2025-09-13T00:29:22.260184064Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 13 00:29:22.298589 containerd[1571]: time="2025-09-13T00:29:22.298542706Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 13 00:29:22.348594 sshd[4984]: Connection closed by 10.0.0.1 port 47022 Sep 13 00:29:22.348981 sshd-session[4982]: 
pam_unix(sshd:session): session closed for user core Sep 13 00:29:22.376832 systemd[1]: sshd@8-10.0.0.95:22-10.0.0.1:47022.service: Deactivated successfully. Sep 13 00:29:22.381881 systemd[1]: session-9.scope: Deactivated successfully. Sep 13 00:29:22.389531 containerd[1571]: time="2025-09-13T00:29:22.389487242Z" level=info msg="CreateContainer within sandbox \"818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 13 00:29:22.390829 systemd-logind[1546]: Session 9 logged out. Waiting for processes to exit. Sep 13 00:29:22.395232 systemd-logind[1546]: Removed session 9. Sep 13 00:29:22.557902 containerd[1571]: time="2025-09-13T00:29:22.555136376Z" level=info msg="Container 38e0636b50eaa8dc4da0ba99ed0d26222962e1641d02b614b99a6e34e889d27f: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:29:22.920747 containerd[1571]: time="2025-09-13T00:29:22.915824975Z" level=info msg="CreateContainer within sandbox \"818e7138c5f9fd5da6d187e8154b4886c8ccf15dadacf4d667a1724f20dc6c81\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"38e0636b50eaa8dc4da0ba99ed0d26222962e1641d02b614b99a6e34e889d27f\"" Sep 13 00:29:22.920747 containerd[1571]: time="2025-09-13T00:29:22.917519042Z" level=info msg="StartContainer for \"38e0636b50eaa8dc4da0ba99ed0d26222962e1641d02b614b99a6e34e889d27f\"" Sep 13 00:29:22.929979 containerd[1571]: time="2025-09-13T00:29:22.928388019Z" level=info msg="connecting to shim 38e0636b50eaa8dc4da0ba99ed0d26222962e1641d02b614b99a6e34e889d27f" address="unix:///run/containerd/s/983c7c640d269843062a983cc260cf7bb6a468611e39f2aa0b1b27231bb51ebd" protocol=ttrpc version=3 Sep 13 00:29:23.163994 systemd[1]: Started cri-containerd-38e0636b50eaa8dc4da0ba99ed0d26222962e1641d02b614b99a6e34e889d27f.scope - libcontainer container 38e0636b50eaa8dc4da0ba99ed0d26222962e1641d02b614b99a6e34e889d27f. 
Sep 13 00:29:23.437783 containerd[1571]: time="2025-09-13T00:29:23.437726214Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:23.611555 containerd[1571]: time="2025-09-13T00:29:23.609995346Z" level=info msg="StartContainer for \"38e0636b50eaa8dc4da0ba99ed0d26222962e1641d02b614b99a6e34e889d27f\" returns successfully" Sep 13 00:29:23.617728 containerd[1571]: time="2025-09-13T00:29:23.616307682Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 13 00:29:23.617728 containerd[1571]: time="2025-09-13T00:29:23.627866147Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 1.328695271s" Sep 13 00:29:23.617728 containerd[1571]: time="2025-09-13T00:29:23.627918959Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 13 00:29:23.631802 containerd[1571]: time="2025-09-13T00:29:23.630192194Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 13 00:29:23.670485 containerd[1571]: time="2025-09-13T00:29:23.670274229Z" level=info msg="CreateContainer within sandbox \"4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 13 00:29:24.493836 containerd[1571]: time="2025-09-13T00:29:24.493600632Z" level=info msg="Container dd0b3a997d9cdf4fe53818da3e95a9fb481ad60ff60a894f9e9625171bff91d3: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:29:24.641365 containerd[1571]: 
time="2025-09-13T00:29:24.641285976Z" level=info msg="CreateContainer within sandbox \"4c2850ffda5e708ea4ae63bb7f6c74b1dc351e8fe337594454b1b73a48e0ffad\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"dd0b3a997d9cdf4fe53818da3e95a9fb481ad60ff60a894f9e9625171bff91d3\"" Sep 13 00:29:24.645996 containerd[1571]: time="2025-09-13T00:29:24.643723637Z" level=info msg="StartContainer for \"dd0b3a997d9cdf4fe53818da3e95a9fb481ad60ff60a894f9e9625171bff91d3\"" Sep 13 00:29:24.649839 containerd[1571]: time="2025-09-13T00:29:24.648522611Z" level=info msg="connecting to shim dd0b3a997d9cdf4fe53818da3e95a9fb481ad60ff60a894f9e9625171bff91d3" address="unix:///run/containerd/s/5b1aa5b23549233a76901f9365762b0c9af1fe215dc3ac472bc39b0e961c6f0a" protocol=ttrpc version=3 Sep 13 00:29:24.817381 systemd[1]: Started cri-containerd-dd0b3a997d9cdf4fe53818da3e95a9fb481ad60ff60a894f9e9625171bff91d3.scope - libcontainer container dd0b3a997d9cdf4fe53818da3e95a9fb481ad60ff60a894f9e9625171bff91d3. 
Sep 13 00:29:24.832876 kubelet[2698]: I0913 00:29:24.832804 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-56848bfb48-n8vcn" podStartSLOduration=30.900357174 podStartE2EDuration="44.832774254s" podCreationTimestamp="2025-09-13 00:28:40 +0000 UTC" firstStartedPulling="2025-09-13 00:29:08.361260136 +0000 UTC m=+45.836290378" lastFinishedPulling="2025-09-13 00:29:22.293677216 +0000 UTC m=+59.768707458" observedRunningTime="2025-09-13 00:29:24.832274525 +0000 UTC m=+62.307304767" watchObservedRunningTime="2025-09-13 00:29:24.832774254 +0000 UTC m=+62.307804496" Sep 13 00:29:24.910545 containerd[1571]: time="2025-09-13T00:29:24.910497566Z" level=info msg="TaskExit event in podsandbox handler container_id:\"38e0636b50eaa8dc4da0ba99ed0d26222962e1641d02b614b99a6e34e889d27f\" id:\"4808133017aa98e1fad1b2d202e3c979a2aab256b7f38c1b1686ca1f447f0989\" pid:5068 exited_at:{seconds:1757723364 nanos:907265416}" Sep 13 00:29:25.044976 containerd[1571]: time="2025-09-13T00:29:25.036217011Z" level=info msg="StartContainer for \"dd0b3a997d9cdf4fe53818da3e95a9fb481ad60ff60a894f9e9625171bff91d3\" returns successfully" Sep 13 00:29:25.739458 kubelet[2698]: I0913 00:29:25.739255 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-c5df58fcc-6mngl" podStartSLOduration=33.884270253 podStartE2EDuration="48.739227054s" podCreationTimestamp="2025-09-13 00:28:37 +0000 UTC" firstStartedPulling="2025-09-13 00:29:08.775035847 +0000 UTC m=+46.250066089" lastFinishedPulling="2025-09-13 00:29:23.629992648 +0000 UTC m=+61.105022890" observedRunningTime="2025-09-13 00:29:25.737260048 +0000 UTC m=+63.212290300" watchObservedRunningTime="2025-09-13 00:29:25.739227054 +0000 UTC m=+63.214257296" Sep 13 00:29:27.376596 systemd[1]: Started sshd@9-10.0.0.95:22-10.0.0.1:47048.service - OpenSSH per-connection server daemon (10.0.0.1:47048). 
Sep 13 00:29:27.776208 sshd[5109]: Accepted publickey for core from 10.0.0.1 port 47048 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:29:27.778770 sshd-session[5109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:29:27.817021 systemd-logind[1546]: New session 10 of user core. Sep 13 00:29:27.822325 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 13 00:29:28.179708 sshd[5118]: Connection closed by 10.0.0.1 port 47048 Sep 13 00:29:28.179726 sshd-session[5109]: pam_unix(sshd:session): session closed for user core Sep 13 00:29:28.187813 systemd-logind[1546]: Session 10 logged out. Waiting for processes to exit. Sep 13 00:29:28.189317 systemd[1]: sshd@9-10.0.0.95:22-10.0.0.1:47048.service: Deactivated successfully. Sep 13 00:29:28.192837 systemd[1]: session-10.scope: Deactivated successfully. Sep 13 00:29:28.197265 systemd-logind[1546]: Removed session 10. Sep 13 00:29:29.874891 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3010700219.mount: Deactivated successfully. 
Sep 13 00:29:32.776006 containerd[1571]: time="2025-09-13T00:29:32.772640146Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:32.782792 containerd[1571]: time="2025-09-13T00:29:32.782737240Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 13 00:29:32.803430 containerd[1571]: time="2025-09-13T00:29:32.802372281Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:32.861814 containerd[1571]: time="2025-09-13T00:29:32.860170655Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:32.861814 containerd[1571]: time="2025-09-13T00:29:32.860967809Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 9.230736379s" Sep 13 00:29:32.861814 containerd[1571]: time="2025-09-13T00:29:32.860993749Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 13 00:29:32.865413 containerd[1571]: time="2025-09-13T00:29:32.864421586Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 13 00:29:32.873359 containerd[1571]: time="2025-09-13T00:29:32.872167645Z" level=info msg="CreateContainer within sandbox \"511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e\" 
for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 13 00:29:32.967833 containerd[1571]: time="2025-09-13T00:29:32.967729091Z" level=info msg="Container da5e6d6463447ca7ecaed3f6d1d4cccb6d749754a790d82bcfde5b14545e67eb: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:29:33.055326 containerd[1571]: time="2025-09-13T00:29:33.055201067Z" level=info msg="CreateContainer within sandbox \"511a6921258eee1652df51ef3de19e4217c056f0507f59279dc926a24ec01a7e\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"da5e6d6463447ca7ecaed3f6d1d4cccb6d749754a790d82bcfde5b14545e67eb\"" Sep 13 00:29:33.060708 containerd[1571]: time="2025-09-13T00:29:33.058962051Z" level=info msg="StartContainer for \"da5e6d6463447ca7ecaed3f6d1d4cccb6d749754a790d82bcfde5b14545e67eb\"" Sep 13 00:29:33.067082 containerd[1571]: time="2025-09-13T00:29:33.066651881Z" level=info msg="connecting to shim da5e6d6463447ca7ecaed3f6d1d4cccb6d749754a790d82bcfde5b14545e67eb" address="unix:///run/containerd/s/72565944cc1c43bf0fba727d0c00394cf16c83c8e544d6a6f0d9d6d5b5cc577b" protocol=ttrpc version=3 Sep 13 00:29:33.179385 systemd[1]: Started cri-containerd-da5e6d6463447ca7ecaed3f6d1d4cccb6d749754a790d82bcfde5b14545e67eb.scope - libcontainer container da5e6d6463447ca7ecaed3f6d1d4cccb6d749754a790d82bcfde5b14545e67eb. Sep 13 00:29:33.204503 systemd[1]: Started sshd@10-10.0.0.95:22-10.0.0.1:37548.service - OpenSSH per-connection server daemon (10.0.0.1:37548). 
Sep 13 00:29:33.408432 containerd[1571]: time="2025-09-13T00:29:33.406274558Z" level=info msg="StartContainer for \"da5e6d6463447ca7ecaed3f6d1d4cccb6d749754a790d82bcfde5b14545e67eb\" returns successfully" Sep 13 00:29:33.486766 sshd[5168]: Accepted publickey for core from 10.0.0.1 port 37548 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:29:33.501732 sshd-session[5168]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:29:33.525595 systemd-logind[1546]: New session 11 of user core. Sep 13 00:29:33.541681 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 13 00:29:33.967506 sshd[5184]: Connection closed by 10.0.0.1 port 37548 Sep 13 00:29:33.964496 sshd-session[5168]: pam_unix(sshd:session): session closed for user core Sep 13 00:29:33.990740 systemd-logind[1546]: Session 11 logged out. Waiting for processes to exit. Sep 13 00:29:33.992015 systemd[1]: sshd@10-10.0.0.95:22-10.0.0.1:37548.service: Deactivated successfully. Sep 13 00:29:34.004374 systemd[1]: session-11.scope: Deactivated successfully. Sep 13 00:29:34.009700 containerd[1571]: time="2025-09-13T00:29:34.009007848Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da5e6d6463447ca7ecaed3f6d1d4cccb6d749754a790d82bcfde5b14545e67eb\" id:\"e47342c099bb6075ca2e631da703487cf381f9c7e3b5c5da2d3f0ead81f8d523\" pid:5207 exit_status:1 exited_at:{seconds:1757723374 nanos:8537394}" Sep 13 00:29:34.009862 systemd-logind[1546]: Removed session 11. 
Sep 13 00:29:35.017362 containerd[1571]: time="2025-09-13T00:29:35.016017070Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da5e6d6463447ca7ecaed3f6d1d4cccb6d749754a790d82bcfde5b14545e67eb\" id:\"6ed094d9c2d54b28abeaff6c5bc56c9f7be94aba70b2fb372ee6c2b48fdd0be3\" pid:5237 exit_status:1 exited_at:{seconds:1757723375 nanos:15552087}" Sep 13 00:29:35.488485 containerd[1571]: time="2025-09-13T00:29:35.486978069Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:35.494650 containerd[1571]: time="2025-09-13T00:29:35.492885627Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 13 00:29:35.520604 containerd[1571]: time="2025-09-13T00:29:35.520527054Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:35.527698 containerd[1571]: time="2025-09-13T00:29:35.526555143Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:35.527698 containerd[1571]: time="2025-09-13T00:29:35.527312909Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.66286408s" Sep 13 00:29:35.527698 containerd[1571]: time="2025-09-13T00:29:35.527361653Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 13 
00:29:35.532765 containerd[1571]: time="2025-09-13T00:29:35.530431399Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 13 00:29:35.534207 containerd[1571]: time="2025-09-13T00:29:35.533897908Z" level=info msg="CreateContainer within sandbox \"0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 13 00:29:35.589919 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount642611600.mount: Deactivated successfully. Sep 13 00:29:35.596573 containerd[1571]: time="2025-09-13T00:29:35.595192200Z" level=info msg="Container 1d4e9d58cc6b7b69e274574ae0c7fae4192bfa43a5a582e2eaf43c175bcf649f: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:29:35.633194 containerd[1571]: time="2025-09-13T00:29:35.633137400Z" level=info msg="CreateContainer within sandbox \"0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"1d4e9d58cc6b7b69e274574ae0c7fae4192bfa43a5a582e2eaf43c175bcf649f\"" Sep 13 00:29:35.634378 containerd[1571]: time="2025-09-13T00:29:35.634318769Z" level=info msg="StartContainer for \"1d4e9d58cc6b7b69e274574ae0c7fae4192bfa43a5a582e2eaf43c175bcf649f\"" Sep 13 00:29:35.638208 containerd[1571]: time="2025-09-13T00:29:35.638094161Z" level=info msg="connecting to shim 1d4e9d58cc6b7b69e274574ae0c7fae4192bfa43a5a582e2eaf43c175bcf649f" address="unix:///run/containerd/s/24309bed4f1b5b2157ca8ceed9c8760593405e1a3f821bb4607604f5b0b71afd" protocol=ttrpc version=3 Sep 13 00:29:35.688922 systemd[1]: Started cri-containerd-1d4e9d58cc6b7b69e274574ae0c7fae4192bfa43a5a582e2eaf43c175bcf649f.scope - libcontainer container 1d4e9d58cc6b7b69e274574ae0c7fae4192bfa43a5a582e2eaf43c175bcf649f. 
Sep 13 00:29:35.762326 containerd[1571]: time="2025-09-13T00:29:35.762160603Z" level=info msg="StartContainer for \"1d4e9d58cc6b7b69e274574ae0c7fae4192bfa43a5a582e2eaf43c175bcf649f\" returns successfully" Sep 13 00:29:38.117414 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2764377368.mount: Deactivated successfully. Sep 13 00:29:38.725610 containerd[1571]: time="2025-09-13T00:29:38.725530068Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:38.727393 containerd[1571]: time="2025-09-13T00:29:38.727315973Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 13 00:29:38.728939 containerd[1571]: time="2025-09-13T00:29:38.728847780Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:38.731884 containerd[1571]: time="2025-09-13T00:29:38.731821061Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:38.732824 containerd[1571]: time="2025-09-13T00:29:38.732773768Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 3.202309886s" Sep 13 00:29:38.732888 containerd[1571]: time="2025-09-13T00:29:38.732828122Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference 
\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 13 00:29:38.734311 containerd[1571]: time="2025-09-13T00:29:38.734018625Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 13 00:29:38.735622 containerd[1571]: time="2025-09-13T00:29:38.735588214Z" level=info msg="CreateContainer within sandbox \"d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 13 00:29:38.745833 containerd[1571]: time="2025-09-13T00:29:38.745767774Z" level=info msg="Container 4229687c5543f078a23759ca2223b2588680b9d624b171587abaf7a1bbd52548: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:29:38.756939 containerd[1571]: time="2025-09-13T00:29:38.756875843Z" level=info msg="CreateContainer within sandbox \"d5afefa1810d97931a6cb30089cb70fa3780105dc5055f55241684e45e8f5e21\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"4229687c5543f078a23759ca2223b2588680b9d624b171587abaf7a1bbd52548\"" Sep 13 00:29:38.757497 containerd[1571]: time="2025-09-13T00:29:38.757456017Z" level=info msg="StartContainer for \"4229687c5543f078a23759ca2223b2588680b9d624b171587abaf7a1bbd52548\"" Sep 13 00:29:38.758617 containerd[1571]: time="2025-09-13T00:29:38.758588397Z" level=info msg="connecting to shim 4229687c5543f078a23759ca2223b2588680b9d624b171587abaf7a1bbd52548" address="unix:///run/containerd/s/02071faaefa4b8cdf39e4a5031be3351515ecb60873b058e44012e40b76449a8" protocol=ttrpc version=3 Sep 13 00:29:38.790658 systemd[1]: Started cri-containerd-4229687c5543f078a23759ca2223b2588680b9d624b171587abaf7a1bbd52548.scope - libcontainer container 4229687c5543f078a23759ca2223b2588680b9d624b171587abaf7a1bbd52548. 
Sep 13 00:29:38.851847 containerd[1571]: time="2025-09-13T00:29:38.851783830Z" level=info msg="StartContainer for \"4229687c5543f078a23759ca2223b2588680b9d624b171587abaf7a1bbd52548\" returns successfully" Sep 13 00:29:38.980559 systemd[1]: Started sshd@11-10.0.0.95:22-10.0.0.1:37592.service - OpenSSH per-connection server daemon (10.0.0.1:37592). Sep 13 00:29:39.067235 sshd[5326]: Accepted publickey for core from 10.0.0.1 port 37592 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:29:39.067974 sshd-session[5326]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:29:39.075077 systemd-logind[1546]: New session 12 of user core. Sep 13 00:29:39.081582 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 13 00:29:39.788800 sshd[5328]: Connection closed by 10.0.0.1 port 37592 Sep 13 00:29:39.790821 sshd-session[5326]: pam_unix(sshd:session): session closed for user core Sep 13 00:29:39.802287 systemd[1]: sshd@11-10.0.0.95:22-10.0.0.1:37592.service: Deactivated successfully. Sep 13 00:29:39.805014 systemd[1]: session-12.scope: Deactivated successfully. Sep 13 00:29:39.808151 systemd-logind[1546]: Session 12 logged out. Waiting for processes to exit. Sep 13 00:29:39.813290 systemd[1]: Started sshd@12-10.0.0.95:22-10.0.0.1:37602.service - OpenSSH per-connection server daemon (10.0.0.1:37602). Sep 13 00:29:39.814110 systemd-logind[1546]: Removed session 12. Sep 13 00:29:39.877548 sshd[5342]: Accepted publickey for core from 10.0.0.1 port 37602 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:29:39.879372 sshd-session[5342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:29:39.884150 systemd-logind[1546]: New session 13 of user core. Sep 13 00:29:39.891478 systemd[1]: Started session-13.scope - Session 13 of User core. 
Sep 13 00:29:39.903678 kubelet[2698]: I0913 00:29:39.903598 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-gx8np" podStartSLOduration=36.98545416 podStartE2EDuration="1m0.90357627s" podCreationTimestamp="2025-09-13 00:28:39 +0000 UTC" firstStartedPulling="2025-09-13 00:29:08.944552214 +0000 UTC m=+46.419582456" lastFinishedPulling="2025-09-13 00:29:32.862674324 +0000 UTC m=+70.337704566" observedRunningTime="2025-09-13 00:29:33.819183952 +0000 UTC m=+71.294214194" watchObservedRunningTime="2025-09-13 00:29:39.90357627 +0000 UTC m=+77.378606512" Sep 13 00:29:40.821523 sshd[5344]: Connection closed by 10.0.0.1 port 37602 Sep 13 00:29:40.822666 sshd-session[5342]: pam_unix(sshd:session): session closed for user core Sep 13 00:29:40.834917 systemd[1]: sshd@12-10.0.0.95:22-10.0.0.1:37602.service: Deactivated successfully. Sep 13 00:29:40.837931 systemd[1]: session-13.scope: Deactivated successfully. Sep 13 00:29:40.840902 systemd-logind[1546]: Session 13 logged out. Waiting for processes to exit. Sep 13 00:29:40.844316 systemd[1]: Started sshd@13-10.0.0.95:22-10.0.0.1:44480.service - OpenSSH per-connection server daemon (10.0.0.1:44480). Sep 13 00:29:40.845772 systemd-logind[1546]: Removed session 13. Sep 13 00:29:40.914556 sshd[5358]: Accepted publickey for core from 10.0.0.1 port 44480 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:29:40.916794 sshd-session[5358]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:29:40.922739 systemd-logind[1546]: New session 14 of user core. Sep 13 00:29:40.934550 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 13 00:29:41.135784 sshd[5360]: Connection closed by 10.0.0.1 port 44480 Sep 13 00:29:41.136594 sshd-session[5358]: pam_unix(sshd:session): session closed for user core Sep 13 00:29:41.144263 systemd[1]: sshd@13-10.0.0.95:22-10.0.0.1:44480.service: Deactivated successfully. 
Sep 13 00:29:41.147133 systemd[1]: session-14.scope: Deactivated successfully. Sep 13 00:29:41.148486 systemd-logind[1546]: Session 14 logged out. Waiting for processes to exit. Sep 13 00:29:41.151898 systemd-logind[1546]: Removed session 14. Sep 13 00:29:42.006583 containerd[1571]: time="2025-09-13T00:29:42.006022910Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:42.025635 containerd[1571]: time="2025-09-13T00:29:42.007515015Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 13 00:29:42.025635 containerd[1571]: time="2025-09-13T00:29:42.009046796Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:29:42.025635 containerd[1571]: time="2025-09-13T00:29:42.011328522Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 3.277272284s" Sep 13 00:29:42.025635 containerd[1571]: time="2025-09-13T00:29:42.025625639Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 13 00:29:42.026482 containerd[1571]: time="2025-09-13T00:29:42.026435288Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 
13 00:29:42.028915 containerd[1571]: time="2025-09-13T00:29:42.028881529Z" level=info msg="CreateContainer within sandbox \"0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 13 00:29:42.337288 containerd[1571]: time="2025-09-13T00:29:42.334480338Z" level=info msg="Container 6ea6dacc112d4a1a93e4cd08656bfb535e9fa4b4b79f7ebe321aa2bf2df434c4: CDI devices from CRI Config.CDIDevices: []" Sep 13 00:29:42.470282 containerd[1571]: time="2025-09-13T00:29:42.469900314Z" level=info msg="CreateContainer within sandbox \"0bca5f0a32dc885a9a9130bd97bcb63df74fed3f6be3b046ee760de3a38dec8c\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"6ea6dacc112d4a1a93e4cd08656bfb535e9fa4b4b79f7ebe321aa2bf2df434c4\"" Sep 13 00:29:42.477837 containerd[1571]: time="2025-09-13T00:29:42.474490307Z" level=info msg="StartContainer for \"6ea6dacc112d4a1a93e4cd08656bfb535e9fa4b4b79f7ebe321aa2bf2df434c4\"" Sep 13 00:29:42.486965 containerd[1571]: time="2025-09-13T00:29:42.484547791Z" level=info msg="connecting to shim 6ea6dacc112d4a1a93e4cd08656bfb535e9fa4b4b79f7ebe321aa2bf2df434c4" address="unix:///run/containerd/s/24309bed4f1b5b2157ca8ceed9c8760593405e1a3f821bb4607604f5b0b71afd" protocol=ttrpc version=3 Sep 13 00:29:42.577014 systemd[1]: Started cri-containerd-6ea6dacc112d4a1a93e4cd08656bfb535e9fa4b4b79f7ebe321aa2bf2df434c4.scope - libcontainer container 6ea6dacc112d4a1a93e4cd08656bfb535e9fa4b4b79f7ebe321aa2bf2df434c4. 
Sep 13 00:29:42.884424 containerd[1571]: time="2025-09-13T00:29:42.884330384Z" level=info msg="StartContainer for \"6ea6dacc112d4a1a93e4cd08656bfb535e9fa4b4b79f7ebe321aa2bf2df434c4\" returns successfully" Sep 13 00:29:42.973537 containerd[1571]: time="2025-09-13T00:29:42.973484934Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66\" id:\"6b37ab12166a4428fbcc49712def8ac5625cf41fb87943e1824e4ce854b7e2f4\" pid:5407 exited_at:{seconds:1757723382 nanos:971388352}" Sep 13 00:29:43.779980 kubelet[2698]: I0913 00:29:43.779545 2698 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 13 00:29:43.779980 kubelet[2698]: I0913 00:29:43.779605 2698 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 13 00:29:43.984657 kubelet[2698]: I0913 00:29:43.983852 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-647bb8d697-72k8l" podStartSLOduration=9.921258407 podStartE2EDuration="40.983830306s" podCreationTimestamp="2025-09-13 00:29:03 +0000 UTC" firstStartedPulling="2025-09-13 00:29:07.671214441 +0000 UTC m=+45.146244683" lastFinishedPulling="2025-09-13 00:29:38.73378634 +0000 UTC m=+76.208816582" observedRunningTime="2025-09-13 00:29:39.902476422 +0000 UTC m=+77.377506674" watchObservedRunningTime="2025-09-13 00:29:43.983830306 +0000 UTC m=+81.458860548" Sep 13 00:29:46.152144 systemd[1]: Started sshd@14-10.0.0.95:22-10.0.0.1:44504.service - OpenSSH per-connection server daemon (10.0.0.1:44504). 
Sep 13 00:29:46.233737 sshd[5436]: Accepted publickey for core from 10.0.0.1 port 44504 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:29:46.235326 sshd-session[5436]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:29:46.240809 systemd-logind[1546]: New session 15 of user core. Sep 13 00:29:46.250491 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 13 00:29:46.626926 sshd[5438]: Connection closed by 10.0.0.1 port 44504 Sep 13 00:29:46.627296 sshd-session[5436]: pam_unix(sshd:session): session closed for user core Sep 13 00:29:46.632594 systemd[1]: sshd@14-10.0.0.95:22-10.0.0.1:44504.service: Deactivated successfully. Sep 13 00:29:46.635257 systemd[1]: session-15.scope: Deactivated successfully. Sep 13 00:29:46.636135 systemd-logind[1546]: Session 15 logged out. Waiting for processes to exit. Sep 13 00:29:46.637819 systemd-logind[1546]: Removed session 15. Sep 13 00:29:51.641303 systemd[1]: Started sshd@15-10.0.0.95:22-10.0.0.1:46346.service - OpenSSH per-connection server daemon (10.0.0.1:46346). Sep 13 00:29:51.694962 sshd[5462]: Accepted publickey for core from 10.0.0.1 port 46346 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:29:51.696883 sshd-session[5462]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:29:51.703393 systemd-logind[1546]: New session 16 of user core. Sep 13 00:29:51.710494 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 13 00:29:51.850597 sshd[5464]: Connection closed by 10.0.0.1 port 46346 Sep 13 00:29:51.851736 sshd-session[5462]: pam_unix(sshd:session): session closed for user core Sep 13 00:29:51.857133 systemd[1]: sshd@15-10.0.0.95:22-10.0.0.1:46346.service: Deactivated successfully. Sep 13 00:29:51.859958 systemd[1]: session-16.scope: Deactivated successfully. Sep 13 00:29:51.861742 systemd-logind[1546]: Session 16 logged out. Waiting for processes to exit. 
Sep 13 00:29:51.863355 systemd-logind[1546]: Removed session 16. Sep 13 00:29:52.231109 containerd[1571]: time="2025-09-13T00:29:52.231062207Z" level=info msg="TaskExit event in podsandbox handler container_id:\"38e0636b50eaa8dc4da0ba99ed0d26222962e1641d02b614b99a6e34e889d27f\" id:\"9fd68f1676e112c00381cc8ac0a05dd6f4e766053f6660da2cb56f514300862d\" pid:5488 exited_at:{seconds:1757723392 nanos:230831397}" Sep 13 00:29:53.566661 containerd[1571]: time="2025-09-13T00:29:53.566611312Z" level=info msg="TaskExit event in podsandbox handler container_id:\"38e0636b50eaa8dc4da0ba99ed0d26222962e1641d02b614b99a6e34e889d27f\" id:\"6ce790e185f16ac8592e866d49bf697f7bcfde5b836d8a070532c05edc40e0e3\" pid:5512 exited_at:{seconds:1757723393 nanos:566356766}" Sep 13 00:29:53.645001 containerd[1571]: time="2025-09-13T00:29:53.644949469Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da5e6d6463447ca7ecaed3f6d1d4cccb6d749754a790d82bcfde5b14545e67eb\" id:\"25ba3101d26e19aac6f16464a8c65606d7dab95ecf9676df619a298be6ed903e\" pid:5534 exited_at:{seconds:1757723393 nanos:644520152}" Sep 13 00:29:53.661032 kubelet[2698]: I0913 00:29:53.660920 2698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-xwvxw" podStartSLOduration=42.03849016 podStartE2EDuration="1m13.660898585s" podCreationTimestamp="2025-09-13 00:28:40 +0000 UTC" firstStartedPulling="2025-09-13 00:29:10.404325323 +0000 UTC m=+47.879355565" lastFinishedPulling="2025-09-13 00:29:42.026733748 +0000 UTC m=+79.501763990" observedRunningTime="2025-09-13 00:29:43.984633241 +0000 UTC m=+81.459663503" watchObservedRunningTime="2025-09-13 00:29:53.660898585 +0000 UTC m=+91.135928828" Sep 13 00:29:55.644452 containerd[1571]: time="2025-09-13T00:29:55.644396002Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da5e6d6463447ca7ecaed3f6d1d4cccb6d749754a790d82bcfde5b14545e67eb\" id:\"185adf8742a6be731916ab3c2a70d18517b6a9c9f1f7186b4add4a3776249af8\" 
pid:5560 exited_at:{seconds:1757723395 nanos:643766584}" Sep 13 00:29:56.866640 systemd[1]: Started sshd@16-10.0.0.95:22-10.0.0.1:46372.service - OpenSSH per-connection server daemon (10.0.0.1:46372). Sep 13 00:29:56.932110 sshd[5573]: Accepted publickey for core from 10.0.0.1 port 46372 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:29:56.934815 sshd-session[5573]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:29:56.939272 systemd-logind[1546]: New session 17 of user core. Sep 13 00:29:56.949516 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 13 00:29:57.233820 sshd[5575]: Connection closed by 10.0.0.1 port 46372 Sep 13 00:29:57.234198 sshd-session[5573]: pam_unix(sshd:session): session closed for user core Sep 13 00:29:57.239730 systemd[1]: sshd@16-10.0.0.95:22-10.0.0.1:46372.service: Deactivated successfully. Sep 13 00:29:57.242290 systemd[1]: session-17.scope: Deactivated successfully. Sep 13 00:29:57.243483 systemd-logind[1546]: Session 17 logged out. Waiting for processes to exit. Sep 13 00:29:57.245314 systemd-logind[1546]: Removed session 17. Sep 13 00:30:02.249862 systemd[1]: Started sshd@17-10.0.0.95:22-10.0.0.1:60860.service - OpenSSH per-connection server daemon (10.0.0.1:60860). Sep 13 00:30:02.312752 sshd[5590]: Accepted publickey for core from 10.0.0.1 port 60860 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:30:02.314846 sshd-session[5590]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:30:02.321328 systemd-logind[1546]: New session 18 of user core. Sep 13 00:30:02.332540 systemd[1]: Started session-18.scope - Session 18 of User core. 
Sep 13 00:30:02.464793 sshd[5592]: Connection closed by 10.0.0.1 port 60860 Sep 13 00:30:02.465170 sshd-session[5590]: pam_unix(sshd:session): session closed for user core Sep 13 00:30:02.471619 systemd[1]: sshd@17-10.0.0.95:22-10.0.0.1:60860.service: Deactivated successfully. Sep 13 00:30:02.474298 systemd[1]: session-18.scope: Deactivated successfully. Sep 13 00:30:02.476004 systemd-logind[1546]: Session 18 logged out. Waiting for processes to exit. Sep 13 00:30:02.477746 systemd-logind[1546]: Removed session 18. Sep 13 00:30:07.482384 systemd[1]: Started sshd@18-10.0.0.95:22-10.0.0.1:60876.service - OpenSSH per-connection server daemon (10.0.0.1:60876). Sep 13 00:30:07.534683 sshd[5606]: Accepted publickey for core from 10.0.0.1 port 60876 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:30:07.536507 sshd-session[5606]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:30:07.541836 systemd-logind[1546]: New session 19 of user core. Sep 13 00:30:07.552527 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 13 00:30:07.670693 sshd[5608]: Connection closed by 10.0.0.1 port 60876 Sep 13 00:30:07.671119 sshd-session[5606]: pam_unix(sshd:session): session closed for user core Sep 13 00:30:07.686242 systemd[1]: sshd@18-10.0.0.95:22-10.0.0.1:60876.service: Deactivated successfully. Sep 13 00:30:07.688628 systemd[1]: session-19.scope: Deactivated successfully. Sep 13 00:30:07.689781 systemd-logind[1546]: Session 19 logged out. Waiting for processes to exit. Sep 13 00:30:07.693596 systemd[1]: Started sshd@19-10.0.0.95:22-10.0.0.1:60880.service - OpenSSH per-connection server daemon (10.0.0.1:60880). Sep 13 00:30:07.694944 systemd-logind[1546]: Removed session 19. 
Sep 13 00:30:07.750589 sshd[5622]: Accepted publickey for core from 10.0.0.1 port 60880 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:30:07.752362 sshd-session[5622]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:30:07.757427 systemd-logind[1546]: New session 20 of user core. Sep 13 00:30:07.764511 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 13 00:30:08.115878 sshd[5624]: Connection closed by 10.0.0.1 port 60880 Sep 13 00:30:08.116119 sshd-session[5622]: pam_unix(sshd:session): session closed for user core Sep 13 00:30:08.129483 systemd[1]: sshd@19-10.0.0.95:22-10.0.0.1:60880.service: Deactivated successfully. Sep 13 00:30:08.131973 systemd[1]: session-20.scope: Deactivated successfully. Sep 13 00:30:08.133444 systemd-logind[1546]: Session 20 logged out. Waiting for processes to exit. Sep 13 00:30:08.136007 systemd-logind[1546]: Removed session 20. Sep 13 00:30:08.137907 systemd[1]: Started sshd@20-10.0.0.95:22-10.0.0.1:60888.service - OpenSSH per-connection server daemon (10.0.0.1:60888). Sep 13 00:30:08.203436 sshd[5635]: Accepted publickey for core from 10.0.0.1 port 60888 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:30:08.205642 sshd-session[5635]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:30:08.211423 systemd-logind[1546]: New session 21 of user core. Sep 13 00:30:08.228514 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 13 00:30:10.224885 sshd[5637]: Connection closed by 10.0.0.1 port 60888 Sep 13 00:30:10.225227 sshd-session[5635]: pam_unix(sshd:session): session closed for user core Sep 13 00:30:10.237103 systemd[1]: sshd@20-10.0.0.95:22-10.0.0.1:60888.service: Deactivated successfully. Sep 13 00:30:10.239778 systemd[1]: session-21.scope: Deactivated successfully. Sep 13 00:30:10.240080 systemd[1]: session-21.scope: Consumed 696ms CPU time, 74.2M memory peak. 
Sep 13 00:30:10.240751 systemd-logind[1546]: Session 21 logged out. Waiting for processes to exit. Sep 13 00:30:10.244103 systemd-logind[1546]: Removed session 21. Sep 13 00:30:10.245527 systemd[1]: Started sshd@21-10.0.0.95:22-10.0.0.1:50228.service - OpenSSH per-connection server daemon (10.0.0.1:50228). Sep 13 00:30:10.302622 sshd[5657]: Accepted publickey for core from 10.0.0.1 port 50228 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:30:10.304695 sshd-session[5657]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:30:10.310470 systemd-logind[1546]: New session 22 of user core. Sep 13 00:30:10.320818 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 13 00:30:10.882157 sshd[5659]: Connection closed by 10.0.0.1 port 50228 Sep 13 00:30:10.882621 sshd-session[5657]: pam_unix(sshd:session): session closed for user core Sep 13 00:30:10.897523 systemd[1]: sshd@21-10.0.0.95:22-10.0.0.1:50228.service: Deactivated successfully. Sep 13 00:30:10.901435 systemd[1]: session-22.scope: Deactivated successfully. Sep 13 00:30:10.905521 systemd-logind[1546]: Session 22 logged out. Waiting for processes to exit. Sep 13 00:30:10.907209 systemd[1]: Started sshd@22-10.0.0.95:22-10.0.0.1:50230.service - OpenSSH per-connection server daemon (10.0.0.1:50230). Sep 13 00:30:10.910015 systemd-logind[1546]: Removed session 22. Sep 13 00:30:10.962743 sshd[5671]: Accepted publickey for core from 10.0.0.1 port 50230 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:30:10.965247 sshd-session[5671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:30:10.972881 systemd-logind[1546]: New session 23 of user core. Sep 13 00:30:10.986262 systemd[1]: Started session-23.scope - Session 23 of User core. 
Sep 13 00:30:11.244923 sshd[5673]: Connection closed by 10.0.0.1 port 50230 Sep 13 00:30:11.245261 sshd-session[5671]: pam_unix(sshd:session): session closed for user core Sep 13 00:30:11.249902 systemd[1]: sshd@22-10.0.0.95:22-10.0.0.1:50230.service: Deactivated successfully. Sep 13 00:30:11.252711 systemd[1]: session-23.scope: Deactivated successfully. Sep 13 00:30:11.254720 systemd-logind[1546]: Session 23 logged out. Waiting for processes to exit. Sep 13 00:30:11.257519 systemd-logind[1546]: Removed session 23. Sep 13 00:30:12.738816 containerd[1571]: time="2025-09-13T00:30:12.738765597Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66\" id:\"2c10b8def0505076a47994eb48168b420751d9c049202c9466f3db766f3d80a2\" pid:5697 exited_at:{seconds:1757723412 nanos:738407288}" Sep 13 00:30:16.263787 systemd[1]: Started sshd@23-10.0.0.95:22-10.0.0.1:50244.service - OpenSSH per-connection server daemon (10.0.0.1:50244). Sep 13 00:30:16.317882 sshd[5711]: Accepted publickey for core from 10.0.0.1 port 50244 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:30:16.320016 sshd-session[5711]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:30:16.325227 systemd-logind[1546]: New session 24 of user core. Sep 13 00:30:16.333697 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 13 00:30:16.460004 sshd[5713]: Connection closed by 10.0.0.1 port 50244 Sep 13 00:30:16.460408 sshd-session[5711]: pam_unix(sshd:session): session closed for user core Sep 13 00:30:16.466210 systemd[1]: sshd@23-10.0.0.95:22-10.0.0.1:50244.service: Deactivated successfully. Sep 13 00:30:16.468883 systemd[1]: session-24.scope: Deactivated successfully. Sep 13 00:30:16.469761 systemd-logind[1546]: Session 24 logged out. Waiting for processes to exit. Sep 13 00:30:16.471755 systemd-logind[1546]: Removed session 24. 
Sep 13 00:30:21.474776 systemd[1]: Started sshd@24-10.0.0.95:22-10.0.0.1:55088.service - OpenSSH per-connection server daemon (10.0.0.1:55088). Sep 13 00:30:21.527171 sshd[5726]: Accepted publickey for core from 10.0.0.1 port 55088 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:30:21.528903 sshd-session[5726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:30:21.534186 systemd-logind[1546]: New session 25 of user core. Sep 13 00:30:21.543471 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 13 00:30:21.673768 sshd[5728]: Connection closed by 10.0.0.1 port 55088 Sep 13 00:30:21.674149 sshd-session[5726]: pam_unix(sshd:session): session closed for user core Sep 13 00:30:21.679861 systemd[1]: sshd@24-10.0.0.95:22-10.0.0.1:55088.service: Deactivated successfully. Sep 13 00:30:21.682720 systemd[1]: session-25.scope: Deactivated successfully. Sep 13 00:30:21.683676 systemd-logind[1546]: Session 25 logged out. Waiting for processes to exit. Sep 13 00:30:21.685757 systemd-logind[1546]: Removed session 25. Sep 13 00:30:23.566855 containerd[1571]: time="2025-09-13T00:30:23.566801999Z" level=info msg="TaskExit event in podsandbox handler container_id:\"38e0636b50eaa8dc4da0ba99ed0d26222962e1641d02b614b99a6e34e889d27f\" id:\"5bab780c0815d52a7f0c2e1ceeb0c8587a3878195f55bd41b08801ba8f447251\" pid:5754 exited_at:{seconds:1757723423 nanos:566414025}" Sep 13 00:30:23.725556 containerd[1571]: time="2025-09-13T00:30:23.725435081Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da5e6d6463447ca7ecaed3f6d1d4cccb6d749754a790d82bcfde5b14545e67eb\" id:\"2cb976db180a9f5a6c5ba7e582f543c88742f6ce661dd1001de08d9e986f44c6\" pid:5776 exited_at:{seconds:1757723423 nanos:724970973}" Sep 13 00:30:26.692922 systemd[1]: Started sshd@25-10.0.0.95:22-10.0.0.1:55094.service - OpenSSH per-connection server daemon (10.0.0.1:55094). 
Sep 13 00:30:26.746739 sshd[5789]: Accepted publickey for core from 10.0.0.1 port 55094 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:30:26.748860 sshd-session[5789]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:30:26.754266 systemd-logind[1546]: New session 26 of user core. Sep 13 00:30:26.763782 systemd[1]: Started session-26.scope - Session 26 of User core. Sep 13 00:30:26.929413 sshd[5791]: Connection closed by 10.0.0.1 port 55094 Sep 13 00:30:26.929765 sshd-session[5789]: pam_unix(sshd:session): session closed for user core Sep 13 00:30:26.935161 systemd[1]: sshd@25-10.0.0.95:22-10.0.0.1:55094.service: Deactivated successfully. Sep 13 00:30:26.938038 systemd[1]: session-26.scope: Deactivated successfully. Sep 13 00:30:26.939183 systemd-logind[1546]: Session 26 logged out. Waiting for processes to exit. Sep 13 00:30:26.941045 systemd-logind[1546]: Removed session 26. Sep 13 00:30:31.954469 systemd[1]: Started sshd@26-10.0.0.95:22-10.0.0.1:53618.service - OpenSSH per-connection server daemon (10.0.0.1:53618). Sep 13 00:30:32.026496 sshd[5818]: Accepted publickey for core from 10.0.0.1 port 53618 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:30:32.028556 sshd-session[5818]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:30:32.034293 systemd-logind[1546]: New session 27 of user core. Sep 13 00:30:32.044500 systemd[1]: Started session-27.scope - Session 27 of User core. Sep 13 00:30:32.311441 sshd[5821]: Connection closed by 10.0.0.1 port 53618 Sep 13 00:30:32.312592 sshd-session[5818]: pam_unix(sshd:session): session closed for user core Sep 13 00:30:32.318173 systemd[1]: sshd@26-10.0.0.95:22-10.0.0.1:53618.service: Deactivated successfully. Sep 13 00:30:32.321018 systemd[1]: session-27.scope: Deactivated successfully. Sep 13 00:30:32.323142 systemd-logind[1546]: Session 27 logged out. Waiting for processes to exit. 
Sep 13 00:30:32.325123 systemd-logind[1546]: Removed session 27. Sep 13 00:30:37.333579 systemd[1]: Started sshd@27-10.0.0.95:22-10.0.0.1:53650.service - OpenSSH per-connection server daemon (10.0.0.1:53650). Sep 13 00:30:37.390858 sshd[5847]: Accepted publickey for core from 10.0.0.1 port 53650 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:30:37.393426 sshd-session[5847]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:30:37.401415 systemd-logind[1546]: New session 28 of user core. Sep 13 00:30:37.405736 systemd[1]: Started session-28.scope - Session 28 of User core. Sep 13 00:30:37.730914 sshd[5850]: Connection closed by 10.0.0.1 port 53650 Sep 13 00:30:37.731728 sshd-session[5847]: pam_unix(sshd:session): session closed for user core Sep 13 00:30:37.737981 systemd[1]: sshd@27-10.0.0.95:22-10.0.0.1:53650.service: Deactivated successfully. Sep 13 00:30:37.741158 systemd[1]: session-28.scope: Deactivated successfully. Sep 13 00:30:37.744957 systemd-logind[1546]: Session 28 logged out. Waiting for processes to exit. Sep 13 00:30:37.748656 systemd-logind[1546]: Removed session 28. Sep 13 00:30:42.696633 containerd[1571]: time="2025-09-13T00:30:42.696504741Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e0d295e1c83c69c7f31cb855d53d7ccf86c097becefe5b46d6967a285e217d66\" id:\"98ee07daee9807add6703b61feae5bb48d27f1df085bf4b781272f8c0ef229f2\" pid:5875 exited_at:{seconds:1757723442 nanos:695945576}" Sep 13 00:30:42.743963 systemd[1]: Started sshd@28-10.0.0.95:22-10.0.0.1:55374.service - OpenSSH per-connection server daemon (10.0.0.1:55374). Sep 13 00:30:42.817166 sshd[5890]: Accepted publickey for core from 10.0.0.1 port 55374 ssh2: RSA SHA256:mlYU9m+a2feC4Sym7fN+EoNujIcjljhjZFU1t4NzJ4c Sep 13 00:30:42.818936 sshd-session[5890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:30:42.824299 systemd-logind[1546]: New session 29 of user core. 
Sep 13 00:30:42.830527 systemd[1]: Started session-29.scope - Session 29 of User core. Sep 13 00:30:43.033179 sshd[5892]: Connection closed by 10.0.0.1 port 55374 Sep 13 00:30:43.034812 sshd-session[5890]: pam_unix(sshd:session): session closed for user core Sep 13 00:30:43.039511 systemd-logind[1546]: Session 29 logged out. Waiting for processes to exit. Sep 13 00:30:43.040179 systemd[1]: sshd@28-10.0.0.95:22-10.0.0.1:55374.service: Deactivated successfully. Sep 13 00:30:43.042584 systemd[1]: session-29.scope: Deactivated successfully. Sep 13 00:30:43.044992 systemd-logind[1546]: Removed session 29.